
I have a working TensorFlow model that runs fine in Python. It was exported as a SavedModel, which I then attempted to convert to TensorFlow.js.

The converted model does not appear to work correctly. The mean value from the TensorMap appears to be 0 with a shape of [0], even though the input has shape [1,96,192,3].

The conversion was done as follows:

tensorflowjs_converter --input_format=tf_saved_model ~/Projects/models/model_0 ~/Projects/modelsjs/model_0
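For reference, the converter also lets you pin the SavedModel signature and tags explicitly; the values below are the converter's documented defaults and are shown only as an example:

tensorflowjs_converter \
    --input_format=tf_saved_model \
    --signature_name=serving_default \
    --saved_model_tags=serve \
    ~/Projects/models/model_0 ~/Projects/modelsjs/model_0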

The conversion completes without issues and the model also appears to load fine. However, when it comes to predicting, errors are thrown; any advice would be appreciated.

<script>
    async function handleButtonClick(){
        for(var i=0; i<1; i++)
        {
            var t0 = performance.now();
            console.log("Loading - model_"+i);
            // 'input' is a nested array holding the [1,96,192,3] input data
            var inputTensor = tf.tensor4d(input);
            var model = await tf.loadGraphModel('/modelsjs/model_'+i+'/model.json');
            var poutput = model.predict(inputTensor);
        }
    }
</script>

The error appears as follows.

graph_model.ts:213 Uncaught (in promise) Error: The model contains control flow or dynamic shape ops, please use executeAsync method
    at t.execute_ (graph_model.ts:213)
    at t.predict (graph_model.ts:169)
    at (index):157
    at engine.ts:156
    at t.scopedRun (engine.ts:167)
    at t.tidy (engine.ts:153)
    at Object.t.tidy (environment.ts:186)
    at handleButtonClick ((index):156)

As suggested by the error above, the prediction was then attempted with executeAsync, which produces the error this question is about.

<script>
    async function handleButtonClick(){
        for(var i=0; i<1; i++)
        {
            var t0 = performance.now();
            console.log("Loading - model_"+i);
            var inputTensor = tf.tensor4d(input);
            var model = await tf.loadGraphModel('/modelsjs/model_'+i+'/model.json');
            var t1 = performance.now();
            console.log("Load model_" + i + " took " + (t1 - t0) + " milliseconds.");
            // executeAsync is required because the graph contains control flow ops
            const res = await model.executeAsync(inputTensor);
        }
    }
</script>
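For completeness, had the call succeeded, reading the prediction back from res would look roughly like the sketch below; it assumes res is a single tensor, although executeAsync can also return an array of tensors:

    // data() downloads the values from the GPU and resolves to a TypedArray
    const values = await res.data();
    console.log(res.shape, values);
    // free GPU memory once the tensors are no longer needed
    res.dispose();
    inputTensor.dispose();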

The error appears as follows and seems to be related to the $mean value from the TensorMap; this value has shape [0].

broadcast_util.ts:81 Uncaught (in promise) Error: Operands could not be broadcast together with shapes 1,12,24,64 and 0.
    at un (broadcast_util.ts:81)
    at new kn (batchnorm_packed_gpu.ts:32)
    at t.batchNormalization (backend_webgl.ts:869)
    at Bt.engine.runKernel.$x (batchnorm.ts:344)
    at engine.ts:206
    at t.scopedRun (engine.ts:167)
    at t.runKernel (engine.ts:202)
    at $a (batchnorm.ts:343)
    at batchNorm (operation.ts:46)
    at xy (normalization_executor.ts:31)

With some digging in the developer tools, it looks like the error originates here:

(screenshot from the developer tools omitted)
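One way to confirm what the converter actually wrote is to inspect the weightsManifest section of the generated model.json. The sketch below (the URL matches the path used above) simply logs any weight entry whose shape is empty or contains a zero dimension:

    async function checkWeightShapes() {
        const resp = await fetch('/modelsjs/model_0/model.json');
        const modelJson = await resp.json();
        // weightsManifest is an array of groups, each with a list of weight specs
        for (const group of modelJson.weightsManifest) {
            for (const w of group.weights) {
                // e.g. a batch-norm moving mean exported with shape [0]
                if (w.shape.length === 0 || w.shape.includes(0)) {
                    console.log('Suspicious weight:', w.name, w.shape, w.dtype);
                }
            }
        }
    }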

Setting 'strict' to true, the model still loads and does not throw an error.

    var model = await tf.loadGraphModel('/modelsjs/model_0/model.json',
        {onProgress: onProgressCallback, strict: true});
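onProgressCallback is not defined in the snippets above; a minimal placeholder, assuming it only logs the load fraction that loadGraphModel reports, could be:

    // fraction is reported by loadGraphModel as a number between 0 and 1
    function onProgressCallback(fraction) {
        console.log('Model load progress: ' + Math.round(fraction * 100) + '%');
    }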

Sadly, I am unable to share the model as it is proprietary.


1 Answer


The model was wrong.

It was originally a Keras model that had been converted to a TensorFlow SavedModel. Converting that SavedModel to TensorFlow.js did not work.

However, converting the original Keras model directly to TensorFlow.js does work.
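For example, converting straight from a Keras HDF5 file looks like this (the paths are placeholders; the result is a Layers model, loaded in the browser with tf.loadLayersModel rather than tf.loadGraphModel):

tensorflowjs_converter --input_format=keras ~/Projects/models/model_0.h5 ~/Projects/modelsjs/model_0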

Lesson learned: don't shuffle your model through so many formats!

Answered 2019-09-05T02:01:36.947