
Exported TensorFlow Lite model doesn't use ONNX model input and output names #1047

Open
linmeimei0512 opened this issue Oct 27, 2022 · 1 comment


@linmeimei0512

My environment is:

  • Python version: 3.6
  • ONNX version: 1.11.0
  • ONNX-TF version: 1.10.0
  • Tensorflow-gpu version: 2.5.0

My goal:
Convert an ONNX model to TFLite.

Question:
My ONNX model's input name is [input] and its output names are [output1, output2].
Screenshot 2022-10-27 17-03-40

I convert the ONNX model to TensorFlow using the SavedModel format (TensorFlow 2.x):

import onnx
from onnx_tf.backend import prepare

# Load the ONNX model and export it as a TensorFlow SavedModel.
onnx_model = onnx.load(onnx_model_path)
onnx_tf_exporter = prepare(onnx_model)
onnx_tf_exporter.export_graph(tensorflow_model_output_path)

When I open the exported model (saved_model.pb) in Netron, the input name does not match the ONNX model's.
Screenshot 2022-10-27 17-02-48

Then I convert the SavedModel to TFLite:

import tensorflow as tf

# Convert the SavedModel to a TFLite flatbuffer and write it to disk.
converter = tf.lite.TFLiteConverter.from_saved_model(tensorflow_model_path)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()
with open(tensorflow_lite_model_output_path, 'wb') as f:
    f.write(tflite_model)
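To see which names the converter actually assigned, the TFLite interpreter can report the input and output tensor details. This is a minimal self-contained sketch that converts a tiny stand-in Keras model in memory (rather than the SavedModel exported above, whose path is not available here); the inspection calls are the same either way:

```python
import tensorflow as tf

# A trivial stand-in model, just to have something to convert.
inp = tf.keras.Input(shape=(4,), name="input")
out = tf.keras.layers.Dense(2, name="output1")(inp)
model = tf.keras.Model(inp, out)

# Convert in memory; from_saved_model(path) works the same way.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Ask the interpreter what names the tensors ended up with.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
for d in interpreter.get_input_details():
    print("input:", d["name"])
for d in interpreter.get_output_details():
    print("output:", d["name"])
```

The `name` fields here are the same auto-generated names Netron displays, so this is a quick way to check them without opening the model in a viewer.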

The TFLite model's input name matches the SavedModel's ([serving_default_input:0]), and the output names are [StatefulPartitionedCall:0, StatefulPartitionedCall:1].
Screenshot 2022-10-27 17-18-23
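Until the original names can be carried through conversion, one workaround is to keep an explicit mapping based on the pattern observed above. Note these naming rules are an assumption inferred from the screenshots (inputs become `serving_default_<name>:0`; outputs become `StatefulPartitionedCall:<index>`, with indices following the alphabetical order of the signature's output keys), not something the converter guarantees:

```python
# Assumed naming pattern, inferred from the converted model above.

def tflite_input_name(onnx_name, signature_key="serving_default"):
    """Derive the TFLite input tensor name from an ONNX input name."""
    return f"{signature_key}_{onnx_name}:0"

def tflite_output_names(onnx_output_names):
    """Map ONNX output names to TFLite names, assuming alphabetical ordering."""
    ordered = sorted(onnx_output_names)
    return {name: f"StatefulPartitionedCall:{i}" for i, name in enumerate(ordered)}

print(tflite_input_name("input"))  # serving_default_input:0
print(tflite_output_names(["output1", "output2"]))
```

The safest approach is still to verify the mapping once against the converted model (e.g. via `tf.lite.Interpreter.get_output_details()`) rather than relying on the pattern alone.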

How can I make the TFLite input and output names match the ONNX model's?

@PINTO0309

Duplicate of #984.
