Keras model exported as TensorFlow model doesn't work with TensorFlow serving #38
Comments
Could this be an issue with different versions of TF used for training and serving?
It's unlikely. I ensured that both training and serving were done using TensorFlow version 1.5.x. I did have a different error when I trained using TF 1.4 and served using TF 1.5.
The TF Serving images in Kubeflow are all 1.4 currently. I haven't tried creating a saved model with Keras in TF 1.5, but in my experience, creating an Estimator with TF 1.5 caused failures in TF Serving 1.4. I'm not sure whether Keras suffers from a similar compatibility issue, although your case is the reverse. Did you try 1.4 for both training and serving? I was able to run this successfully with TF 1.4 for both training and serving. Here's the client to call the server:
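For context, a generic TF Serving gRPC prediction client (not the specific client referenced above) looks roughly like the sketch below, written here against the current tensorflow-serving-api stubs (1.4-era examples used the older grpc.beta API instead). The server address, model name, input key, and input shape are placeholders:

```python
# Generic TF Serving gRPC prediction client sketch -- not the client
# referenced above. Address, model name, input key, and shape are placeholders.
import grpc
import numpy as np
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

channel = grpc.insecure_channel('localhost:9000')
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = 'encoder'                    # placeholder model name
request.model_spec.signature_name = 'serving_default'

# Placeholder input: one tokenized, padded sequence of length 55.
tokens = np.zeros((1, 55), dtype=np.int32)
request.inputs['input'].CopyFrom(
    tf.make_tensor_proto(tokens, shape=list(tokens.shape)))

response = stub.Predict(request, 10.0)  # 10-second timeout
print(response.outputs)
```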
The notebook link above is broken after a GitHub update. I have an updated notebook here, which covers not only creating servable models but also some guidance on unit testing the input-output API: https://github.com/google-aai/tf-serving-k8s-tutorial/blob/master/keras_training_to_serving_solution.ipynb
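As a rough illustration of that kind of input-output check, the exported SavedModel can be reloaded in a fresh TF 1.x session and probed with a fake batch; the export directory, signature keys, and shapes below are assumptions for illustration rather than values from the notebook:

```python
# Sketch of an input/output sanity check on an exported SavedModel (TF 1.x).
# export_dir, signature keys, and shapes are illustrative assumptions.
import numpy as np
import tensorflow as tf

export_dir = './export/encoder/1'  # hypothetical export path

with tf.Session(graph=tf.Graph()) as sess:
    meta_graph = tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], export_dir)
    signature = meta_graph.signature_def['serving_default']

    input_name = signature.inputs['input'].name     # assumed input key
    output_name = signature.outputs['output'].name  # assumed output key

    fake_batch = np.zeros((2, 55), dtype=np.int32)  # placeholder tokenized batch
    result = sess.run(output_name, feed_dict={input_name: fake_batch})

    # Expect one encoding per input sequence.
    assert result.shape[0] == 2
```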
Created an issue: #48
I don't think we ever solved this, did we? Have there been any updates that would make this straightforward?
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
I am using the Keras model as defined in this tutorial: https://github.com/hamelsmu/Seq2Seq_Tutorial/blob/master/notebooks/Tutorial.ipynb
I extracted the encoder model using `extract_encoder_model` and exported it as a TensorFlow model. When used with TensorFlow Serving, I get the following error:
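For reference, a common TF 1.x pattern for turning a Keras model like this encoder into a SavedModel that TF Serving can load is sketched below; the export path, the 'input'/'output' signature keys, and the single-input/single-output assumption are illustrative, not taken from the tutorial's export code:

```python
# Sketch: exporting a Keras model as a TF 1.x SavedModel for TF Serving.
# `encoder_model` stands in for the model returned by extract_encoder_model;
# the export path and signature keys are illustrative assumptions.
import tensorflow as tf
from keras import backend as K

def export_for_serving(encoder_model, export_dir='./export/encoder/1'):
    sess = K.get_session()  # session holding the trained Keras weights
    builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
    signature = tf.saved_model.signature_def_utils.predict_signature_def(
        inputs={'input': encoder_model.input},
        outputs={'output': encoder_model.output})
    builder.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
                signature})
    builder.save()
```

With this directory layout, pointing TF Serving's model base path at `./export/encoder` lets it pick up version `1` automatically.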