
Docker image with prebuilt tensorflow_model_server, ready to run. #513

Closed
stianlp opened this issue Jul 7, 2017 · 9 comments

Comments

@stianlp

stianlp commented Jul 7, 2017

It looks like the most common way to build and deploy models is to build tensorflow_model_server locally. With Docker, one would follow these steps:

  1. Get Dockerfile.devel from the tensorflow/serving repository
  2. Build the image
  3. Run the image and, inside the container, build the server: bazel build -c opt //tensorflow_serving/model_servers:tensorflow_model_server
  4. Commit the container to a new image (or serve models from the container, where the model server is now built)
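The steps above could be sketched as shell commands like these (the image and container names are illustrative, and the Dockerfile path within the repo is an assumption, since it has moved between releases):

```shell
# 1. Fetch the development Dockerfile (path is an assumption)
curl -O https://raw.githubusercontent.com/tensorflow/serving/master/tensorflow_serving/tools/docker/Dockerfile.devel

# 2. Build the development image
docker build -t tf-serving-devel -f Dockerfile.devel .

# 3. Run the image and build the model server inside the container
docker run -it --name serving_build tf-serving-devel bash
# inside the container:
#   bazel build -c opt //tensorflow_serving/model_servers:tensorflow_model_server

# 4. Commit the container, which now holds the built binary, to a new image
docker commit serving_build tf-serving-built
```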

Wouldn't it be better if there were an image available on Docker Hub that is ready to run, where you just provide the model path as a parameter?

Something like: docker run -d -p <port>:<port> -v /path/to/model:/path/to/model/should/be/in/container image-where-tf-model-server-is-built

The CMD instruction for this image would be something like: bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --port=<your-port> --model_name=<your-model-name> --model_base_path=<your-base-path>

where the <your-*> values are passed as parameters to the docker run command.
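One way such an image could wire those parameters through is via environment variables consumed by the CMD. A hypothetical sketch, where the base image name `tf-serving-built`, the variable names, and the default port are all assumptions, not an existing image:

```shell
# Write a hypothetical Dockerfile for a ready-to-run model server image
cat > Dockerfile <<'EOF'
FROM tf-serving-built
ENV MODEL_NAME=default MODEL_BASE_PATH=/models PORT=9000
CMD bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server \
    --port=${PORT} --model_name=${MODEL_NAME} --model_base_path=${MODEL_BASE_PATH}
EOF

# Then (commands commented out, since they need a Docker daemon):
# docker build -t tf-serving-ready .
# docker run -d -p 9000:9000 -v /path/to/model:/models \
#   -e MODEL_NAME=my_model tf-serving-ready
```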

Is there a good reason why such an image does not exist already? I mean, the only thing needed to serve a model is the built model_server, or am I missing something?

@chrisolston
Contributor

Thanks for the suggestion. We "get" that this would be useful. We are a small team and thus far have not had the resources to maintain an offering along those lines. We are considering whether we can support something like that going forward, but it's a complicated question on our side.

@stianlp
Author

stianlp commented Jul 10, 2017

Alright! Thanks for taking it into consideration :)

@bhack

bhack commented Jul 23, 2017

@stianlp It could be very useful, but we need to start thinking of Serving and its ecosystem as community-contribution-driven projects. As @chrisolston confirmed, the resources allocated here are very small, and I strongly believe in #311 (comment)

@stianlp
Author

stianlp commented Jul 31, 2017

Thanks for the feedback, folks! We built a Docker image at work, and I decided to open source it, so if anyone needs a quick way to host a model, feel free to pull. It doesn't support GPUs or anything, but it works as a simple way to host your models.

https://hub.docker.com/r/epigramai/model-server/

@gauravkaila

gauravkaila commented Dec 5, 2017

Having faced issues compiling TensorFlow Serving in the Docker container, I built a Docker image for the CPU version of TF Serving (available at https://hub.docker.com/r/gauravkaila/tf_serving_cpu/). You can pull the image with:

docker pull gauravkaila/tf_serving_cpu

To start the container,

docker run --name=tf_container_cpu -it -p 9000:9000 gauravkaila/tf_serving_cpu

To start the server (within the container),

bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --port=9000 --model_name=${model_name} --model_base_path=${absolute_path_of_models} &> ${log_file} &

UPDATE

For GPU support, use the following image,

docker pull gauravkaila/tf_serving_gpu
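Put together, a non-interactive CPU session based on the commands above might look like this sketch (the volume mount, model name, and paths are assumptions about where you keep your exported models, not something the image documents):

```shell
docker pull gauravkaila/tf_serving_cpu

# Start the container, mounting a host directory of exported models
# (the -v mount is an assumption; adjust to your layout)
docker run --name=tf_container_cpu -d -p 9000:9000 \
  -v /path/to/models:/models gauravkaila/tf_serving_cpu

# Launch the server inside the running container
docker exec -d tf_container_cpu \
  bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server \
  --port=9000 --model_name=my_model --model_base_path=/models/my_model
```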

@jlewi

jlewi commented Feb 24, 2018

Kubeflow is planning on publishing and maintaining TFServing docker images as part of its prediction story.

@jlewi
Copy link

jlewi commented May 31, 2018

The images published by Kubeflow are available at

gcr.io/kubeflow-images-public

We publish both CPU and GPU images for TF versions 1.4 to 1.8.

We aim to support new versions of TF Serving as they become available, but we make no guarantees about how soon after a TF Serving release the images will be published.

The source is here
https://github.com/kubeflow/kubeflow/tree/master/components/k8s-model-server/images

@gautamvasudevan
Collaborator

We now have these as both dockerfiles and published to https://hub.docker.com/r/tensorflow/serving
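Following the usage documented for that image (gRPC on port 8500, REST API on port 8501), serving a model looks roughly like this; the host path and model name are placeholders, and the request body depends on your model's signature:

```shell
docker pull tensorflow/serving

# Serve a SavedModel from the host over the REST API
docker run -d -p 8501:8501 \
  --mount type=bind,source=/path/to/my_model,target=/models/my_model \
  -e MODEL_NAME=my_model tensorflow/serving

# Query it (request body shown is illustrative):
# curl -d '{"instances": [[1.0, 2.0]]}' \
#   http://localhost:8501/v1/models/my_model:predict
```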
