
Torchserve REST API vs Flask REST API #581

Closed
MichaelMMeskhi opened this issue Aug 1, 2020 · 3 comments
Assignees
Labels
triaged_wait (Waiting for the Reporter's resp), Why_should_I_use_serve (Questions around the competitiveness of the product)

Comments

@MichaelMMeskhi

I am trying to serve a model on a server. Apart from doing inference, I want to be able to send inference results to a SQL database as well as back to the request. Basically, I want to add custom API methods. Is this done via handler.py, or should I just use Flask to get this done? What are the major differences between TorchServe and Flask?

@MichaelMMeskhi MichaelMMeskhi changed the title Torchserve REST API vs Flask REST APIT Torchserve REST API vs Flask REST API Aug 1, 2020
@harshbafna
Contributor

@MichaelMMeskhi: There are a couple of ways you can handle this scenario:

  • Handle at the client end: You can write a simple script to collect the inference response returned by TorchServe and execute the required SQL query.

  • Handle at the server end: Execute the required SQL query in the postprocess method of your handler.

    • This will require your database-specific Python module to be installed on the same machine where TorchServe is running. You can use the new TorchServe feature that allows users to supply a model-specific requirements.txt file containing a list of third-party Python dependency packages. Refer to the documentation for more details. Note that this feature is only available in the latest master as of now and will be available in the next release.
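As a rough illustration of the server-end option, the snippet below sketches a postprocess step that logs each prediction to a SQL database (sqlite3 here for simplicity) before returning the response. In a real deployment this logic would sit inside the postprocess method of a custom TorchServe handler; the table name, schema, and database path are hypothetical, and your actual database driver would be listed in the model's requirements.txt.

```python
import json
import sqlite3


def postprocess(predictions, db_path=":memory:"):
    """Persist inference results to a SQL table, then return them unchanged.

    `predictions` is the list of per-request outputs that a TorchServe
    handler would normally return; `inference_log` is a hypothetical table.
    """
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS inference_log (payload TEXT)")
    for pred in predictions:
        conn.execute(
            "INSERT INTO inference_log (payload) VALUES (?)",
            (json.dumps(pred),),
        )
    conn.commit()
    conn.close()
    # TorchServe expects postprocess to return one entry per request.
    return predictions


# In a custom handler (assumed structure, not verbatim TorchServe API):
#
# class MyHandler(BaseHandler):
#     def postprocess(self, data):
#         return postprocess(data, db_path="/path/to/results.db")
```

For the client-end option, the equivalent would be a small script that calls the TorchServe inference endpoint and runs the same INSERT against your database.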

@harshbafna harshbafna self-assigned this Aug 1, 2020
@harshbafna harshbafna added the triaged_wait Waiting for the Reporter's resp label Aug 1, 2020
@MichaelMMeskhi
Author

@harshbafna Thank you for your feedback! I am looking to handle this on the server end. So basically, customizing the postprocess method to do whatever I need after prediction is the best option I see.

@harshbafna
Contributor

Closing as the question has been answered.

@msaroufim msaroufim added the Why_should_I_use_serve Questions around the competitiveness of the product label May 13, 2021
3 participants