
Support binary input fields #2

Open
Andrew-Crosby opened this issue May 3, 2022 · 1 comment

Comments

@Andrew-Crosby

JSON does not natively support byte data, so I don't think it's possible to deploy a model with a binary input field.
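The incompatibility is easy to demonstrate: Python's standard `json` module (like JSON serializers generally) refuses to serialize raw bytes, so a payload with a binary field cannot be sent as-is.

```python
import json

# Hypothetical request payload with a raw-bytes field (e.g. an image column).
payload = {"image": b"\x89PNG\r\n\x1a\n"}

try:
    json.dumps(payload)
except TypeError as exc:
    print(exc)  # Object of type bytes is not JSON serializable
```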

@bloomonkey
Copy link
Contributor

Thanks for reporting @Andrew-Crosby. In an ideal world, how would you want this to be supported? The options that occur to me are:

1. POST the binary as a simple blob (i.e. with `Content-Type: application/octet-stream` instead of `application/json`)
   - For: simplest to implement in the model and client(s)
   - Against: the API does not directly reflect the model's input schema
   - Against: precludes support for models with multiple inputs
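A client-side sketch of this option, assuming a hypothetical local `/invocations` endpoint (the URL and payload are illustrative):

```python
import urllib.request

# Stand-in for real binary content, e.g. the start of a PNG file.
payload = b"\x89PNG\r\n\x1a\n"

request = urllib.request.Request(
    "http://localhost:5000/invocations",  # hypothetical serving URL
    data=payload,
    headers={"Content-Type": "application/octet-stream"},
    method="POST",
)
# urllib.request.urlopen(request) would send the blob as-is: no JSON
# wrapping, but also no way to carry a second input field alongside it.
```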

2. Assume binary data columns/fields will be sent base64-encoded
   - For: retains JSON as the API and `Content-Type`, and so supports multiple inputs
   - Against: the API does not directly reflect the model's input schema
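A minimal round-trip sketch of this option: the client base64-encodes the binary field so it can travel inside an ordinary JSON body, alongside other inputs. The `columns`/`data` layout and field names here are illustrative, not a committed API.

```python
import base64
import json

raw = b"\x89PNG\r\n\x1a\n"  # hypothetical binary field contents

# Client side: encode the bytes column; a second, non-binary input
# ("threshold") can ride along in the same JSON body.
body = json.dumps({
    "columns": ["image", "threshold"],
    "data": [[base64.b64encode(raw).decode("ascii"), 0.5]],
})

# Server side: reverse the encoding before handing data to the model.
decoded = base64.b64decode(json.loads(body)["data"][0][0])
assert decoded == raw
```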

3. Explicitly state that binary is not supported, and suggest that the MLflow model be modified to decode a base64-encoded column
   - For: retains JSON as the API and `Content-Type`, and supports multiple inputs
   - Against: forces the MLflow model to address a concern specific to real-time inference up-front
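Under this option, the decode step would live inside the model author's own code rather than the serving layer. A sketch of such a step, as it might appear inside an MLflow model's `predict()` (the helper and the `image` column name are hypothetical):

```python
import base64

def decode_b64_columns(record, binary_columns=("image",)):
    """Return a copy of the record with base64-encoded columns decoded to bytes.

    `binary_columns` is an illustrative assumption: the model author must
    know up-front which fields arrive base64-encoded.
    """
    return {
        key: base64.b64decode(value) if key in binary_columns else value
        for key, value in record.items()
    }

record = {
    "image": base64.b64encode(b"\x00\x01\x02").decode("ascii"),
    "label": "cat",
}
decoded = decode_b64_columns(record)
assert decoded["image"] == b"\x00\x01\x02"
assert decoded["label"] == "cat"
```

This keeps the serving layer untouched, at the cost noted above: batch users of the same model would carry an encode/decode round-trip they never needed.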
