[POC][OV] Support OpenVINO as Keras 3 backend #19727
base: master
Conversation
Signed-off-by: Kazantsev, Roman <[email protected]>
Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA). View this failed invocation of the CLA check for more information. For the most up to date status, view the checks section at the bottom of the pull request.
Thanks for the PR!
@@ -81,6 +81,32 @@ def __init__(self, inputs, outputs, name=None):
        self._nodes_by_depth = nodes_by_depth
        self._operations = operations
        self._operations_by_depth = operations_by_depth
        if backend() == "openvino":
We should not add backend-specific modifications to shared abstractions.
Hi @fchollet, OpenVINO does not support eager mode for inference. We have to build the OV graph beforehand and run inference on the whole graph, so I decided to construct a graph in `init` and use it for `call`. Please propose how to avoid backend-specific code here given this OpenVINO specificity.
Just disable what doesn't work, with a clear error message, at the level of the openvino backend. If openvino is only usable via `evaluate` and `predict`, that's ok.
Though tbh, since this backend is inference-only and doesn't have eager support, it sounds like maybe it should not be a backend, but instead an export format in `model.export()`.
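The "clear error message at the backend level" suggestion above could look roughly like this; a minimal sketch assuming a module-level guard inside the openvino backend (the helper name, the wrapped function, and the error wording are illustrative, not from the PR):

```python
# Hypothetical sketch: an inference-only backend failing fast on eager ops.
# All names here are assumptions for illustration, not the PR's actual code.

def _unsupported_eager(op_name):
    # raise a descriptive error instead of silently misbehaving
    raise NotImplementedError(
        f"The openvino backend is inference-only and does not support "
        f"eager execution of '{op_name}'. Use Model.predict() or "
        f"Model.evaluate() instead."
    )

def convert_to_tensor(x, dtype=None):
    # any eager entry point simply delegates to the guard
    _unsupported_eager("convert_to_tensor")
```

Every eager entry point in the backend module would delegate to the same guard, so the unsupported surface stays documented in one place.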
@fchollet, please clarify. Your point is to exclude this backend-specific code and configure the backend via `_...` variables, and this configuration should tell Keras 3 that OpenVINO only works in non-eager mode. Am I correct? I just want ops helpers to work only with symbolic tensors, `ov::Output` or `ov::Node` instances (for graph construction). And I want this graph to be constructed and compiled for a device only once per device and input shape specification. Is that possible in Keras 3? Should it be done in the OpenVINOTrainer class? And what methods for graph construction and compilation should be implemented in that class?
Or is your point only to implement the `export` method? That would only allow inference using the OpenVINO API after model loading, not Keras 3. Please correct me.
So there are two entirely separate options:
- Make an openvino backend. If we do this, it should have roughly the same scope of functionality as the numpy backend. It should "fit" the backend format. I am not sure this is feasible if openvino has no support for eager execution.
- Make `model.export()`/`ExportArchive` able to save a model in the openvino format (the model would be coming from either Keras + JAX, Keras + torch, or Keras + TF). That way folks can train their models in another backend, then export them to openvino.
Here is an example of how `compiled_model` is created:

import numpy as np
import openvino.runtime.opset9 as ov
from openvino.runtime import Model, Core

# create a model with eltwise divide
x = ov.parameter([1], name="x", dtype=np.int32)
y = ov.parameter([1], name="y", dtype=np.int32)
divide = ov.divide(x, y)
ov_model = Model([divide], [x, y], "model")

# compile the model for CPU device
core = Core()
compiled_model = core.compile_model(ov_model, 'CPU')
In the first step, we build the `ov::Model` object `ov_model`, and then compile it for CPU.
@fchollet, maybe I can set up a meeting with you on Discord (@rkazants on Discord) to discuss further steps and a potential solution?
Really appreciate your time and help,
Roman
And then you call the `compiled_model` on numpy data?
That sort of logic would belong in the `Trainer` class. You can try prototyping an OpenVINO Trainer that works like this.
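The Trainer-level shape being suggested here might look roughly like this; a minimal sketch in which `compile_model_fn` stands in for the real OpenVINO graph construction and compilation, and all names are illustrative rather than taken from the PR:

```python
import numpy as np

# Hypothetical sketch of the Trainer-level logic: compile the graph once,
# then reuse the compiled callable for every predict step. The stand-in
# names (OpenVINOTrainerSketch, compile_model_fn) are assumptions.

class OpenVINOTrainerSketch:
    def __init__(self, compile_model_fn):
        self._compile_model_fn = compile_model_fn
        self._compiled = None  # compiled lazily, exactly once

    def predict_step(self, batch):
        if self._compiled is None:
            # graph construction + compilation happen on first use
            self._compiled = self._compile_model_fn()
        return self._compiled(batch)

# Usage with a trivial stand-in for a compiled model:
trainer = OpenVINOTrainerSketch(lambda: (lambda b: b * 2))
out = trainer.predict_step(np.array([1, 2, 3]))
```

The key property is that compilation cost is paid once per device and input shape, matching the constraint rkazants describes above.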
> And then you call the `compiled_model` on numpy data?

Exactly, it calls `compiled_model` on numpy data for prediction.
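Put together, the round trip can be sketched end to end; a hedged example that falls back to an equivalent plain-numpy computation when the `openvino` package is not installed (the fallback branch is an assumption for illustration, not part of the PR):

```python
import numpy as np

try:
    import openvino.runtime.opset9 as ov
    from openvino.runtime import Model, Core

    # build and compile the divide graph once
    x = ov.parameter([1], name="x", dtype=np.int32)
    y = ov.parameter([1], name="y", dtype=np.int32)
    ov_model = Model([ov.divide(x, y)], [x, y], "model")
    compiled_model = Core().compile_model(ov_model, "CPU")

    # the compiled model is called directly on numpy inputs and
    # returns a dict keyed by output ports
    outputs = compiled_model([np.array([10], dtype=np.int32),
                              np.array([2], dtype=np.int32)])
    result = list(outputs.values())[0]
except ImportError:
    # openvino not installed: equivalent integer division in numpy
    result = np.array([10], dtype=np.int32) // np.array([2], dtype=np.int32)
```

This is the pattern the conversation converges on: graph construction and compilation happen once, and only the final call touches numpy data.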
If we implement a backend like this, it won't be usable eagerly, but only through `Model.evaluate()` and `Model.predict()`. That's quite limiting. It's still feasible, though. We won't be able to test it on CI, since most of our tests need eager mode; instead we'd need a new set of integration tests specifically for this backend.
For inference, such functionality is sufficient :) Let me try to do this.
Best regards,
Roman
@@ -289,6 +291,9 @@ def __init__(
        self._convert_input_args = True
        # Whether to allow non-tensors as positional arguments in `call()`.
        self._allow_non_tensor_positional_args = False
        if backend.backend() == "openvino":
This should not be backend-specific.
Codecov Report. Attention: Patch coverage is
Additional details and impacted files
@@ Coverage Diff @@
## master #19727 +/- ##
==========================================
- Coverage 78.91% 77.83% -1.08%
==========================================
Files 510 522 +12
Lines 48590 49303 +713
Branches 8960 9033 +73
==========================================
+ Hits 38344 38377 +33
- Misses 8391 9071 +680
Partials 1855 1855
Flags with carried forward coverage won't be shown. ☔ View full report in Codecov by Sentry.
Hi @rkazants, can you please resolve the conflicts? Thank you!
I am working on this PR. I am trying to resolve comments. Best regards,
Hi @rkazants, any update on this PR? Please. Thank you!
This PR is stale because it has been open for 14 days with no activity. It will be closed if no further activity occurs. Thank you. |
I am just back from vacation. I plan to continue from the next week. |
This PR is stale because it has been open for 14 days with no activity. It will be closed if no further activity occurs. Thank you. |
Hi @rkazants, any update on this PR? Please. Thank you!
Details: Support OpenVINO as a Keras 3 backend. This is an inference-only backend. To enable it, define the environment variable as follows:
os.environ["KERAS_BACKEND"] = "openvino"
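For example, the variable must be set before Keras is first imported, since the backend is chosen at import time; a minimal sketch (the `import keras` line is left commented because it requires a Keras build with this PR applied):

```python
import os

# select the backend before importing keras;
# KERAS_BACKEND is read once, at import time
os.environ["KERAS_BACKEND"] = "openvino"

# import keras  # would now use the openvino backend (requires this PR)
```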