crate::core::Core::read_model_from_buffer example? #136
It is not very obvious how to create a model from memory. Specifically, I couldn't figure out how to make a `Tensor` from a `Vec<u8>` read from a file. The most obvious way to achieve that is `Tensor::from_ptr`, but it is only `pub(crate)`.

Comments
Thanks for the report; take a look at #137 and let me know if that explains things well enough.
Thanks for the swift reply! The example is great but still does not cover my particular use case. My goal is to build the model entirely from buffers already in memory:

```rust
use openvino::Core;
use std::fs;

fn main() -> anyhow::Result<()> {
    let mut core = Core::new()?;
    // imagine graph and weights are instead downloaded over the network
    let graph = fs::read("model.xml")?;
    let weights = fs::read("model.bin")?;
    let weights_tensor = todo!(); // change me
    let model = core.read_model_from_buffer(&graph, Some(&weights_tensor))?;
    Ok(())
}
```

I guess your example works, but then how do I get the shape of the weights tensor? Should I parse the graph file?
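A minimal sketch of one way to fill in the `todo!()` above. It assumes the crate exposes `Shape::new`, `Tensor::new(ElementType::U8, &shape)`, and a mutable raw-byte accessor such as `Tensor::get_raw_data_mut` (exact names and signatures may differ between openvino crate versions), and it is a sketch rather than necessarily the approach merged in #137/#138. As far as I understand OpenVINO, the weights buffer is treated as an opaque blob, so a one-dimensional U8 tensor whose length is the byte length of model.bin is sufficient and the graph file does not need to be parsed for a shape:

```rust
use openvino::{Core, ElementType, Shape, Tensor};
use std::fs;

fn main() -> anyhow::Result<()> {
    let mut core = Core::new()?;
    // Imagine graph and weights are instead downloaded over the network.
    let graph = fs::read("model.xml")?;
    let weights = fs::read("model.bin")?;

    // Assumed API: wrap the raw .bin bytes in a flat U8 tensor whose single
    // dimension is the byte length, then copy the bytes into the tensor's
    // backing buffer. No shape information from model.xml is needed.
    let shape = Shape::new(&[weights.len() as i64])?;
    let mut weights_tensor = Tensor::new(ElementType::U8, &shape)?;
    weights_tensor.get_raw_data_mut()?.copy_from_slice(&weights);

    let _model = core.read_model_from_buffer(&graph, Some(&weights_tensor))?;
    Ok(())
}
```

Whether the shape should be `[len]` or `[1, len]` does not appear to matter for an opaque byte blob, but that is worth checking against the doc example merged in #137.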
Ah, I see the missing link. What I put up in #137 is what you need in order to fill out the `todo!()`. I'll merge #137 since that is helpful regardless, but maybe we should use your snippet as a doc example on `read_model_from_buffer`.
Okay, so I wrote a small example test for reading a model from buffers: #138. But I think I stumbled into an upstream bug in the newer versions of openvino: passing [...]. But it also shows that having an honest-to-god, not-failing test is valuable here, so this is where I think it belongs.