
[Lot of people have this issue] The model can not "really" do inference; Loading issue is solved. #10990

Closed · JonathanSum opened this issue Mar 24, 2022 · 28 comments
Labels: platform:mobile (issues related to ONNX Runtime mobile; typically submitted using template)

@JonathanSum (Author) commented Mar 24, 2022

#10302
This is not the only place it was mentioned; other users have reported the same issue:
https://www.npmjs.com/package/onnxruntime-react-native
I also tried converting the model to the ORT file format, and it still doesn't work.

[screenshot]
Reproduce repo: https://github.com/JonathanSum/RN_Testing
[screenshot]

@RandySheriffH RandySheriffH added the platform:mobile issues related to ONNX Runtime mobile; typically submitted using template label Mar 24, 2022
@JonathanSum (Author)

I am not sure what these three parts do. For example, is the build folder something the user creates?
I hope there will be a video that explains this in more detail.
[screenshot]

@JonathanSum (Author)

[screenshot]
This folder does not exist. The gradlew

@JonathanSum (Author)

Reproducible example: https://github.com/JonathanSum/RN_Testing

@JonathanSum JonathanSum changed the title The "onnxruntime-react-native" just can not find the model or load the model [Lot of people have this issue] The "onnxruntime-react-native" just can not find the model or load the model Mar 25, 2022
@JonathanSum (Author) commented Mar 25, 2022

Is it possible to get a step-by-step video on how to set up React Native ONNX model loading and inference?

@JonathanSum (Author)

[screenshot]

@JonathanSum (Author)

Same Issue: #9594

@JonathanSum JonathanSum changed the title [Lot of people have this issue] The "onnxruntime-react-native" just can not find the model or load the model [Lot of people have this issue] The "onnxruntime-react-native" just can not find the model or load the model; Example here even used Java code to get over it. Mar 25, 2022
@JonathanSum JonathanSum changed the title [Lot of people have this issue] The "onnxruntime-react-native" just can not find the model or load the model; Example here even used Java code to get over it. [Lot of people have this issue] The "onnxruntime-react-native" just can not find the model or load the model; Repo code even uses Java code to hide this issue. Mar 25, 2022
@JonathanSum JonathanSum changed the title [Lot of people have this issue] The "onnxruntime-react-native" just can not find the model or load the model; Repo code even uses Java code to hide this issue. [Lot of people have this issue] The "onnxruntime-react-native" just can not find the model or load the model; Repo code even uses Java code to get over this issue. Mar 25, 2022
@JonathanSum JonathanSum reopened this Mar 25, 2022
@JonathanSum JonathanSum changed the title [Lot of people have this issue] The "onnxruntime-react-native" just can not find the model or load the model; Repo code even uses Java code to get over this issue. [Lot of people have this issue] The "onnxruntime-react-native" just can not find the model or load the model; Mar 25, 2022
@JonathanSum JonathanSum reopened this Mar 26, 2022
@FFPTech-Sebastien

Hey! Did you find any solution?

@JonathanSum JonathanSum reopened this Mar 26, 2022
@JonathanSum JonathanSum changed the title [Lot of people have this issue] The "onnxruntime-react-native" just can not find the model or load the model; [Lot of people have this issue] The "onnxruntime-react-native" just can not find the model or load the model, The model can not do inference; Mar 27, 2022
@FFPTech-Sebastien

Could you please provide the code where you're able to load the model?

@JonathanSum (Author) commented Mar 29, 2022

I deleted my post, but I still want to repost it.
I was able to load the model, but the inference part is what got me.
The docs say ONNX Runtime for React Native does not support unsigned tensor types, but as far as I know JavaScript does not distinguish signed from unsigned data types the way C++ does. I tried inputting a matrix of all ones into the mnist.ort model, and it failed.

        // Assumes: import { InferenceSession, Tensor } from 'onnxruntime-react-native';
        let modelPath = null;
        let session = null;
        try {
            modelPath = await ONNX.getLocalModelPath();
            console.log('found the model file path:');
            console.log(modelPath);
            session = await InferenceSession.create(modelPath);

            const dims = [1, 1, 28, 28];
            const float32Data = new Float32Array(784).fill(1.0); // dummy all-ones input

            const inputTensor = new Tensor('float32', float32Data, dims);

            console.log(session.inputNames);
            console.log(session.outputNames);
            console.log(inputTensor.dims);
            console.log(inputTensor.size);

            const feeds = {};
            feeds[session.inputNames[0]] = inputTensor;
            const output = await session.run(feeds); // run() returns a Promise
            console.log(output);
        } catch (e) {
            console.log('Error: could not find the model file path');
            console.error(e);
        }

        console.log(DEFAULT_EVENT_NAME);

    };

@JonathanSum (Author)

[screenshot]
2022-03-30 10:48:58.264 9869-25771/com.awesomeproject D/Duration: createInputTensor: 6
2022-03-30 10:48:58.413 9869-25771/com.awesomeproject A/.awesomeprojec: runtime.cc:655] native: #20 pc 00000000005a45e4 /apex/com.android.art/lib64/libart.so (art::Thread::CreateCallback(void*)+1308)
runtime.cc:655] native: #21 pc 00000000000b0048 /apex/com.android.runtime/lib64/bionic/libc.so (__pthread_start(void*)+64)
runtime.cc:655] native: #22 pc 00000000000503c8 /apex/com.android.runtime/lib64/bionic/libc.so (__start_thread+64)
runtime.cc:655] at com.facebook.flipper.android.EventBase.loopForever(Native method)
runtime.cc:655] at com.facebook.flipper.android.FlipperThread.run(FlipperThread.java:31)
runtime.cc:655]
runtime.cc:655] "FlipperConnectionThread" prio=4 tid=22 Native
runtime.cc:655] | group="" sCount=1 dsCount=0 flags=1 obj=0x13c00d90 self=0xb4000072eeaaf760
runtime.cc:655] | sysTid=25743 nice=10 cgrp=default sched=0/0 handle=0x710e7abcc0
runtime.cc:655] | state=S schedstat=( 5708071 6459218 22 ) utm=0 stm=0 core=6 HZ=100
runtime.cc:655] | stack=0x710e6a8000-0x710e6aa000 stackSize=1043KB
runtime.cc:655] | held mutexes=
runtime.cc:655] native: #00 pc 000000000009c238 /apex/com.android.runtime/lib64/bionic/libc.so (__epoll_pwait+8)
runtime.cc:655] native: #1 pc 0000000000021e10 /data/data/com.awesomeproject/lib-main/libevent_core-2.1.so (???)
runtime.cc:655] native: #2 pc 00000000000170cc /data/data/com.awesomeproject/lib-main/libevent_core-2.1.so (event_base_loop+708)
runtime.cc:655] native: #3 pc 0000000000233c94 /data/data/com.awesomeproject/lib-main/libflipper.so (folly::EventBase::loopBody(int, bool)+688)
runtime.cc:655] native: #4 pc 0000000000234cb8 /data/data/com.awesomeproject/lib-main/libflipper.so (folly::EventBase::loopForever()+36)
runtime.cc:655] native: #5 pc 000000000018817c /data/data/com.awesomeproject/lib-main/libflipper.so (???)
runtime.cc:655] native: #6 pc 000000000013ced4 /apex/com.android.art/lib64/libart.so (art_quick_generic_jni_trampoline+148)
runtime.cc:655] native: #7 pc 0000000000133564 /apex/com.android.art/lib64/libart.so (art_quick_invoke_stub+548)
runtime.cc:655] native: #8 pc 00000000001a8a78 /apex/com.android.art/lib64/libart.so (art::ArtMethod::Invoke(art::Thread*, unsigned int*, unsigned int, art::JValue*, char const*)+200)
runtime.cc:655] native: #9 pc 0000000000318498 /apex/com.android.art/lib64/libart.so (art::interpreter::ArtInterpreterToCompiledCodeBridge(art::Thread*, art::ArtMethod*, art::ShadowFrame*, unsigned short, art::JValue*)+376)
runtime.cc:655] native: #10 pc 000000000030e7c4 /apex/com.android.art/lib64/libart.so (bool art::interpreter::DoCall<false, false>(art::ArtMethod*, art::Thread*, art::ShadowFrame&, art::Instruction const*, unsigned short, art::JValue*)+996)
runtime.cc:655] native: #11 pc 000000000067cb50 /apex/com.android.art/lib64/libart.so (MterpInvokeVirtual+848)
runtime.cc:655] native: #12 pc 000000000012d814 /apex/com.android.art/lib64/libart.so (mterp_op_invoke_virtual+20)
runtime.cc:655] native: #13 pc 0000000000251280 [anon:dalvik-classes.dex extracted in memory from /data/app/~~S6Tnotkz_tAqHvkpXQ_wNQ==/com.awesomeproject-Flj5aIPCqDxecv5FCDO96Q==/base.apk] (com.facebook.flipper.android.FlipperThread.run+40)
runtime.cc:655] native: #14 pc 0000000000305dc0 /apex/com.android.art/lib64/libart.so (art::interpreter::Execute(art::Thread*, art::CodeItemDataAccessor const&, art::ShadowFrame&, art::JValue, bool, bool) (.llvm.3728063326046250456)+268)
runtime.cc:655] native: #15 pc 000000000066b858 /apex/com.android.art/lib64/libart.so (artQuickToInterpreterBridge+780)
runtime.cc:655] native: #16 pc 000000000013cff8 /apex/com.android.art/lib64/libart.so (art_quick_to_interpreter_bridge+88)
runtime.cc:655] native: #17 pc 0000000000133564 /apex/com.android.art/lib64/libart.so (art_quick_invoke_stub+548)
runtime.cc:655] native: #18 pc 00000000001a8a78 /apex/com.android.art/lib64/libart.so (art::ArtMethod::Invoke(art::Thread*, unsigned int*, unsigned int, art::JValue*, char const*)+200)
2022-03-30 10:48:58.413 9869-25771/com.awesomeproject A/.awesomeprojec: runtime.cc:655] native: #19 pc 0000000000555234 /apex/com.android.art/lib64/libart.so (art::JValue art::InvokeVirtualOrInterfaceWithJValuesart::ArtMethod*(art::ScopedObjectAccessAlreadyRunnable const&, _jobject*, art::ArtMethod*, jvalue const*)+460)
runtime.cc:655] native: #20 pc 00000000005a45e4 /apex/com.android.art/lib64/libart.so (art::Thread::CreateCallback(void*)+1308)
runtime.cc:655] native: #21 pc 00000000000b0048 /apex/com.android.runtime/lib64/bionic/libc.so (__pthread_start(void*)+64)
runtime.cc:655] native: #22 pc 00000000000503c8 /apex/com.android.runtime/lib64/bionic/libc.so (__start_thread+64)
runtime.cc:655] at com.facebook.flipper.android.EventBase.loopForever(Native method)
runtime.cc:655] at com.facebook.flipper.android.FlipperThread.run(FlipperThread.java:31)
runtime.cc:655]
runtime.cc:655] "OkHttp http://localhost:8081/..." prio=5 tid=23 Native
runtime.cc:655] | group="" sCount=1 dsCount=0 flags=1 obj=0x13c00e20 self=0xb4000072eeaabfc0
runtime.cc:655] | sysTid=25748 nice=0 cgrp=default sched=0/0 handle=0x710ce83cc0
runtime.cc:655] | state=S schedstat=( 25061505 13102707 59 ) utm=2 stm=0 core=5 HZ=100
runtime.cc:655] | stack=0x710cd80000-0x710cd82000 stackSize=1043KB
runtime.cc:655] | held mutexes=
runtime.cc:655] native: #00 pc 000000000009bdf4 /apex/com.android.runtime/lib64/bionic/libc.so (recvfrom+4)
runtime.cc:655] native: #1 pc 0000000000028a2c /apex/com.android.art/lib64/libopenjdk.so (NET_Read+80)
runtime.cc:655] native: #2 pc 00000000000295a4 /apex/com.android.art/lib64/libopenjdk.so (SocketInputStream_socketRead0+216)
runtime.cc:655] at java.net.SocketInputStream.socketRead0(Native method)
runtime.cc:655] at java.net.SocketInputStream.socketRead(SocketInputStream.java:119)
runtime.cc:655] at java.net.SocketInputStream.read(SocketInputStream.java:176)
runtime.cc:655] at java.net.SocketInputStream.read(SocketInputStream.java:144)
runtime.cc:655] at okio.InputStreamSource.read(JvmOkio.kt:91)
runtime.cc:655] at okio.AsyncTimeout$source$1.read(AsyncTimeout.kt:129)
runtime.cc:655] at okio.RealBufferedSource.request(RealBufferedSource.kt:206)
runtime.cc:655] at okio.RealBufferedSource.require(RealBufferedSource.kt:199)
runtime.cc:655] at okio.RealBufferedSource.readByte(RealBufferedSource.kt:209)
runtime.cc:655] at okhttp3.internal.ws.WebSocketReader.readHeader(WebSocketReader.kt:119)
runtime.cc:655] at okhttp3.internal.ws.WebSocketReader.processNextFrame(WebSocketReader.kt:102)
runtime.cc:655] at okhttp3.internal.ws.RealWebSocket.loopReader(RealWebSocket.kt:293)
runtime.cc:655] at okhttp3.internal.ws.RealWebSocket$connect$1.onResponse(RealWebSocket.kt:195)
runtime.cc:655] at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:519)
runtime.cc:655] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1167)
runtime.cc:655] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:641)
runtime.cc:655] at java.lang.Thread.run(Thread.java:923)
runtime.cc:655]
runtime.cc:655] "HybridData DestructorThread" prio=5 tid=24 Waiting
runtime.cc:655] | group="" sCount=1 dsCount=0 flags=1 obj=0x13c03b88 self=0xb4000072eeab9e40
runtime.cc:655] | sysTid=25749 nice=0 cgrp=default sched=0/0 handle=0x710bd79cc0
runtime.cc:655] | state=S schedstat=( 7521714 2191149 10 ) utm=0 stm=0 core=3 HZ=100
runtime.cc:655] | stack=0x710bc76000-0x710bc78000 stackSize=1043KB
runtime.cc:655] | held mutexes=
runtime.cc:655] native: #00 pc 000000000004b48c /apex/com.android.runtime/lib64/bionic/libc.so (syscall+28)
runtime.cc:655] native: #1 pc 00000000001af9e0 /apex/com.android.art/lib64/libart.so (art::ConditionVariable::WaitHoldingLocks(art::Thread*)+164)
runtime.cc:655] native: #2 pc 000000000049a2b0 /apex/com.android.art/lib64/libart.so (art::Monitor::Wait(art::Thread*, long, int, bool, art::ThreadState)+544)
2022-03-30 10:48:58.414 9869-25771/com.awesomeproject A/.awesomeprojec: runtime.cc:655] native: #81 pc 000000000012da14 /apex/com.android.art/lib64/libart.so (mterp_op_invoke_interface+20)
runtime.cc:655] native: #82 pc 00000000000ed090 /apex/com.android.art/javalib/core-oj.jar (java.lang.Thread.run+8)
runtime.cc:655] native: #83 pc 0000000000305dc0 /apex/com.android.art/lib64/libart.so (art::interpreter::Execute(art::Thread*, art::CodeItemDataAccessor const&, art::ShadowFrame&, art::JValue, bool, bool) (.llvm.3728063326046250456)+268)
runtime.cc:655] native: #84 pc 000000000066b858 /apex/com.android.art/lib64/libart.so (artQuickToInterpreterBridge+780)
runtime.cc:655] native: #85 pc 000000000013cff8 /apex/com.android.art/lib64/libart.so (art_quick_to_interpreter_bridge+88)
runtime.cc:655] native: #86 pc 0000000000133564 /apex/com.android.art/lib64/libart.so (art_quick_invoke_stub+548)
runtime.cc:655] native: #87 pc 00000000001a8a78 /apex/com.android.art/lib64/libart.so (art::ArtMethod::Invoke(art::Thread*, unsigned int*, unsigned int, art::JValue*, char const*)+200)
runtime.cc:655] native: #88 pc 0000000000555234 /apex/com.android.art/lib64/libart.so (art::JValue art::InvokeVirtualOrInterfaceWithJValuesart::ArtMethod*(art::ScopedObjectAccessAlreadyRunnable const&, _jobject*, art::ArtMethod*, jvalue const*)+460)
runtime.cc:655] native: #89 pc 00000000005a45e4 /apex/com.android.art/lib64/libart.so (art::Thread::CreateCallback(void*)+1308)
runtime.cc:655] native: #90 pc 00000000000b0048 /apex/com.android.runtime/lib64/bionic/libc.so (__pthread_start(void*)+64)
runtime.cc:655] native: #91 pc 00000000000503c8 /apex/com.android.runtime/lib64/bionic/libc.so (__start_thread+64)
runtime.cc:655] at ai.onnxruntime.OrtSession.run(Native method)
runtime.cc:655] at ai.onnxruntime.OrtSession.run(OrtSession.java:293)
runtime.cc:655] at ai.onnxruntime.reactnative.OnnxruntimeModule.run(OnnxruntimeModule.java:210)
runtime.cc:655] at ai.onnxruntime.reactnative.OnnxruntimeModule.run(OnnxruntimeModule.java:91)
runtime.cc:655] at java.lang.reflect.Method.invoke(Native method)
runtime.cc:655] at com.facebook.react.bridge.JavaMethodWrapper.invoke(JavaMethodWrapper.java:372)
runtime.cc:655] at com.facebook.react.bridge.JavaModuleWrapper.invoke(JavaModuleWrapper.java:188)
runtime.cc:655] at com.facebook.react.bridge.queue.NativeRunnable.run(Native method)
runtime.cc:655] at android.os.Handler.handleCallback(Handler.java:938)
runtime.cc:655] at android.os.Handler.dispatchMessage(Handler.java:99)
runtime.cc:655] at com.facebook.react.bridge.queue.MessageQueueThreadHandler.dispatchMessage(MessageQueueThreadHandler.java:27)
runtime.cc:655] at android.os.Looper.loop(Looper.java:233)
runtime.cc:655] at com.facebook.react.bridge.queue.MessageQueueThreadImpl$4.run(MessageQueueThreadImpl.java:226)
runtime.cc:655] at java.lang.Thread.run(Thread.java:923)
runtime.cc:655] Pending exception ai.onnxruntime.OrtException: Error code - ORT_INVALID_ARGUMENT - message: Invalid rank for input: flatten_2_input Got: 4 Expected: 3 Please fix either the inputs or the model.
runtime.cc:655] at ai.onnxruntime.OnnxValue[] ai.onnxruntime.OrtSession.run(long, long, long, java.lang.String[], long[], long, java.lang.String[], long, long) (OrtSession.java:-2)
runtime.cc:655] at ai.onnxruntime.OrtSession$Result ai.onnxruntime.OrtSession.run(java.util.Map, java.util.Set, ai.onnxruntime.OrtSession$RunOptions) (OrtSession.java:293)
runtime.cc:655] at com.facebook.react.bridge.WritableMap ai.onnxruntime.reactnative.OnnxruntimeModule.run(java.lang.String, com.facebook.react.bridge.ReadableMap, com.facebook.react.bridge.ReadableArray, com.facebook.react.bridge.ReadableMap) (OnnxruntimeModule.java:210)
runtime.cc:655] at void ai.onnxruntime.reactnative.OnnxruntimeModule.run(java.lang.String, com.facebook.react.bridge.ReadableMap, com.facebook.react.bridge.ReadableArray, com.facebook.react.bridge.ReadableMap, com.facebook.react.bridge.Promise) (OnnxruntimeModule.java:91)
2022-03-30 10:48:58.677 25825-25825/? A/DEBUG: #87 pc 00000000005a45e4 /apex/com.android.art/lib64/libart.so (art::Thread::CreateCallback(void*)+1308) (BuildId: e93888dbf43a1db1709c2a9642072ca2)

@JonathanSum (Author)

I am not really sure what the doc is saying:

    React Native library doesn't support these features.
    Unsigned data type at Tensor

There is no unsigned data type in JS, if I remember correctly. Or is it saying the model can only accept numbers ranging from 0 to positive infinity?

@faxu (Contributor) commented Apr 12, 2022

@JonathanSum Since you have opened and closed this issue many times and referenced a few other issues, can you clarify what you need help with at this time?

@JonathanSum (Author) commented Apr 12, 2022

I created a dummy tensor with dimensions dims = [1, 1, 28, 28];
I fed it to the mnist.ort model, and instead of a result it gave me the error above.

In addition, the doc says it does not support unsigned tensor types because of the RN library. I do not understand this. Is it saying JavaScript has an unsigned data type?

Also, I was trying to do NLP question answering, and some models such as BERT QA use -1 to denote "no answer". Does this mean ONNX RN cannot work with those models because they may output -1?

I think the real issue with this ONNX RN library is that it cannot be used like ONNX.js.

That is why we need a tutorial on how to load a model and run inference in React Native for CV, NLP, ASR, and tensor processing. @faxu

And that is why I told @FFPTech-Sebastien this should be explained in more detail by the ONNX team.

@fs-eire (Contributor) commented Apr 18, 2022

Hi @JonathanSum, sorry for the late response. I took some time to reproduce the issue from scratch.

Here are answers to some of your questions:

  • Building for Android requires Java v8 or v11 plus Gradle v6.
  • <ORT_ROOT> means the root folder of the ONNX Runtime repository. As mentioned in the document, if you don't build from source, you can skip this step.
  • The restriction on unsigned integer data types exists because those types are not supported by Java (which we use to build the Android bindings). This only affects the types of model inputs/outputs; if your model does have unsigned integer inputs/outputs, the workaround is to change the model input types to signed integers and route them through a Cast kernel inside the graph.
  • As mentioned above, only unsigned integer data types are unsupported. There is no problem outputting the number -1 (which is a signed integer, usually int32).
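A rough sketch of the client-side half of that workaround, assuming the model has already been edited offline so its input is int32 (with a Cast back to the unsigned type inside the graph); `uint8Pixels` is a hypothetical placeholder for real data:

```javascript
// Hypothetical placeholder for raw unsigned pixel data (e.g. a grayscale image).
const uint8Pixels = new Uint8Array([0, 128, 255]);

// Widen to int32, a signed type the Java bindings do support; values are preserved.
const int32Data = Int32Array.from(uint8Pixels);

console.log(Array.from(int32Data)); // [ 0, 128, 255 ]
// int32Data can then be wrapped as: new Tensor('int32', int32Data, dims)
```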

I think the real issue of this ONNX RN library is it can not be used like the ONNXJS.

The API should be exactly the same for ORT Node.js, ORT Web, and ORT RN. If it is not, we should fix it. There may be slight differences in some details (for example, the model URI passed to InferenceSession.create() is defined differently per platform), but they should be straightforward to figure out.
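To illustrate, here is a minimal sketch of that shared API surface; `runOnce` is a hypothetical helper name, and the `ort` parameter stands for whichever platform package is imported (e.g. onnxruntime-react-native):

```javascript
// Minimal inference helper that is identical across ORT Node.js, Web, and RN;
// only how `modelPath` is obtained differs per platform.
async function runOnce(ort, modelPath, data, dims) {
  const session = await ort.InferenceSession.create(modelPath);
  const input = new ort.Tensor('float32', data, dims);
  const feeds = { [session.inputNames[0]]: input };
  const results = await session.run(feeds); // run() returns a Promise and must be awaited
  return results[session.outputNames[0]];
}
```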

Please let me know if you have further questions.

@JonathanSum (Author) commented Apr 18, 2022

@fs-eire

I tried feeding a dummy tensor from JavaScript to the loaded ONNX model, and it just crashed my app and threw the error shown above in Android Studio. Do you know why?

        // Assumes: import { InferenceSession, Tensor } from 'onnxruntime-react-native';
        let modelPath = null;
        let session = null;
        try {
            modelPath = await ONNX.getLocalModelPath();
            console.log('found the model file path:');
            console.log(modelPath);
            session = await InferenceSession.create(modelPath);

            const dims = [1, 1, 28, 28];
            const float32Data = new Float32Array(784).fill(1.0); // dummy all-ones input

            const inputTensor = new Tensor('float32', float32Data, dims);

            console.log(session.inputNames);
            console.log(session.outputNames);
            console.log(inputTensor.dims);
            console.log(inputTensor.size);

            const feeds = {};
            feeds[session.inputNames[0]] = inputTensor;
            const output = await session.run(feeds); // run() returns a Promise
            console.log(output);
        } catch (e) {
            console.log('Error: could not find the model file path');
            console.error(e);
        }

        console.log(DEFAULT_EVENT_NAME);

    };

Can you tell me whether I can run inference with the dummy tensor above once the mnist.ort model is loaded?

@JonathanSum (Author) commented Apr 19, 2022

Let me add more:
@faxu asked why I kept opening and closing this issue.
Here is why:
I was able to load the model after finding out why I, like everyone else, couldn't load it. So I figured the same root cause might also make inference fail if I didn't use "that way".
In the end, I found out I indeed cannot run inference either. I posted the code above, and I still want to ask why inference does not work normally, because I am afraid I am wrong.

I am also pinning another issue here because I think he had the same issue as I had before:
#11239

See! I was able to load the model, but I cannot run inference in the end.
[screenshot]

@JonathanSum JonathanSum changed the title [Lot of people have this issue] The "onnxruntime-react-native" just can not find the model or load the model, The model can not do inference; [Lot of people have this issue] The "onnxruntime-react-native" just can not find the model or load the model; The model can not do inference; Apr 19, 2022
@jackylu0124

Hi Jonathan, could you please explain what you did in order to load the model? And could you also explain what ONNX.getLocalModelPath() does? I couldn't find that function in the API docs. 😅 Thanks a lot in advance!

@JonathanSum (Author) commented Apr 20, 2022

[screenshot]
You can see in the image that the app even crashed when feeding the dummy tensor to the loaded model.

@jackylu0124

I can't really give you a solution because I don't think it would be much help, especially since you can't run inference the way the docs describe. I think I know a solution, but I guess you will find a better one than mine, and we should follow yours.
Sorry.

[screenshot]

I hope ONNX will make it better because I trust the ONNX team (fs-eire, faxu, and more). 😀😆😆😆😆

@JonathanSum JonathanSum changed the title [Lot of people have this issue] The "onnxruntime-react-native" just can not find the model or load the model; The model can not do inference; [Lot of people have this issue] The model can not "really" do inference; Loading issue is solved. Apr 20, 2022
@fs-eire (Contributor) commented Apr 22, 2022

The documents are definitely not good enough to give clean and straightforward instructions. I am working on reproducing the issues and noting the problems. After that I will send out a PR to update the documentation (and the code, if any code change is necessary).

@JonathanSum (Author)

@fs-eire
I still don't understand why the dummy tensor with shape [1, 1, 28, 28] does not work on the mnist.ort model. Is the size wrong? Or should it be [1, 28, 28, 1]?

@fs-eire (Contributor) commented May 4, 2022

@fs-eire I still don't understand why the dummy tensor with shape [1, 1, 28, 28] does not work on the mnist.ort model. Is the size wrong? Or should it be [1, 28, 28, 1]?

I can reproduce this issue. The shape [1, 1, 28, 28] is correct. I am investigating the cause of the crash.

BTW I am working on this repo: https://github.com/fs-eire/ort-rn-hello-world to log my steps of reproducing the issue.

@fs-eire (Contributor) commented May 5, 2022

OK, I think I was wrong: the model actually requires [1, 28, 28] as dims. That is why it fails.

However, a crash is not the expected behavior; the correct behavior is to reject the Promise with the error message. I will look into a fix.

@JonathanSum

@jackylu0124 commented May 31, 2022

@fs-eire
Hey guys, I have a similar issue (#11239) where I cannot load a very simple model (literally just a simple linear blend operation). Could anyone please take a look? I opened the issue more than a month ago but still have no updates or replies. Thanks in advance!

@fs-eire (Contributor) commented May 31, 2022

@fs-eire Hey guys, I have a similar issue (#11239) where I cannot load a very simple model (literally just a simple linear blend operation). Could anyone please take a look? I opened the issue more than a month ago but still have no updates or replies. Thanks in advance!

I replied in the original issue.

@jackylu0124

@fs-eire Just saw it, thank you very much for the update! I have a follow-up question in the original issue.

@JonathanSum (Author) commented Jun 1, 2022

Just sharing with people that I somehow found another way that works. I used DistilBERT for one of my projects:

[screenshot]
I used PyTorch DistilBERT as a starting point. Of course you can use things like T5 or even larger models.
It runs fine on my years-old OnePlus 7.
