autotvm: Cannot find tuning records for: target=c -keys=cpu -model=esp32 (AIV-653) #139
If it helps, here is my Colab script that trains a model in torch, converts it to ONNX, and attempts to convert it to ESP-DL with a template project: https://colab.research.google.com/gist/ShawnHymel/82d1a11278da45831f0d943d44ea2cc1/pytorch-mnist-onnx-quantization.ipynb
Not sure if it's related, but if I change the
I get the following output:
It looks like it's failing the debug step, and I've been unable to track down exactly where.
Not sure why it was failing for you.
Hi @noorhaq, Thanks for checking it out! The notebook does indeed run, but the model fails the debugging check step. You have to uncomment the following line in esp-dl/tools/tvm/export-onnx-model.py to see the failure:
Hi,
Hi @Auroragan, Good to know, thank you! I'll close out the issue.
Currently, I am working with ESP-DL TVM, and I got the same issue when uncommenting the debug function. The "error" message is created by the debug function itself. It looks like the example is based on the following model from the ESP-DL tutorial: https://github.com/espressif/esp-dl/blob/master/tutorial/tvm_example/model.onnx

That is why, @ShawnHymel, you got that error with the debug output. If you use the example model, you won't get the error. You could also change the output to any layer you want to read, as specified by your own model. Of course, you have to adjust the

I suggest removing this code or putting it in another example. It only leads to confusion, in my opinion.
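Conceptually, that debug step compares a chosen layer's float32 reference output against the dequantized output of the quantized model, so it only passes when the hard-coded layer name matches your model. A minimal numpy sketch of such a comparison (all names, shapes, and the scale value here are hypothetical, not taken from export-onnx-model.py):

```python
import numpy as np

def debug_compare(float_out, quant_out, scale, atol=0.05):
    # Dequantize the int8 layer output and check it against the
    # float32 reference within a tolerance.
    dequant = quant_out.astype(np.float32) * scale
    return bool(np.allclose(float_out, dequant, atol=atol))

scale = 0.01                                               # hypothetical quantization scale
float_out = np.array([0.5, -0.25, 1.0], dtype=np.float32)  # reference layer output
quant_out = np.round(float_out / scale).astype(np.int8)    # simulated int8 output
ok = debug_compare(float_out, quant_out, scale)
```

If the layer name baked into the debug function does not exist in your model, the lookup fails before any comparison like this can run, which matches the confusion described above.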
I am following the guide here: https://docs.espressif.com/projects/esp-dl/en/latest/esp32s3/tutorials/deploying-models-through-tvm.html. I trained a basic 2-layer DNN on MNIST using torch and exported the model in ONNX format. I saved a representative dataset as train_set.npy and a single sample as sample.npy. I optimized and quantized the model with the following:
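The actual optimize/quantize command was not captured in this thread, but the representative-dataset step described above can be sketched with numpy (the shapes below assume a 2-layer DNN on flattened 28x28 MNIST inputs and are illustrative, not taken from the original script):

```python
import numpy as np

# Hypothetical calibration data for post-training quantization:
# 500 samples of shape (1, 784), i.e. flattened 28x28 MNIST images.
rng = np.random.default_rng(0)
train_set = rng.random((500, 1, 784), dtype=np.float32)  # representative dataset
sample = train_set[:1]                                   # single sample for testing

np.save("train_set.npy", train_set)
np.save("sample.npy", sample)
```

In practice, train_set would hold real (normalized) training images rather than random values, since the quantizer uses it to calibrate activation ranges.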
I then try to convert the quantized model to ESP-DL format with the following:
When I enable logging in the export_onnx_model.py script, I get the following:
It looks like the model is converted. However, autotvm complains about not finding tuning records:
This does not appear to be the intended behavior of the script. Any suggestions on how to fix this issue?
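For what it's worth, the autotvm message is a warning rather than an error: when no tuning records exist for the target `c -keys=cpu -model=esp32`, TVM falls back to its default schedules, which still produce a working (if unoptimized) model. Assuming these messages go through Python's standard logging under the "autotvm" logger name, as TVM's tuning tutorials configure it, they can be silenced like this:

```python
import logging

# Without tuning records for the ESP32 target, TVM falls back to
# default schedules and logs a warning. Raising the "autotvm" logger's
# threshold hides the warning without changing compilation behavior.
logging.getLogger("autotvm").setLevel(logging.ERROR)
```

This only hides the message; to remove its cause you would need tuning records for the target, which the stock ESP-DL flow does not generate.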