
Rebuild the Model Explorer backend to get the full weight values #82

Closed
tsantosh1098 opened this issue Jun 11, 2024 · 7 comments

Comments

@tsantosh1098 commented Jun 11, 2024

I am trying to modify the ConvertFlatbufferDirectlyToJson() part of the direct_flatbuffer_to_json_graph_convert.cc file.

To rebuild the builtin adapters, I followed these steps: https://github.com/google-ai-edge/model-explorer/tree/main/src/builtin-adapter. But after generating the .so file, I am no longer able to load the extension module ".builtin_tflite_mlir_adapter".

! Failed to load extension module ".builtin_tflite_mlir_adapter":
/home/armnn/Documents/mcw/model_explorer/lib/python3.12/site-packages/ai_edge_model_explorer_adapter/_pywrap_convert_wrapper.so: undefined symbol:
_ZN10tensorflow11CSRMatMulOpIN5Eigen16ThreadPoolDeviceEfEC2EPNS_20OpKernelConstructionE

When I check the path of the .so file, it's present in the provided directory.

Currently, I am building this on an Ubuntu 24 machine with Bazel 6.5.0, on an Intel(R) Core(TM) i7-8665U CPU @ 1.90GHz.
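
For reference, the mangled name in the loader error can be decoded with the standard C++ ABI demangler; a minimal, self-contained snippet:

```cpp
// Decode the mangled symbol from the "undefined symbol" loader error using
// the GCC/Clang C++ ABI demangler (<cxxabi.h>). Compile with g++ or clang++.
#include <cxxabi.h>

#include <cstdlib>
#include <iostream>

int main() {
  const char* mangled =
      "_ZN10tensorflow11CSRMatMulOpIN5Eigen16ThreadPoolDeviceEfEC2"
      "EPNS_20OpKernelConstructionE";

  int status = 0;
  // __cxa_demangle returns a malloc()'d string that the caller must free.
  char* demangled = abi::__cxa_demangle(mangled, nullptr, nullptr, &status);
  if (status == 0 && demangled != nullptr) {
    std::cout << demangled << "\n";
    std::free(demangled);
  } else {
    std::cerr << "Demangling failed, status = " << status << "\n";
  }
  return 0;
}
```

It prints tensorflow::CSRMatMulOp<Eigen::ThreadPoolDevice, float>::CSRMatMulOp(tensorflow::OpKernelConstruction*), which suggests the locally built adapter references a TensorFlow kernel constructor that none of the libraries loaded at runtime exports.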

[Screenshot from 2024-06-11 14-01-39]

@tsantosh1098 (Author) commented

So I built ai_edge_model_explorer_adapter 0.1.0 myself.

I have one ai_edge_model_explorer_adapter 0.1.0 installed from the .whl file downloaded from the internet.
The other ai_edge_model_explorer_adapter 0.1.0 was generated from the locally built .whl file.
After inspecting the symbols in both shared libraries, here is the difference.
Note: the first one works; the second one (locally built) does not.

Is this problem because of the glibc version?

[Screenshot from 2024-06-11 14-56-53]

@pkgoogle (Contributor) commented

Hi @tsantosh1098, can you share your modifications to the ConvertFlatbufferDirectlyToJson() part of direct_flatbuffer_to_json_graph_convert.cc, so that we can reproduce your changes and test this out? Thanks for your help.

pkgoogle self-assigned this Jun 12, 2024
@pkgoogle (Contributor) commented

Hi @tsantosh1098, are you sure you need your changes? Have you tried adjusting the maximum element count for constant tensor values?

[Screenshot of the maximum element count setting in the Model Explorer UI]

If you enter -1 here, it will allow you to see unlimited tensor values. Let me know if you have any more questions.

pkgoogle added the type:support label Jun 13, 2024
@tsantosh1098 (Author) commented

Hi @pkgoogle, thanks for the information.

I was able to get the full weights by setting config.const_element_count_limit = -1 in the builtin adapter files.
I thought I needed to make changes in the backend, but the backend already supports extracting the full-length weights.
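
For anyone reading along, here is a rough, hypothetical sketch of the semantics of that field. None of this is the actual adapter code; the VisualizeConfig struct is a stand-in and ValuesToSerialize() is an invented helper, and only the const_element_count_limit name comes from the real config:

```cpp
// Hypothetical illustration only -- VisualizeConfig is a stand-in and
// ValuesToSerialize() is an invented helper; the only name taken from the
// actual adapter config is const_element_count_limit.
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <vector>

struct VisualizeConfig {
  // -1 (or any negative value) means "no limit": export the full weights.
  int64_t const_element_count_limit = 16;  // arbitrary default for this sketch
};

// Returns the constant-tensor values that would be written to the JSON graph.
std::vector<float> ValuesToSerialize(const std::vector<float>& weights,
                                     const VisualizeConfig& config) {
  if (config.const_element_count_limit < 0) {
    return weights;  // No truncation.
  }
  const std::size_t limit =
      static_cast<std::size_t>(config.const_element_count_limit);
  if (weights.size() <= limit) {
    return weights;
  }
  return std::vector<float>(weights.begin(), weights.begin() + limit);
}

int main() {
  const std::vector<float> weights(1000, 0.5f);

  VisualizeConfig truncated;            // keeps the sketch's default limit
  VisualizeConfig full;
  full.const_element_count_limit = -1;  // export the full weights

  std::cout << ValuesToSerialize(weights, truncated).size() << "\n";  // 16
  std::cout << ValuesToSerialize(weights, full).size() << "\n";       // 1000
  return 0;
}
```

The real adapter presumably applies this limit while converting the FlatBuffer to the JSON graph; the sketch only shows why a negative value exports everything.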

Thanks once again

@pkgoogle (Contributor) commented

Hi @tsantosh1098, just to be clear, the UI solution also works, right? (We'll need to add it to the wiki, so I wanted to verify :) ). Also, if you have no more open items regarding this issue, please feel free to close it as completed. Thanks!

pkgoogle added the status:awaiting user response label Jun 26, 2024
github-actions bot commented Jul 4, 2024

Marking this issue as stale since it has been open for 7 days with no activity. This issue will be closed if no further activity occurs.

github-actions bot added the status:stale label Jul 4, 2024

This issue was closed because it has been inactive for 14 days. Please post a new issue if you need further assistance. Thanks!

Labels
status:awaiting user response · status:stale · type:build/install · type:support