Android App load local model #10

Open
AnswerZhao opened this issue Feb 29, 2024 · 3 comments

Comments

@AnswerZhao

Could you modify the home activity to load a local model instead? Downloading the model over the network is quite slow. Thank you for sharing this amazing AI project and Android app.

@OmkarThawakar
Member

Hi @AnswerZhao,

Thank you for your interest in our work.

Please refer to llama.cpp (https://github.com/ggerganov/llama.cpp) and llama_cpp_dart (https://github.com/netdur/llama_cpp_dart) for loading a local model.

Thanks,


nonetrix commented Mar 2, 2024

You could likely get faster inference by using the native Android APIs to run on the NPU, but that would be an undertaking for sure.

@AnswerZhao
Author

@OmkarThawakar Thank you for the references. I will give it a try.


3 participants