
In-browser inference with AI-Mask #1590

Closed
pacoccino wants to merge 9 commits

Conversation

pacoccino

Adding support for the AI-Mask extension as a local model provider, in addition to Ollama.

AI-Mask is a wrapper on top of libraries such as web-llm and transformers.js and enables executing models directly in the browser. The extension caches the models and provides them to whatever web app needs them, like this one.

A live demo of this is available here (install the extension first).
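For readers wondering what such an integration could look like, here is a minimal TypeScript sketch of a provider abstraction that an extension-backed backend could slot into alongside Ollama. The `LocalModelProvider` interface, the `AIMaskProvider` class, and the `window.ai_mask` global are illustrative assumptions for this sketch, not the actual AI-Mask SDK or this repository's provider API.

```typescript
// Illustrative sketch only: the interface and names below are assumptions,
// not the real AI-Mask SDK or this repo's provider layer.

// A minimal shape shared by local providers (Ollama, AI-Mask, ...).
interface LocalModelProvider {
  listModels(): Promise<string[]>
  chat(model: string, messages: { role: string; content: string }[]): Promise<string>
}

// Hypothetical in-browser provider backed by a browser extension.
// The extension (reached here through `window.ai_mask`, an assumed global)
// caches model weights and runs inference locally via web-llm / transformers.js.
class AIMaskProvider implements LocalModelProvider {
  private get extension() {
    const ext = (window as any).ai_mask
    if (!ext) throw new Error("AI-Mask extension not installed")
    return ext
  }

  async listModels(): Promise<string[]> {
    return this.extension.getModels()
  }

  async chat(model: string, messages: { role: string; content: string }[]): Promise<string> {
    return this.extension.chat({ model, messages })
  }
}

// Usage: pick a provider at runtime, falling back to Ollama when the
// extension is not present.
async function pickProvider(ollama: LocalModelProvider): Promise<LocalModelProvider> {
  return (window as any).ai_mask ? new AIMaskProvider() : ollama
}
```

The point of the sketch is only that an extension-backed provider can sit behind the same interface the app already uses for Ollama, so model selection and chat calls need no special-casing beyond detecting whether the extension is available.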

@pacoccino pacoccino changed the title Local, in-browser inference with AI-Mask In-browser inference with AI-Mask Mar 22, 2024
@mckaywrigley
Owner

Looks interesting but going to hold off for now.

If this project becomes more popular we'll certainly reconsider support!

@pacoccino
Author

I was hoping to make AI-Mask better known by integrating it into great open-source projects that would benefit from it, like this one.

Could you elaborate on what is holding you back from integrating it? Are there specific requirements it would need to meet?
