
pn.chat.WebLLM module #7298

Status: Open · 1 task done
Opened by ahuang11 on Sep 18, 2024 · 3 comments
Labels: type: discussion (Requiring community discussion), type: feature (A major new feature)

ahuang11 (Contributor) commented on Sep 18, 2024:

With the release of Panel's JSComponent and the ability to couple it with WebLLM, I believe Panel has a great opportunity to distinguish itself from other dashboarding libraries, beyond its excellent integration with interactive plotting (HoloViews), by running LLMs directly on the client side with minimal setup.

I propose we migrate the example from this page into a module and make it easy for developers to use or extend it.
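
To make the proposal concrete, here is a rough sketch of what usage of such a module might look like. The `pn.chat.WebLLM` class, its `model` parameter, and its `callback` attribute are all hypothetical, since no such module exists yet; the model id follows WebLLM's naming convention but is illustrative:

```python
import panel as pn

pn.extension()

# Hypothetical API for the proposed module: the WebLLM class, its
# `model` parameter, and its `callback` attribute are assumptions,
# not an existing Panel API.
llm = pn.chat.WebLLM(model="Llama-3-8B-Instruct-q4f16_1-MLC")

# The model would be downloaded and cached in the browser; the chat
# callback would stream completions generated entirely client-side.
chat = pn.chat.ChatInterface(callback=llm.callback)
chat.servable()
```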

  • I may be interested in making a pull request to address this
ahuang11 added the TRIAGE (Default label for untriaged issues) label on Sep 18, 2024
MarcSkovMadsen (Collaborator) commented:
I believe that in an ideal world Panel would provide core building blocks and panel-xyz packages would provide add-on functionality.

This would enable specialized communication, communities, and documentation, as well as a faster release cycle for the add-on projects.

In that ideal world, I believe chat and WebLLM fall into the add-on category.

That said, if we truly believe WebLLM can create value for Panel, I am OK with including it in Panel in practice, provided we have the people to support it.

As I see it, one area where Panel is missing components is systematic support for ML and DL tasks, similar to Gradio. We don't have the (high-quality) components and high-level interface needed to easily and systematically support X-media-to-Y-media tasks. An example is speech-to-text, where we don't have an audio input component. The same question applies here: should those components live in Panel or in an extension package? I see them as more core to Panel than WebLLM.
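
For illustration, a minimal speech-to-text sketch under current Panel might look like the following, using `FileInput` as a stand-in for the missing audio recorder widget; the `transcribe` function is a hypothetical placeholder for a real ASR model such as Whisper:

```python
import panel as pn

pn.extension()

# FileInput is a workaround: Panel has no audio *recorder* input widget,
# which is exactly the gap being discussed.
audio_input = pn.widgets.FileInput(accept="audio/*")
transcript = pn.pane.Markdown("Upload an audio file to transcribe.")

def transcribe(data: bytes) -> str:
    # Placeholder: a real app would run an ASR model here.
    return f"Received {len(data)} bytes of audio."

def _on_upload(event):
    # FileInput.value holds the uploaded file contents as bytes.
    if event.new is not None:
        transcript.object = transcribe(event.new)

audio_input.param.watch(_on_upload, "value")

pn.Column(audio_input, transcript).servable()
```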

ahuang11 (Contributor, Author) commented on Sep 19, 2024:

I'd love to see WebLLM directly integrated within Panel chat because it helps users get started with LLMs quickly, without the headache of installing another package. Plus, WebLLM requires no additional dependencies besides downloading and caching the model.

As a data point, Prefect used to have many integrations separated into their own repos, e.g. https://github.com/prefecthq/prefect-email and https://github.com/prefecthq/prefect-sqlalchemy, but over time the team migrated them back into the main repo, PrefectHQ/prefect.

[screenshot: list of archived PrefectHQ integration repositories]

I suppose the reason was that integrations in separate repos weren't very visible, so users weren't aware of them.

> This would enable specialized communication, communities, and documentation, as well as a faster release cycle for the add-on projects.

I think we first need to grow Panel's user base before that can work.

philippjfr (Member) commented:
My feeling aligns with @MarcSkovMadsen's here: I believe there are benefits to making this a separate project. Panel has grown very large, and I'd like to come up with a real extension mechanism to stop it from growing ever further. Since Panel has to ship the bundled code, each component further bloats the distributed package.

I do get your point though, @ahuang11; I believe the only way extensions will be successful is if we make them visible. As we already discussed internally, at minimum that would include:

  • A community page
  • Listing selected external components in the Component Gallery

Talking specifically about WebLLM, I feel quite strongly that the integration between the ChatInterface and the LLM should be cooperative, i.e. rely on composition rather than subclassing. Concretely, that means the LLM provides a callback that can be passed to ChatInterface, rather than us creating a WebLLMChatInterface.
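
A minimal sketch of that composition pattern follows. The `ChatInterface` callback signature is real Panel API; the `WebLLM` wrapper class and its `generate` method are illustrative assumptions, with an echo in place of a real engine so the sketch stays runnable:

```python
import panel as pn

pn.extension()

class WebLLM:
    """Hypothetical wrapper around an in-browser WebLLM engine."""

    async def generate(self, prompt: str) -> str:
        # A real implementation would delegate to the WebLLM engine
        # via a JSComponent; here we echo the prompt for the sketch.
        return f"Echo: {prompt}"

llm = WebLLM()

async def callback(contents: str, user: str, instance: pn.chat.ChatInterface) -> str:
    # Composition: the LLM supplies a callback that is handed to
    # ChatInterface, instead of subclassing ChatInterface itself.
    return await llm.generate(contents)

pn.chat.ChatInterface(callback=callback).servable()
```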

philippjfr added the type: feature (A major new feature) and type: discussion (Requiring community discussion) labels and removed the TRIAGE (Default label for untriaged issues) label on Sep 19, 2024