
Do we have a roadmap? #128

Closed
CharlesRiggins opened this issue Aug 29, 2024 · 2 comments
Labels
enhancement New feature or request

Comments


CharlesRiggins commented Aug 29, 2024

We've found llm-compressor to be a powerful quantization tool that integrates seamlessly with vLLM, and we are considering using it to replace tools such as AutoAWQ and AutoGPTQ in our production pipeline.

We would love to learn more about the future direction of llm-compressor. Could you share whether a roadmap is available, similar to the one vLLM has (as seen here)?

Thank you for your excellent work.

@CharlesRiggins CharlesRiggins added the enhancement New feature or request label Aug 29, 2024
@CharlesRiggins changed the title from "Do we have a roadmap for" to "Do we have a roadmap?" Aug 29, 2024
@robertgshaw2-neuralmagic (Collaborator) commented:

Current roadmap is here:

Any feature requests would be welcome!

@CharlesRiggins (Author) commented:

Great, thank you for the reply.
