
Update attention.py #1416

Merged
4 commits merged on Oct 9, 2023

Commits on Sep 27, 2023

  1. Update attention.py

Modify the code for bigcode.
    This change makes the KV cache work correctly when multiple new tokens are processed at once.
    DongHande authored Sep 27, 2023
Commit de2cf04
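    The commit description above says the change lets the bigcode KV cache handle several new tokens in one forward pass. As a rough illustration only, not the actual patch, the sketch below shows one common way such a cache update is written; names like `past_key_value`, `key_states`, and `kv_seq_len` follow typical Hugging Face conventions and are assumptions here, not identifiers confirmed from this repository.

    ```python
    import torch

    def append_to_kv_cache(past_key_value, key_states, value_states):
        """Concatenate new key/value states onto the cached ones.

        key_states / value_states have shape (batch, heads, q_len, head_dim),
        where q_len may be greater than 1 when several new tokens are fed
        in a single forward pass.
        """
        if past_key_value is not None:
            past_key, past_value = past_key_value
            # Append the new tokens along the sequence dimension.
            key_states = torch.cat([past_key, key_states], dim=2)
            value_states = torch.cat([past_value, value_states], dim=2)
        # kv_seq_len now covers both cached and new tokens, so any causal or
        # attention mask must be sliced to (q_len, kv_seq_len) rather than
        # assuming q_len == 1.
        kv_seq_len = key_states.shape[2]
        return (key_states, value_states), kv_seq_len
    ```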

Commits on Oct 9, 2023

  1. consider batch size = 1

    DongHande authored Oct 9, 2023
Commit 7376b42
  2. Update attention.py

    DongHande authored Oct 9, 2023
Commit 34b7ab1
  3. def kv_seq_len

    DongHande authored Oct 9, 2023
Commit ae73232