
Adapters v0.1.1

Released by @calpt on 09 Jan 21:16

This version is built for Hugging Face Transformers v4.35.x.
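
A quick way to pin a matching environment, shown as an illustrative sketch rather than an official requirement spec: install the release next to a 4.35.x Transformers build and assert the pairing at import time.

```python
# Illustrative environment pin (assumption: any Transformers 4.35.x works):
#   pip install adapters==0.1.1 "transformers>=4.35.0,<4.36.0"
import transformers

assert transformers.__version__.startswith("4.35."), (
    f"Adapters v0.1.1 targets Transformers v4.35.x, found {transformers.__version__}"
)
```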

New

  • Add leave_out to LoRA and (IA)³ (@calpt via #608)
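
A minimal sketch of the new option, assuming a stock roberta-base checkpoint and placeholder adapter names: leave_out lists the indices of transformer layers that should not receive LoRA or (IA)³ modules.

```python
import adapters
from adapters import IA3Config, LoRAConfig
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained("roberta-base")
adapters.init(model)  # attach adapter support to the vanilla Transformers model

# Inject LoRA into all transformer layers except the first three
model.add_adapter("lora_upper", config=LoRAConfig(r=8, alpha=16, leave_out=[0, 1, 2]))

# The same option now works for (IA)³
model.add_adapter("ia3_upper", config=IA3Config(leave_out=[0, 1, 2]))

model.train_adapter("lora_upper")
```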

Fixed

  • Fix error in push_adapter_to_hub() due to deprecated arguments (@calpt via #613; example after this list)
  • Fix Prefix-Tuning for T5 models where d_kv != d_model / num_heads (@calpt via #621; example after this list)
  • [Bart] Move CLS representation extraction from EOS tokens to the head classes (@calpt via #624)
  • Fix adapter activation with skip_layers / AdapterDrop training (@calpt via #634; example after this list)
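
For the push_adapter_to_hub() fix, a minimal sketch of the affected call; the repo name, adapter name, and datasets_tag value are placeholders, and a prior huggingface-cli login is assumed.

```python
# Assumes `model` holds a trained adapter named "my_adapter" (placeholder)
model.push_adapter_to_hub(
    "my-adapter-repo",    # Hub repository to create or update
    "my_adapter",         # local adapter name to upload
    datasets_tag="glue",  # optional metadata for the generated adapter card
)
```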
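
For the Prefix-Tuning fix, google/t5-v1_1-small is one checkpoint that hits the affected shape case: d_model=512 and num_heads=6, yet d_kv=64, so d_kv != d_model / num_heads. The adapter name below is a placeholder.

```python
import adapters
from adapters import PrefixTuningConfig
from transformers import T5ForConditionalGeneration

# d_kv (64) differs from d_model / num_heads (512 / 6); the case fixed by #621
model = T5ForConditionalGeneration.from_pretrained("google/t5-v1_1-small")
adapters.init(model)

model.add_adapter("prefix", config=PrefixTuningConfig(prefix_length=30))
model.train_adapter("prefix")
```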
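
For the skip_layers fix, the AdapterDrop training pattern from the library's notebooks: skip the adapters in a random number of the lowest layers at each step, then restore all layers for evaluation. The adapter name and the 12-layer assumption are placeholders.

```python
import numpy as np
from transformers import TrainerCallback

class AdapterDropCallback(TrainerCallback):
    """AdapterDrop: randomly skip adapters in the lowest layers while training."""

    def on_step_begin(self, args, state, control, **kwargs):
        # Skip adapters in layers [0, n) for a random n; 11 suits a 12-layer model
        skip_layers = list(range(np.random.randint(0, 11)))
        kwargs["model"].set_active_adapters("my_adapter", skip_layers=skip_layers)

    def on_evaluate(self, args, state, control, **kwargs):
        # Re-activate adapters in all layers for evaluation
        kwargs["model"].set_active_adapters("my_adapter", skip_layers=None)
```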

Docs & Notebooks

  • Update notebooks & add a new demo notebook for complex adapter configurations (@hSterz & @calpt via #614; sketch below)
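
The kind of complex setup the new notebook demonstrates can be built with ConfigUnion, which fuses several adapter methods into one named, jointly trained adapter; the combination below is an illustrative pick, not the notebook's exact recipe.

```python
import adapters
from adapters import ConfigUnion, LoRAConfig, PrefixTuningConfig
from transformers import AutoModel

model = AutoModel.from_pretrained("roberta-base")
adapters.init(model)

# Combine prefix tuning and LoRA into a single trainable adapter
union = ConfigUnion(
    PrefixTuningConfig(prefix_length=20),
    LoRAConfig(r=8, alpha=16),
)
model.add_adapter("union_adapter", config=union)
model.train_adapter("union_adapter")
```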