
Adapters v0.2.0

@calpt released this 25 Apr 13:54 · 33 commits to main since this release

This version is built for Hugging Face Transformers v4.39.x.
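As a convenience, here is a minimal sketch of pinning this release together with a compatible Transformers version; the exact upper bound is an assumption for illustration, not an officially published constraint:

```bash
pip install "adapters==0.2.0" "transformers>=4.39.0,<4.40.0"
```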

New

Changed

Fixed

  • Fix DataParallel training with adapters (@calpt via #658)
  • Fix embedding training bug (@hSterz via #655)
  • Fix fp16/bf16 training for Prefix Tuning (@calpt via #659); see the sketch after this list
  • Fix training error with AdapterDrop and Prefix Tuning (@TimoImhof via #673)
  • Fix default cache path for adapters loaded from the AdapterHub (AH) repo (@calpt via #676)
  • Fix skipping of composition blocks in non-applicable layers (@calpt via #665)
  • Fix UniPELT LoRA default config (@calpt via #682)
  • Fix compatibility of adapters with HF Accelerate auto device-mapping (@calpt via #678)
  • Use default head dropout probability if not provided by the model (@calpt via #685)
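To illustrate the Prefix Tuning fixes above, a minimal, hedged sketch of mixed-precision adapter training follows; the model checkpoint, adapter name, and training arguments are illustrative placeholders, not values taken from this release.

```python
import adapters
from adapters import AdapterTrainer, PrefixTuningConfig
from transformers import AutoModelForSequenceClassification, TrainingArguments

# Load a plain Transformers model and enable adapter support on it.
model = AutoModelForSequenceClassification.from_pretrained("roberta-base", num_labels=2)
adapters.init(model)

# Add a Prefix Tuning adapter, freeze the base model, and train only the prefix parameters.
model.add_adapter("prefix_demo", config=PrefixTuningConfig())
model.train_adapter("prefix_demo")

# Mixed-precision training (fp16 here, bf16 analogous), which #659 addresses.
args = TrainingArguments(output_dir="out", fp16=True, num_train_epochs=1)

# A tokenized dataset would be passed here; it is omitted to keep the sketch short.
trainer = AdapterTrainer(model=model, args=args, train_dataset=None)
```

Relatedly, #678 concerns loading adapter models with Hugging Face Accelerate's auto device-mapping, e.g. `from_pretrained(..., device_map="auto")`; whether that applies depends on your hardware and installed Accelerate version.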