
Fix python3.7 Compatibility #510

Merged
8 commits merged into adapter-hub:master on Mar 25, 2023

Conversation

lenglaender (Member)

Compatibility with Python 3.7+ and PyTorch 1.12.0+.
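For context, Python 3.7 compatibility fixes in libraries like this typically come down to guarding imports that only exist in newer Python versions. A minimal sketch of such a shim, purely illustrative and not taken from this PR's diff:

```python
# Hypothetical compatibility shim -- illustrative only, not the actual change in this PR.
import sys

if sys.version_info >= (3, 8):
    # Literal and Protocol were added to the stdlib typing module in Python 3.8
    from typing import Literal, Protocol
else:
    # On Python 3.7, fall back to the typing_extensions backport
    from typing_extensions import Literal, Protocol
```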

lenglaender (Member, Author)

Fixes #509

@lenglaender lenglaender changed the title Fix/python3 7 python3.7 Compatibility Mar 7, 2023
@lenglaender lenglaender changed the title python3.7 Compatibility Fix python3.7 Compatibility Mar 7, 2023
hSterz (Member)

hSterz commented Mar 8, 2023

This looks good. But we should consider whether we want to continue supporting Python 3.7. The adapter-transformers 3.2 release does not, so "going back" in the supported Python version might be confusing. Additionally, Python 3.7 is only supported until June this year. It might make more sense to update our documentation and only support version 3.8 or higher.
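If the project did drop 3.7 as suggested, the supported range would normally be enforced in packaging metadata so pip refuses to install on older interpreters. A hedged sketch of such a setup.py excerpt (the field values are assumptions, not the project's actual packaging config):

```python
# Hypothetical setup.py excerpt -- shows where the minimum Python version
# would be pinned if 3.7 support were dropped, as suggested above.
from setuptools import find_packages, setup

setup(
    name="adapter-transformers",
    packages=find_packages(),
    python_requires=">=3.8.0",  # pip rejects installation on Python 3.7 and older
)
```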

@lenglaender lenglaender requested a review from hSterz March 14, 2023 13:49
@calpt calpt linked an issue Mar 16, 2023 that may be closed by this pull request
@lenglaender lenglaender merged commit 449f541 into adapter-hub:master Mar 25, 2023
hSterz added a commit to hSterz/adapter-transformers that referenced this pull request Aug 5, 2023
Fix resume_from_checkpoint (adapter-hub#514)

Add initialization of a variable so invalid checkpoints throw an understandable error.
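The commit message describes a common pattern: initialize the result variable before the checkpoint search so that a missing or invalid checkpoint fails with a descriptive error instead of an UnboundLocalError. A minimal sketch of that pattern, with hypothetical names not taken from the actual commit:

```python
import os

def find_resume_checkpoint(checkpoint_dir):
    # Hypothetical sketch: initialize the variable up front so the failure
    # path below raises a clear, understandable error rather than an
    # UnboundLocalError when no valid checkpoint directory exists.
    resume_checkpoint = None
    for name in sorted(os.listdir(checkpoint_dir)):
        if name.startswith("checkpoint-"):
            resume_checkpoint = os.path.join(checkpoint_dir, name)
    if resume_checkpoint is None:
        raise ValueError(
            f"No valid checkpoint found in {checkpoint_dir!r}; "
            "cannot resume training from it."
        )
    return resume_checkpoint
```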

Fix LoRA & (IA)³ implementation for Bart & MBart (adapter-hub#518)

Fixes a critical issue in the LoRA & (IA)³ implementation of Bart & MBart, where LoRA & (IA)³ weights were not added to the intermediate and output linear layers of the model's decoder blocks.

I.e., adapters whose configs set intermediate_lora=True or output_lora=True were added incorrectly to (M)Bart models. For LoRA this does not affect the default config; for (IA)³ it does, since its default config sets intermediate_lora=True.

To ensure correct addition of weights in the future, get_adapter() tests are updated to count the number of modules added per adapter.
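Counting the modules injected per adapter is a simple way to catch layers that silently miss adapter weights, which is exactly the failure mode described above. A hedged sketch of such a check; the name-matching heuristic is an assumption, not the actual test code:

```python
def count_adapter_modules(model, adapter_name):
    # Hypothetical sketch: count submodules whose qualified name contains
    # the adapter name, approximating the per-adapter module count the
    # updated get_adapter() tests are said to verify.
    return sum(
        1
        for name, _ in model.named_modules()
        if adapter_name in name.split(".")
    )

# A test could then assert this count equals the number of LoRA/(IA)³
# modules the chosen config is expected to inject into the model.
```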

Fix python3.7 Compatibility (adapter-hub#510)

Compatibility with Python 3.8+ and PyTorch 1.12.1+.

Restore compatibility in GPT-2 LoRALinear bias init (adapter-hub#525)

Fix compacter init weights (adapter-hub#516)

Update doc chapter "Getting Started" (adapter-hub#527)

Update version to 3.2.1

Fix Notebook01 Dataset column_rename (adapter-hub#543)

Update doc chapter "Adapter Methods" (adapter-hub#535)

Do not stale issues labeled as bugs (adapter-hub#550)