
Add adapter_summary() method #371

Merged: calpt merged 4 commits into adapter-hub:master from dev/adapter_summary on Jun 27, 2022

Conversation

@calpt (Member) commented Jun 20, 2022

This PR introduces a new adapter_summary() method that provides a summary of the currently added adapters.

Example

# Imports assume the adapter-transformers package, which installs as `transformers`
from transformers import AutoAdapterModel
from transformers.adapters import ADAPTER_CONFIG_MAP

model = AutoAdapterModel.from_pretrained("roberta-base")
for name, config in ADAPTER_CONFIG_MAP.items():
    model.add_adapter(name, config=config)
print(model.adapter_summary())

outputs...

Name                     Architecture         #Param      %Param  Active   Train
--------------------------------------------------------------------------------
pfeiffer                 bottleneck           894528       0.718       0       1
houlsby                  bottleneck          1789056       1.435       0       1
pfeiffer+inv             bottleneck          1190592       0.955       0       1
houlsby+inv              bottleneck          2085120       1.673       0       1
compacter++              bottleneck            28576       0.023       0       1
compacter                bottleneck            57088       0.046       0       1
prefix_tuning            prefix_tuning       9872384       7.920       0       1
prefix_tuning_flat       prefix_tuning        552960       0.444       0       1
parallel                 bottleneck          7091712       5.689       0       1
scaled_parallel          bottleneck          7091724       5.690       0       1
lora                     lora                 294912       0.237       0       1
mam                      union              22493984      18.046       0       1
--------------------------------------------------------------------------------
Full model                                 124645632     100.000               1
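
For reference, the %Param column matches each adapter's #Param divided by the full model's parameter count, times 100. A minimal sketch, using only the numbers printed above and assuming the three-decimal rounding shown in the output, reproduces the first row:

# Minimal sketch: reproduce the %Param value of the "pfeiffer" row above.
# Assumption: %Param = adapter #Param / full-model #Param * 100.
adapter_params = 894_528          # "pfeiffer" row, #Param column
full_model_params = 124_645_632   # "Full model" row
print(f"{adapter_params / full_model_params * 100:.3f}")  # -> 0.718, as in the table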

Other fixes

Fix get_adapter() for the following cases (a usage sketch follows below):

  • prefix tuning
  • adapters with shared or invertible adapters

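
For context, here is a minimal usage sketch of the fixed method for the prefix tuning case. The adapter name is hypothetical, and the assumption that get_adapter() returns a mapping of layer indices to adapter modules comes from the adapter-transformers API rather than from this PR:

# Hedged sketch: retrieve the modules of a prefix tuning adapter, one of the
# cases fixed by this PR. The structure of the returned mapping is assumed.
from transformers import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("roberta-base")
model.add_adapter("my_prefix", config="prefix_tuning")  # hypothetical adapter name

modules = model.get_adapter("my_prefix")
for layer, layer_modules in modules.items():
    print(layer, layer_modules)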
@JoPfeiff (Member) commented:

❤️

calpt marked this pull request as ready for review June 20, 2022 21:26
calpt merged commit 60fc367 into adapter-hub:master Jun 27, 2022
calpt deleted the dev/adapter_summary branch June 27, 2022 12:05