
Adding adapters and fusion layer to UNIPELT #664

Closed
calpt opened this issue Mar 31, 2024 · Discussed in #657 · 0 comments · Fixed by #665

calpt commented Mar 31, 2024

Discussed in #657

Originally posted by san-deep-reddy March 16, 2024

Instead of the single adapter in UNIPELT, I am trying to add multiple adapters and a fusion layer on top using:

from adapters import ConfigUnion, LoRAConfig, PrefixTuningConfig, SeqBnConfig
from adapters.composition import Fuse

config = ConfigUnion(LoRAConfig(r=8, use_gating=True), PrefixTuningConfig(prefix_length=10, use_gating=True))
model.add_adapter("unipelt", config=config)
seq_config = SeqBnConfig(reduction_factor=16, use_gating=True)
model.add_adapter("adapter1", config=seq_config) #Adapter1
model.add_adapter("adapter2", config=seq_config) #Adapter2
model.add_adapter("adapter3", config=seq_config) #Adapter3
adapter_setup = Fuse("adapter1", "adapter2", "adapter3")
model.add_adapter_fusion(adapter_setup) #Adapter fusion
model.set_active_adapters([adapter_setup, 'adapter1', 'unipelt'])

(or)

lora_config = LoRAConfig(r=8, use_gating=True)
pft_config = PrefixTuningConfig(prefix_length=10, use_gating=True)
seq_config = SeqBnConfig(reduction_factor=16, use_gating=True)
model.add_adapter("pft", config=pft_config)
model.add_adapter("lora", config=lora_config)
model.add_adapter("adapter1", config=seq_config) #Adapter1
model.add_adapter("adapter2", config=seq_config) #Adapter2
model.add_adapter("adapter3", config=seq_config) #Adapter3
adapter_setup = Fuse("adapter1", "adapter2", "adapter3")
model.add_adapter_fusion(adapter_setup) #Adapter fusion
model.set_active_adapters([adapter_setup, 'adapter1', 'pft', 'lora'])

I end up getting this error in both cases:

File "/opt/conda/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
File "/opt/conda/lib/python3.10/site-packages/adapters/methods/lora.py", line 440, in forward
state = self.compose(adapter_setup, state)
File "/opt/conda/lib/python3.10/site-packages/adapters/methods/adapter_layer_base.py", line 472, in compose
state = composition_func(adapter_setup, state, lvl=0)
File "/opt/conda/lib/python3.10/site-packages/adapters/methods/adapter_layer_base.py", line 305, in compose_stack
state = composition_func(adapter_stack_layer, state, lvl=lvl + 1)
File "/opt/conda/lib/python3.10/site-packages/adapters/methods/adapter_layer_base.py", line 323, in compose_fuse
raise NotImplementedError()
NotImplementedError

How do I resolve this?
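
For reference, a list passed to `set_active_adapters` is treated as a Stack composition, which is why the traceback above goes compose → compose_stack → compose_fuse before hitting the unimplemented Fuse branch in the LoRA layer. Below is a minimal sketch of the same setup written as an explicit Stack (using the names from the first snippet); it only illustrates the composition being evaluated, it is not a workaround for the error:

from adapters.composition import Fuse, Stack

# Equivalent to model.set_active_adapters([adapter_setup, 'adapter1', 'unipelt'])
adapter_setup = Fuse("adapter1", "adapter2", "adapter3")
model.set_active_adapters(Stack(adapter_setup, "adapter1", "unipelt"))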

@calpt calpt added the bug Something isn't working label Mar 31, 2024
@calpt calpt self-assigned this Mar 31, 2024
@calpt calpt added enhancement New feature or request and removed bug Something isn't working labels Apr 14, 2024
TimoImhof pushed a commit that referenced this issue Apr 15, 2024
Fixes #664.

Changes in this PR:
- Avoid throwing `NotImplementedError` in an unsupported block if none of the child adapters are part of the respective layer.
- Pass along the "last" invoked adapter module name in LoRA & bottleneck states to make sure "last" actually exists in the respective layer.
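
The first change can be pictured roughly as follows. This is an illustrative sketch only, not the code from #665; `layer_adapter_names` is a hypothetical stand-in for the adapter modules registered in a given layer:

def compose_fuse(adapter_setup, layer_adapter_names, state):
    # Only treat the Fuse block as unsupported if at least one of its child
    # adapters actually exists in this layer (e.g. the bottleneck adapters in
    # the fusion setup do not live in the LoRA layers of the UniPELT config);
    # otherwise pass the state through unchanged instead of raising.
    if not any(name in layer_adapter_names for name in adapter_setup):
        return state
    raise NotImplementedError("Fuse is not supported by this adapter method")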