Fixes #664.
Changes in this PR:
- Avoid throwing `NotImplementedError` in an unsupported composition block if none of the child adapters are part of the respective layer (see the sketch below).
- Pass along the name of the "last" invoked adapter module in the LoRA & bottleneck states, to make sure "last" actually exists in the respective layer.
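To illustrate the first change, here is a minimal sketch of the guard, assuming hypothetical names (a `fuse_setup` block that iterates over its child adapter names, and an `adapter_modules` registry); this is not the actual `adapter_layer_base.py` code:

```python
# Hypothetical sketch of the guard described above; names are illustrative.
def compose_fuse(self, fuse_setup, state, lvl=0):
    # If none of the fusion's child adapters live in this layer (e.g. a LoRA
    # layer that only holds "unipelt"), pass the state through unchanged
    # instead of raising.
    if not any(name in self.adapter_modules for name in fuse_setup):
        return state
    # Fuse composition is genuinely unsupported in this layer type.
    raise NotImplementedError()
```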
Discussed in #657
Originally posted by san-deep-reddy on March 16, 2024:
Instead of the single adapter in UniPELT, I am trying to add multiple adapters and a fusion layer on top using:
```python
from adapters import ConfigUnion, LoRAConfig, PrefixTuningConfig, SeqBnConfig
from adapters.composition import Fuse

# UniPELT-style union of LoRA and prefix tuning as a single adapter
config = ConfigUnion(LoRAConfig(r=8, use_gating=True), PrefixTuningConfig(prefix_length=10, use_gating=True))
model.add_adapter("unipelt", config=config)

# Three bottleneck adapters with a fusion layer on top
seq_config = SeqBnConfig(reduction_factor=16, use_gating=True)
model.add_adapter("adapter1", config=seq_config)
model.add_adapter("adapter2", config=seq_config)
model.add_adapter("adapter3", config=seq_config)
adapter_setup = Fuse("adapter1", "adapter2", "adapter3")
model.add_adapter_fusion(adapter_setup)
model.set_active_adapters([adapter_setup, "adapter1", "unipelt"])
```
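For context on where this leads (my reading of the traceback; the `Stack` equivalence reflects how, as far as I can tell, a plain list passed to `set_active_adapters` is parsed):

```python
from adapters.composition import Stack

# A plain list of blocks is parsed as a Stack, so the call above should be
# equivalent to:
model.set_active_adapters(
    Stack(Fuse("adapter1", "adapter2", "adapter3"), "adapter1", "unipelt")
)
# Every adapter layer composes this stack. A LoRA layer has no fusion
# modules for the Fuse block, so its compose() dispatches to compose_fuse
# in lora.py, which raised NotImplementedError unconditionally before the fix.
```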
(or)
```python
# Same setup, but with LoRA and prefix tuning as separate adapters
lora_config = LoRAConfig(r=8, use_gating=True)
pft_config = PrefixTuningConfig(prefix_length=10, use_gating=True)
seq_config = SeqBnConfig(reduction_factor=16, use_gating=True)
model.add_adapter("pft", config=pft_config)
model.add_adapter("lora", config=lora_config)
model.add_adapter("adapter1", config=seq_config)
model.add_adapter("adapter2", config=seq_config)
model.add_adapter("adapter3", config=seq_config)
adapter_setup = Fuse("adapter1", "adapter2", "adapter3")
model.add_adapter_fusion(adapter_setup)
model.set_active_adapters([adapter_setup, "adapter1", "pft", "lora"])
```
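Until a release containing this PR's fix is available, one possible interim workaround (an assumption on my part, not a confirmed recommendation) is to keep the `Fuse` block out of active setups that also stack LoRA or prefix-tuning adapters, e.g. by activating the fusion on its own:

```python
# Interim workaround sketch (assumption): activate only the fusion, so no
# LoRA layer ever receives the Fuse block during composition.
model.set_active_adapters(Fuse("adapter1", "adapter2", "adapter3"))
outputs = model(**batch)  # `batch` stands in for your tokenized inputs
```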
I end up getting this error in both cases:
File "/opt/conda/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
File "/opt/conda/lib/python3.10/site-packages/adapters/methods/lora.py", line 440, in forward
state = self.compose(adapter_setup, state)
File "/opt/conda/lib/python3.10/site-packages/adapters/methods/adapter_layer_base.py", line 472, in compose
state = composition_func(adapter_setup, state, lvl=0)
File "/opt/conda/lib/python3.10/site-packages/adapters/methods/adapter_layer_base.py", line 305, in compose_stack
state = composition_func(adapter_stack_layer, state, lvl=lvl + 1)
File "/opt/conda/lib/python3.10/site-packages/adapters/methods/adapter_layer_base.py", line 323, in compose_fuse
raise NotImplementedError()
NotImplementedError
How do I resolve this?