
Rename AdapterConfigBase to AdapterConfig #603

Merged
merged 4 commits on Nov 16, 2023
docs/classes/adapter_config.rst (1 addition, 1 deletion)

@@ -6,7 +6,7 @@ Classes representing the architectures of adapter modules and fusion layers.
Single (bottleneck) adapters
~~~~~~~~~~~~~~~~~~~~~~~~~~~~

-.. autoclass:: adapters.AdapterConfigBase
+.. autoclass:: adapters.AdapterConfig
    :members:

.. autoclass:: adapters.BnConfig
docs/contributing/adding_adapter_methods.md (2 additions, 2 deletions)

@@ -18,14 +18,14 @@ Therefore, the described steps might not be applicable to each implementation.
These module classes then have to be inserted into the correct locations within the Transformer model implementation.
Thus, each adapter method implementation should provide at least two classes:

-- a configuration class deriving from `AdapterConfigBase` that provides attributes for all configuration options of the method
+- a configuration class deriving from `AdapterConfig` that provides attributes for all configuration options of the method
- a module class deriving from the abstract `AdapterLayerBase` that provides the method parameters and a set of standard adapter management functions
- modules supporting [adapter composition](https://docs.adapterhub.ml/adapter_composition.html) should instead derive from `ComposableAdapterLayerBase`

### Configuration

All configuration classes reside in `src/adapters/configuration/adapter_config.py`.
-- To add a new configuration class for a new method, create a new subclass of [`AdapterConfigBase`](adapters.AdapterConfigBase).
+- To add a new configuration class for a new method, create a new subclass of [`AdapterConfig`](adapters.AdapterConfig).
Make sure to set the `architecture` attribute in your class.
- Finally, also make sure the config class is added to the `__init__.py` files in `src/adapters`.
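
For illustration, a minimal sketch of what such a subclass might look like (not part of this diff; the class name, `architecture` value, and option fields are hypothetical):

```python
from dataclasses import dataclass

from adapters import AdapterConfig


@dataclass(eq=False)
class MyMethodConfig(AdapterConfig):
    # Identifies the adapter method this config belongs to (hypothetical value).
    architecture: str = "my_method"
    # Example configuration options for the hypothetical method.
    reduction_factor: int = 16
    dropout: float = 0.0
```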

docs/overview.md (1 addition, 1 deletion)

@@ -64,7 +64,7 @@ Identifiers and configuration classes are explained in more detail in the [next
## Configuration

All supported adapter methods can be added, trained, saved and shared using the same set of model class functions (see [class documentation](adapters.ModelAdaptersMixin)).
-Each method is specified and configured using a specific configuration class, all of which derive from the common [`AdapterConfigBase`](adapters.AdapterConfigBase) class.
+Each method is specified and configured using a specific configuration class, all of which derive from the common [`AdapterConfig`](adapters.AdapterConfig) class.
E.g., adding one of the supported adapter methods to an existing model instance follows this scheme:
```python
model.add_adapter("name", config=<ADAPTER_CONFIG>)
```
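
For instance, an illustrative sketch (not part of this diff; `SeqBnConfig` is the renamed `PfeifferConfig` mentioned in docs/transitioning.md below, and the model name and adapter name are placeholders):

```python
from adapters import AutoAdapterModel, SeqBnConfig

# Load a model with adapter support and attach a bottleneck adapter.
model = AutoAdapterModel.from_pretrained("bert-base-uncased")
model.add_adapter("my_adapter", config=SeqBnConfig())
```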
docs/training.md (1 addition, 1 deletion)

@@ -59,7 +59,7 @@ Compared to fine-tuning the entire model, we have to make only one significant a
# task adapter - only add if not existing
if task_name not in model.adapters_config:
    # resolve the adapter config
-   adapter_config = AdapterConfigBase.load(adapter_args.adapter_config)
+   adapter_config = AdapterConfig.load(adapter_args.adapter_config)
    # add a new adapter
    model.add_adapter(task_name, config=adapter_config)
    # Enable adapter training
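
A minimal sketch of the full pattern this snippet is part of (hedged, assuming the library's `train_adapter` method; the config string and adapter name are placeholders):

```python
from adapters import AdapterConfig

# Resolve the adapter config from a config string (or a saved config file).
adapter_config = AdapterConfig.load("seq_bn")
model.add_adapter("task_name", config=adapter_config)
# Freeze the base model's weights and activate the adapter for training.
model.train_adapter("task_name")
```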
docs/transitioning.md (6 additions)

@@ -50,6 +50,12 @@ The `adapters` library supports the configuration of adapters using [config stri
For a complete list of config strings and classes see [here](https://docs.adapterhub.ml/overview.html). We strongly recommend using the new config strings, but we will continue to support the old config strings for the time being to make the transition easier.
Note that, along with the config strings, the corresponding adapter config classes have changed, e.g. `PfeifferConfig` -> `SeqBnConfig`.
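
For example, a hedged sketch of the old and new config strings side by side (adapter names are placeholders):

```python
# Old-style config string, still accepted for the time being:
model.add_adapter("a", config="pfeiffer")
# Equivalent new-style config string:
model.add_adapter("b", config="seq_bn")
```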

Another consequence of this is that the `AdapterConfig` class no longer refers only to bottleneck adapters but is now the base class of all adapter configuration classes (previously `AdapterConfigBase`). Hence, the role this class serves has changed. However, you can still load adapter configs with:
```python
adapter_config = AdapterConfig.load("lora")
```
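
`load()` resolves the config string to the matching concrete subclass; for example, a sketch (assuming `LoRAConfig` is exported by `adapters`, as it was by `adapter-transformers`):

```python
from adapters import AdapterConfig, LoRAConfig

adapter_config = AdapterConfig.load("lora")
# The base class's load() returns an instance of the concrete config class.
assert isinstance(adapter_config, LoRAConfig)
```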


## Features that are not supported by `adapters`

Compared to `adapter-transformers`, there are a few features that are no longer supported by the `adapters` library:
examples/pytorch/dependency-parsing/run_udp.py (3 additions, 5 deletions)

@@ -13,7 +13,7 @@

import adapters
import adapters.composition as ac
-from adapters import AdapterArguments, AdapterConfigBase, AutoAdapterModel, setup_adapter_training
+from adapters import AdapterArguments, AdapterConfig, AutoAdapterModel, setup_adapter_training
from preprocessing import preprocess_dataset
from transformers import AutoConfig, AutoTokenizer, HfArgumentParser, set_seed
from utils_udp import UD_HEAD_LABELS, DependencyParsingAdapterTrainer, DependencyParsingTrainer, UDTrainingArguments
@@ -252,7 +252,7 @@ def main():
logger.info("Loading best model for predictions.")

if adapter_args.train_adapter:
-   adapter_config = AdapterConfigBase.load(adapter_args.adapter_config, **adapter_config_kwargs)
+   adapter_config = AdapterConfig.load(adapter_args.adapter_config, **adapter_config_kwargs)
    model.load_adapter(
        os.path.join(training_args.output_dir, "best_model", task_name)
        if training_args.do_train
@@ -262,9 +262,7 @@
        **adapter_load_kwargs,
    )
    if adapter_args.load_lang_adapter:
-       lang_adapter_config = AdapterConfigBase.load(
-           adapter_args.lang_adapter_config, **adapter_config_kwargs
-       )
+       lang_adapter_config = AdapterConfig.load(adapter_args.lang_adapter_config, **adapter_config_kwargs)
        lang_adapter_name = model.load_adapter(
            os.path.join(training_args.output_dir, "best_model", lang_adapter_name)
            if training_args.do_train