Update Embedding Documentation #706

Merged · 7 commits · Jun 24, 2024
docs/embeddings.md (12 changes: 8 additions & 4 deletions)
# Embeddings

With `adapters`, we support dynamically adding, loading, and deleting `Embeddings`. This section
will give you an overview of these features. A toy example is illustrated in this [notebook](https://colab.research.google.com/github/Adapter-Hub/adapters/blob/main/notebooks/Adapter_With_Embeddings.ipynb).

## Adding and Deleting Embeddings
The methods for handling embeddings are similar to the ones handling adapters. To add new embeddings, we call the
`add_embeddings` method. As the example below shows, tokens shared between the new vocabulary and a reference vocabulary
can be initialized from an existing embedding via the `reference_embedding` and `reference_tokenizer` arguments. To check
which embedding is currently active, the `active_embeddings` property contains the name of the currently active embedding.

```python
model.add_embeddings('name', tokenizer, reference_embedding='default', reference_tokenizer=reference_tokenizer)
```
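
To make this concrete, here is a minimal sketch of the call in context; the checkpoint name and the tokenizer paths are
illustrative assumptions, not part of the original example:

```python
from transformers import AutoTokenizer
from adapters import AutoAdapterModel

# hypothetical checkpoint and tokenizers, chosen only for illustration
model = AutoAdapterModel.from_pretrained("bert-base-uncased")
reference_tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
tokenizer = AutoTokenizer.from_pretrained("path/to/custom/tokenizer")

# create embeddings for the new tokenizer's vocabulary; tokens shared with the
# reference vocabulary are initialized from the default embedding
model.add_embeddings('name', tokenizer, reference_embedding='default', reference_tokenizer=reference_tokenizer)
print(model.active_embeddings)  # the name of the currently active embedding
```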

The original embedding of the transformers model is always available under the name `"default"`. To set an embedding as the
active embedding, simply pass its name to the `set_active_embeddings` method.
```python
model.set_active_embeddings('name')
```
In the same way, the original embedding can be restored by calling `model.set_active_embeddings("default")`.

To delete an added embedding, we call the `delete_embeddings` method with the name of the embedding
we want to delete. However, you cannot delete the default embedding.
```python
model.delete_embeddings('name')
```
Please note that if the active embedding is deleted, the default embedding is set as the active embedding.
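
As a small illustration of this fallback behavior (continuing with the embedding added above):

```python
model.set_active_embeddings('name')
model.delete_embeddings('name')  # deletes the embedding that is currently active
print(model.active_embeddings)   # the default embedding is active again
```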

## Training Embeddings
Embeddings can only be trained together with an adapter. To freeze all weights except for the embedding and the adapter, call:
```python
model.train_adapter('adapter_name', train_embeddings=True)
```
Apart from the `train_embeddings` flag, training works the same as training just an adapter (see [Adapter Training](training.md)).
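
Putting the pieces together, a training run could look roughly like the sketch below; the adapter name, classification head,
optimizer settings, and dataloader are assumptions for illustration, not prescribed by this documentation:

```python
import torch

# continuing with the model and the 'name' embedding from above
model.add_adapter('adapter_name')
model.add_classification_head('adapter_name', num_labels=2)  # hypothetical head
model.set_active_embeddings('name')

# freeze all weights except the adapter and the active embedding
model.train_adapter('adapter_name', train_embeddings=True)

# only unfrozen parameters are passed to the optimizer
optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
model.train()
for batch in train_dataloader:  # hypothetical DataLoader of tokenized batches
    loss = model(**batch).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```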
## Saving and Loading Embeddings
You can save the embeddings by calling `save_embeddings('path/to/dir', 'name')` and load them with `load_embeddings('path/to/dir', 'name')`.
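
For example (the directory path is a placeholder):

```python
model.save_embeddings('path/to/dir', 'name')  # write the embedding to disk
model.load_embeddings('path/to/dir', 'name')  # load it back under the same name
```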
