Root directory not created when calling .save_all_adapter #361

Closed
eugene-yang opened this issue Jun 7, 2022 · 1 comment · Fixed by #375
Labels
bug Something isn't working

Comments

@eugene-yang

Environment info

  • adapter-transformers version: 3.0.1
  • Platform: Arch Linux
  • Python version: 3.10
  • PyTorch version (GPU?):
  • Tensorflow version (GPU?):
  • Using GPU in script?:
  • Using distributed or parallel set-up in script?:

Information

Model I am using (Bert, XLNet ...): XLMR

Language I am using the model on (English, Chinese ...): English

Adapter setup I am using (if any):

The problem arises when using:

  • the official example scripts: (give details below)
  • my own modified scripts: (give details below)

The tasks I am working on is:

  • an official GLUE/SQUaD task: (give the name)
  • my own task or dataset: (give details below)

To reproduce

Steps to reproduce the behavior:

from transformers import AutoConfig, AutoAdapterModel, AdapterConfig
model = AutoAdapterModel.from_pretrained('xlm-roberta-base')
config = AdapterConfig.load("pfeiffer", non_linearity="relu", reduction_factor=2)
model.load_adapter('en/wiki@ukp', config=config)
model.save_all_adapters('./test_save_adapters/') 
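Until this is fixed, a simple workaround (my own sketch, not part of the library) is to create the root directory up front, since the library's internal mkdir() only creates a single path level:

```python
import os

save_dir = './test_save_adapters/'
# Create the root directory (and any missing parents) before saving, so the
# library's per-adapter single-level mkdir call succeeds.
os.makedirs(save_dir, exist_ok=True)
# model.save_all_adapters(save_dir)  # as in the snippet above
```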

Trace

---------------------------------------------------------------------------
FileNotFoundError                         Traceback (most recent call last)
Input In [1], in <cell line: 5>()
      3 config = AdapterConfig.load("pfeiffer", non_linearity="relu", reduction_factor=2)
      4 model.load_adapter('en/wiki@ukp', config=config)
----> 5 model.save_all_adapters('./test_save_adapters/')

File ~/.conda/envs/adapter/lib/python3.10/site-packages/transformers/adapters/model_mixin.py:934, in ModelWithHeadsAdaptersMixin.save_all_adapters(self, save_directory, with_head, meta_dict, custom_weights_loaders)
    932 else:
    933     meta_dict = {"config_id": h}
--> 934 self.save_adapter(
    935     save_path,
    936     name,
    937     meta_dict=meta_dict,
    938     with_head=with_head,
    939     custom_weights_loaders=custom_weights_loaders,
    940 )

File ~/.conda/envs/adapter/lib/python3.10/site-packages/transformers/adapters/model_mixin.py:869, in ModelWithHeadsAdaptersMixin.save_adapter(self, save_directory, adapter_name, with_head, meta_dict, custom_weights_loaders)
    867         custom_weights_loaders = []
    868     custom_weights_loaders.append(PredictionHeadLoader(self, error_on_missing=False))
--> 869 super().save_adapter(
    870     save_directory,
    871     adapter_name,
    872     meta_dict=meta_dict,
    873     custom_weights_loaders=custom_weights_loaders,
    874 )

File ~/.conda/envs/adapter/lib/python3.10/site-packages/transformers/adapters/model_mixin.py:395, in ModelAdaptersMixin.save_adapter(self, save_directory, adapter_name, meta_dict, custom_weights_loaders)
    383 """
    384 Saves an adapter and its configuration file to a directory so that it can be shared or reloaded using
    385 `load_adapter()`.
   (...)
    392     ValueError: If the given adapter name is invalid.
    393 """
    394 loader = AdapterLoader(self)
--> 395 loader.save(save_directory, adapter_name, meta_dict)
    396 # save additional custom weights
    397 if custom_weights_loaders:

File ~/.conda/envs/adapter/lib/python3.10/site-packages/transformers/adapters/loading.py:360, in AdapterLoader.save(self, save_directory, name, meta_dict)
    351 """
    352 Saves an adapter and its configuration file to a directory, so that it can be reloaded using the `load()`
    353 method.
   (...)
    357     task_name (str): the name of the adapter to be saved
    358 """
    359 if not exists(save_directory):
--> 360     mkdir(save_directory)
    361 else:
    362     assert isdir(
    363         save_directory
    364     ), "Saving path should be a directory where adapter and configuration can be saved."

FileNotFoundError: [Errno 2] No such file or directory: './test_save_adapters/en'

This applies to .save_adapter_fusions as well.

Expected behavior

The root directory should be created before each adapter is saved into it.
https://github.com/adapter-hub/adapter-transformers/blob/master/src/transformers/adapters/model_mixin.py#L952
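The root cause is visible in the trace above: the loader calls os.mkdir, which creates only a single directory level, so it fails when the parent directory does not exist yet. A minimal stdlib sketch of the difference, using hypothetical paths under a temporary directory:

```python
import os
import shutil
import tempfile

base = tempfile.mkdtemp()
nested = os.path.join(base, "test_save_adapters", "en")

# os.mkdir fails because the intermediate "test_save_adapters" level is
# missing, mirroring the FileNotFoundError in the trace above.
try:
    os.mkdir(nested)
    raised = False
except FileNotFoundError:
    raised = True

# os.makedirs creates all missing parents, so nested save paths work.
os.makedirs(nested, exist_ok=True)

print(raised, os.path.isdir(nested))  # True True

shutil.rmtree(base)
```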

@eugene-yang eugene-yang added the bug Something isn't working label Jun 7, 2022
@eugene-yang eugene-yang changed the title Subdirectory not created when calling .save_all_adapter Root directory not created when calling .save_all_adapter Jun 7, 2022
@calpt
Member

calpt commented Jun 29, 2022

Thanks for bringing this up, this issue should be fixed with #375.
