Weird condition for sequence classification in Bart #494
Labels: bug (Something isn't working), do-not-stale (this issue won't be automatically staled and closed after 90 days)
Environment info
adapter-transformers version: 3.1.0

Information
Model I am using (Bert, XLNet ...): Bart
I think checking the condition for sequence classification by shape is quite buggy.
I wanted to use the LM head with the same encoder and decoder sequence lengths (and, of course, my batches can contain different numbers of eos tokens), but the conditional code below makes it difficult to do that.
I am working around it by setting dec_seq_len == enc_seq_len - 1, but the condition should be fixed, since it is a latent bug.
https://github.com/adapter-hub/adapter-transformers/blob/master/src/transformers/adapters/models/bart/adapter_model.py#L96
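For context, here is a minimal self-contained sketch of the logic at that line as I read it (a paraphrase, not the exact source; `pool_cls_representation`, its arguments, and the tensor values are hypothetical names of mine): when the decoder output and `input_ids` happen to have the same sequence length, the model assumes a sequence classification setup and pools the last eos representation, which breaks LM-style usage with matching lengths.

```python
import torch

def pool_cls_representation(x, input_ids, eos_token_id):
    # Hypothetical standalone reconstruction of the shape-based branch
    # (names are mine, not the library's).
    if input_ids is not None and x.shape[1] == input_ids.shape[1]:
        # Shapes match -> the code assumes sequence classification and pools
        # the hidden state of the final <eos> token per example.
        eos_mask = input_ids.eq(eos_token_id)
        if len(torch.unique(eos_mask.sum(1))) > 1:
            raise ValueError("All examples must have the same number of <eos> tokens.")
        return x[eos_mask, :].view(x.size(0), -1, x.size(-1))[:, -1, :]
    # Shapes differ -> hidden states go to the head unchanged (the LM-head path).
    return x

# An LM setup with enc_seq_len == dec_seq_len falls into the classification
# branch, so a batch whose examples contain different numbers of <eos> tokens raises.
hidden = torch.randn(2, 6, 8)               # (batch, dec_seq_len, hidden_dim)
ids = torch.tensor([[5, 4, 1, 2, 3, 2],     # two <eos> tokens (id 2)
                    [5, 4, 1, 1, 3, 2]])    # one <eos> token
try:
    pool_cls_representation(hidden, ids, eos_token_id=2)
except ValueError as e:
    print(e)
```

This also shows why the dec_seq_len == enc_seq_len - 1 workaround helps: it makes the shape check fail, so the hidden states are passed through unchanged.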
Expected behavior
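Whether the eos-pooling path is taken should not be inferred purely from matching encoder and decoder sequence lengths; using an LM head with enc_seq_len == dec_seq_len should work without tripping the sequence classification branch.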