ValueError warning in BartAdapterModel is unnecessary for sequence labeling task/generative task #563
Labels: bug (Something isn't working)
Information
Model I am using (Bert, XLNet ...): BartAdapterModel
Language I am using the model on (English, Chinese ...): English
Adapter setup I am using (if any): pfeiffer
To reproduce
Steps to reproduce the behavior:
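Roughly the following reproduces it for me. This is only a minimal sketch: the adapter/head names and token IDs are illustrative, and it assumes BartAdapterModel can be imported from the adapter-transformers package; the details are explained below the snippet.

```python
import torch
from transformers import BartAdapterModel  # adapter-transformers fork

model = BartAdapterModel.from_pretrained("facebook/bart-base")
model.add_adapter("mlm", config="pfeiffer")
model.add_seq2seq_lm_head("mlm")
model.set_active_adapters("mlm")

eos = model.config.eos_token_id  # </s>

# Two equal-length chunks, like the ones produced by concatenating all text
# and grouping it every 1024 tokens: the rows contain different numbers of
# <eos> tokens.
input_ids = torch.tensor([
    [0, 100, 200, 300, eos],
    [0, 100, eos, 200, eos],
])

# On my setup this raises:
# ValueError: All examples must have the same number of <eos> tokens.
outputs = model(input_ids=input_ids, attention_mask=torch.ones_like(input_ids))
```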
I checked the source code in bart/adapter_model.py and modeling_bart.py (in transformers) and found that BartAdapterModel's forward() function contains the same ValueError check as BartForSequenceClassification. I don't think it is necessary for a sequence labeling task. When doing masked language modeling, all the input text is concatenated and then grouped into chunks of 1024 tokens, so it is common for examples to end up with different numbers of <eos> tokens. The error occurs even if I add a seq2seq_lm_head.
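For reference, the check I mean is, roughly, the <eos>-pooling guard used by BartForSequenceClassification, rewritten here from memory as a standalone snippet, so the exact lines in modeling_bart.py / bart/adapter_model.py may differ by version:

```python
import torch

def eos_count_check(input_ids: torch.Tensor, eos_token_id: int) -> None:
    # Mask of <eos> positions; the classification head pools the hidden state
    # of the last <eos> token, so it insists every row has the same count.
    eos_mask = input_ids.eq(eos_token_id)
    if len(torch.unique_consecutive(eos_mask.sum(1))) > 1:
        raise ValueError("All examples must have the same number of <eos> tokens.")

# Equal-length MLM-style chunks easily end up with different <eos> counts:
eos_count_check(torch.tensor([[0, 5, 6, 2], [0, 5, 2, 2]]), eos_token_id=2)  # raises ValueError
```

As far as I can tell, this pooling is only needed when a classification head wants a single sentence representation.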
Expected behavior
The mentioned ValueError should only be raised for classification tasks.