Pre-trained model raises IndexError: index out of range in self #85

shivanraptor opened this issue Sep 20, 2024 · 0 comments
I tried to test the performance of the pre-trained model using the code below:

from transformers import BertTokenizer, BartModel
model = BartModel.from_pretrained("/path/to/generated/pretrained/model/")
tokenizer = BertTokenizer.from_pretrained("/path/to/generated/pretrained/model/")

inputs = tokenizer("今天天氣很好", return_tensors="pt")
outputs = model(**inputs)

It raises:

TypeError: BartModel.forward() got an unexpected keyword argument 'token_type_ids'

Then, I used the following line to remove token_type_ids and retried:

del inputs['token_type_ids']
outputs = model(**inputs)

Then, it shows:

IndexError: index out of range in self

How do I resolve this issue?
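
(As an aside on the first error: I believe the tokenizer can simply be asked not to produce token_type_ids via the standard return_token_type_ids option, which avoids the deletion step; this is a guess based on the generic tokenizer call options, not anything specific to this repository. The IndexError below still occurs either way.)

inputs = tokenizer("今天天氣很好", return_tensors="pt", return_token_type_ids=False)
outputs = model(**inputs)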


The traceback is as follows:

╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ in <module>:1                                                                                    │
│                                                                                                  │
│ /home/jupyter-raptor/.local/lib/python3.10/site-packages/torch/nn/modules/module.py:1553 in      │
│ _wrapped_call_impl                                                                               │
│                                                                                                  │
│   1550 │   │   if self._compiled_call_impl is not None:                                          │
│   1551 │   │   │   return self._compiled_call_impl(*args, **kwargs)  # type: ignore[misc]        │
│   1552 │   │   else:                                                                             │
│ ❱ 1553 │   │   │   return self._call_impl(*args, **kwargs)                                       │
│   1554 │                                                                                         │
│   1555 │   def _call_impl(self, *args, **kwargs):                                                │
│   1556 │   │   forward_call = (self._slow_forward if torch._C._get_tracing_state() else self.fo  │
│                                                                                                  │
│ /home/jupyter-raptor/.local/lib/python3.10/site-packages/torch/nn/modules/module.py:1562 in      │
│ _call_impl                                                                                       │
│                                                                                                  │
│   1559 │   │   if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks   │
│   1560 │   │   │   │   or _global_backward_pre_hooks or _global_backward_hooks                   │
│   1561 │   │   │   │   or _global_forward_hooks or _global_forward_pre_hooks):                   │
│ ❱ 1562 │   │   │   return forward_call(*args, **kwargs)                                          │
│   1563 │   │                                                                                     │
│   1564 │   │   try:                                                                              │
│   1565 │   │   │   result = None                                                                 │
│                                                                                                  │
│ /home/jupyter-raptor/.local/lib/python3.10/site-packages/transformers/models/bart/modeling_bart. │
│ py:1222 in forward                                                                               │
│                                                                                                  │
│   1219 │   │   return_dict = return_dict if return_dict is not None else self.config.use_return  │
│   1220 │   │                                                                                     │
│   1221 │   │   if encoder_outputs is None:                                                       │
│ ❱ 1222 │   │   │   encoder_outputs = self.encoder(                                               │
│   1223 │   │   │   │   input_ids=input_ids,                                                      │
│   1224 │   │   │   │   attention_mask=attention_mask,                                            │
│   1225 │   │   │   │   head_mask=head_mask,                                                      │
│                                                                                                  │
│ /home/jupyter-raptor/.local/lib/python3.10/site-packages/torch/nn/modules/module.py:1553 in      │
│ _wrapped_call_impl                                                                               │
│                                                                                                  │
│   1550 │   │   if self._compiled_call_impl is not None:                                          │
│   1551 │   │   │   return self._compiled_call_impl(*args, **kwargs)  # type: ignore[misc]        │
│   1552 │   │   else:                                                                             │
│ ❱ 1553 │   │   │   return self._call_impl(*args, **kwargs)                                       │
│   1554 │                                                                                         │
│   1555 │   def _call_impl(self, *args, **kwargs):                                                │
│   1556 │   │   forward_call = (self._slow_forward if torch._C._get_tracing_state() else self.fo  │
│                                                                                                  │
│ /home/jupyter-raptor/.local/lib/python3.10/site-packages/torch/nn/modules/module.py:1562 in      │
│ _call_impl                                                                                       │
│                                                                                                  │
│   1559 │   │   if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks   │
│   1560 │   │   │   │   or _global_backward_pre_hooks or _global_backward_hooks                   │
│   1561 │   │   │   │   or _global_forward_hooks or _global_forward_pre_hooks):                   │
│ ❱ 1562 │   │   │   return forward_call(*args, **kwargs)                                          │
│   1563 │   │                                                                                     │
│   1564 │   │   try:                                                                              │
│   1565 │   │   │   result = None                                                                 │
│                                                                                                  │
│ /home/jupyter-raptor/.local/lib/python3.10/site-packages/transformers/models/bart/modeling_bart. │
│ py:799 in forward                                                                                │
│                                                                                                  │
│    796 │   │   │   raise ValueError("You have to specify either input_ids or inputs_embeds")     │
│    797 │   │                                                                                     │
│    798 │   │   if inputs_embeds is None:                                                         │
│ ❱  799 │   │   │   inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale               │
│    800 │   │                                                                                     │
│    801 │   │   embed_pos = self.embed_positions(input_shape)                                     │
│    802                                                                                           │
│                                                                                                  │
│ /home/jupyter-raptor/.local/lib/python3.10/site-packages/torch/nn/modules/module.py:1553 in      │
│ _wrapped_call_impl                                                                               │
│                                                                                                  │
│   1550 │   │   if self._compiled_call_impl is not None:                                          │
│   1551 │   │   │   return self._compiled_call_impl(*args, **kwargs)  # type: ignore[misc]        │
│   1552 │   │   else:                                                                             │
│ ❱ 1553 │   │   │   return self._call_impl(*args, **kwargs)                                       │
│   1554 │                                                                                         │
│   1555 │   def _call_impl(self, *args, **kwargs):                                                │
│   1556 │   │   forward_call = (self._slow_forward if torch._C._get_tracing_state() else self.fo  │
│                                                                                                  │
│ /home/jupyter-raptor/.local/lib/python3.10/site-packages/torch/nn/modules/module.py:1562 in      │
│ _call_impl                                                                                       │
│                                                                                                  │
│   1559 │   │   if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks   │
│   1560 │   │   │   │   or _global_backward_pre_hooks or _global_backward_hooks                   │
│   1561 │   │   │   │   or _global_forward_hooks or _global_forward_pre_hooks):                   │
│ ❱ 1562 │   │   │   return forward_call(*args, **kwargs)                                          │
│   1563 │   │                                                                                     │
│   1564 │   │   try:                                                                              │
│   1565 │   │   │   result = None                                                                 │
│                                                                                                  │
│ /home/jupyter-raptor/.local/lib/python3.10/site-packages/torch/nn/modules/sparse.py:164 in       │
│ forward                                                                                          │
│                                                                                                  │
│   161 │   │   │   │   self.weight[self.padding_idx].fill_(0)                                     │
│   162 │                                                                                          │
│   163 │   def forward(self, input: Tensor) -> Tensor:                                            │
│ ❱ 164 │   │   return F.embedding(                                                                │
│   165 │   │   │   input, self.weight, self.padding_idx, self.max_norm,                           │
│   166 │   │   │   self.norm_type, self.scale_grad_by_freq, self.sparse)                          │
│   167                                                                                            │
│                                                                                                  │
│ /home/jupyter-raptor/.local/lib/python3.10/site-packages/torch/nn/functional.py:2267 in          │
│ embedding                                                                                        │
│                                                                                                  │
│   2264 │   │   #   torch.embedding_renorm_                                                       │
│   2265 │   │   # remove once script supports set_grad_enabled                                    │
│   2266 │   │   _no_grad_embedding_renorm_(weight, input, max_norm, norm_type)                    │
│ ❱ 2267 │   return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)        │
│   2268                                                                                           │
│   2269                                                                                           │
│   2270 def embedding_bag(                                                                        │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
IndexError: index out of range in self
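
My current guess is that the IndexError means the BertTokenizer produces ids that fall outside the BART model's embedding table, i.e. the tokenizer vocabulary does not match the model's vocab_size. Below is a minimal check along those lines (the path is the same placeholder as above, and resize_token_embeddings is only one possible workaround if the sizes really do differ):

from transformers import BertTokenizer, BartModel

model_dir = "/path/to/generated/pretrained/model/"  # placeholder path, as above
model = BartModel.from_pretrained(model_dir)
tokenizer = BertTokenizer.from_pretrained(model_dir)

# Compare the tokenizer vocabulary with the model's embedding table.
print("tokenizer vocab size:", len(tokenizer))
print("model vocab_size:    ", model.config.vocab_size)
print("embedding rows:      ", model.get_input_embeddings().num_embeddings)

# Any input id >= the number of embedding rows would trigger this IndexError.
inputs = tokenizer("今天天氣很好", return_tensors="pt", return_token_type_ids=False)
print("max input id:", inputs["input_ids"].max().item())

# One possible workaround if the tokenizer is larger than the model's embedding table:
# model.resize_token_embeddings(len(tokenizer))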