Remove a redundant variable. (#27288)
* Removed the redundant SiLUActivation class; nn.functional.silu is now used directly.

* I apologize for adding torch.functional.silu. I have replaced it with nn.SiLU.

* Remove redundant variable in feature_extraction file
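For context on the SiLU change mentioned in the bullets above: SiLU (also called swish) computes x · sigmoid(x), which is what nn.functional.silu provides, making a separate activation class redundant. A minimal pure-Python sketch of the function (an illustration only, not the actual PyTorch implementation):

```python
import math

def silu(x: float) -> float:
    # SiLU / swish activation: x * sigmoid(x),
    # the same function torch.nn.functional.silu computes elementwise
    return x * (1.0 / (1.0 + math.exp(-x)))

print(silu(0.0))  # sigmoid(0) = 0.5, so silu(0) = 0.0
```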
hi-sushanta authored Nov 7, 2023
1 parent 88832c0 commit 9459d82
Showing 1 changed file with 1 addition and 2 deletions.
3 changes: 1 addition & 2 deletions src/transformers/pipelines/feature_extraction.py

@@ -77,8 +77,7 @@ def _sanitize_parameters(self, truncation=None, tokenize_kwargs=None, return_ten
         return preprocess_params, {}, postprocess_params

     def preprocess(self, inputs, **tokenize_kwargs) -> Dict[str, GenericTensor]:
-        return_tensors = self.framework
-        model_inputs = self.tokenizer(inputs, return_tensors=return_tensors, **tokenize_kwargs)
+        model_inputs = self.tokenizer(inputs, return_tensors=self.framework, **tokenize_kwargs)
         return model_inputs

     def _forward(self, model_inputs):
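The diff above is a standard inline-a-single-use-variable refactor: the local return_tensors held self.framework only to pass it along once. A generic sketch with hypothetical names, showing the before and after are behaviorally identical:

```python
def tokenize_before(tokenizer, inputs, framework):
    # before: value bound to a single-use local, then passed through
    return_tensors = framework
    return tokenizer(inputs, return_tensors=return_tensors)

def tokenize_after(tokenizer, inputs, framework):
    # after: value passed directly, one line shorter
    return tokenizer(inputs, return_tensors=framework)

# stand-in tokenizer that just records its arguments
fake_tokenizer = lambda inputs, return_tensors: {"inputs": inputs, "return_tensors": return_tensors}
print(tokenize_before(fake_tokenizer, "hi", "pt") == tokenize_after(fake_tokenizer, "hi", "pt"))  # True
```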
