
Commit

Revert "Fix typing annotations for FSDP and DeepSpeed in TrainingArguments" (#24574)

Revert "Fix typing annotations for FSDP and DeepSpeed in TrainingArguments (#24549)"

This reverts commit c5e29d4.
sgugger authored Jun 29, 2023
1 parent 4f1b31c commit 2dc5e1a
Showing 1 changed file with 5 additions and 5 deletions.
src/transformers/training_args.py (5 additions, 5 deletions)
@@ -976,12 +976,12 @@ class TrainingArguments:
             )
         },
     )
-    fsdp_config: Optional[Union[str, Dict]] = field(
+    fsdp_config: Optional[str] = field(
         default=None,
         metadata={
             "help": (
-                "Config to be used with FSDP (Pytorch Fully Sharded Data Parallel). The value is either a"
-                "fsdp json config file (e.g., `fsdp_config.json`) or an already loaded json file as `dict`."
+                "Config to be used with FSDP (Pytorch Fully Sharded Data Parallel). The value is either a"
+                "fsdp json config file (e.g., `fsdp_config.json`) or an already loaded json file as `dict`."
             )
         },
     )
@@ -994,11 +994,11 @@ class TrainingArguments:
             )
         },
     )
-    deepspeed: Optional[Union[str, Dict]] = field(
+    deepspeed: Optional[str] = field(
         default=None,
         metadata={
             "help": (
-                "Enable deepspeed and pass the path to deepspeed json config file (e.g. `ds_config.json`) or an already"
+                "Enable deepspeed and pass the path to deepspeed json config file (e.g. ds_config.json) or an already"
                 " loaded json file as a dict"
             )
         },
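The revert narrows the annotations on `fsdp_config` and `deepspeed` from `Optional[Union[str, Dict]]` back to `Optional[str]`; the commit message itself gives no rationale. A minimal sketch of the dataclass-field pattern the diff touches, assuming a stand-in class (`MiniArguments` is illustrative, not the real `TrainingArguments`):

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class MiniArguments:
    # After the revert the annotation is Optional[str]: a path to a json
    # config file. The help text still mentions an already loaded dict,
    # which the consuming code may accept despite the narrower annotation.
    fsdp_config: Optional[str] = field(
        default=None,
        metadata={"help": "Path to an fsdp json config file, e.g. fsdp_config.json"},
    )
    deepspeed: Optional[str] = field(
        default=None,
        metadata={"help": "Path to a deepspeed json config file, e.g. ds_config.json"},
    )


args = MiniArguments(deepspeed="ds_config.json")
print(args.deepspeed)  # ds_config.json
```

The `metadata={"help": ...}` entries are how argument-parsing tooling built on dataclasses (such as `HfArgumentParser`) derives CLI help strings from the field definitions.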
