Commit

remove_layer_mentions
elephaint committed Sep 24, 2024
1 parent a7b6e73 commit 7bc1b5d
Showing 4 changed files with 6 additions and 6 deletions.
2 changes: 1 addition & 1 deletion nbs/docs/capabilities/forecast/07_finetuning.ipynb
Original file line number Diff line number Diff line change
Expand Up @@ -146,7 +146,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"By default, only the last layer of the model is finetuned. We can finetune more layers using `finetune_depth` parameter. "
"By default, only a small amount of finetuning is applied (`finetune_depth=1`). We can increase the intensity of finetuning by increasing the `finetune_depth` parameter. Note that increasing `finetune_depth` and `finetune_steps` increases wall time for generating predictions."
]
},
{
Expand Down
2 changes: 1 addition & 1 deletion nbs/docs/tutorials/06_finetuning.ipynb
Original file line number Diff line number Diff line change
Expand Up @@ -326,7 +326,7 @@
"### 3.1 Control the level of fine-tuning with `finetune_depth`\n",
"It is also possible to control the depth of fine-tuning with the `finetune_depth` parameter.\n",
"\n",
"`finetune_depth` takes values among `[1, 2, 3, 4, 5]`. By default, it is set to 1, which means that fine-tuning is shallow, whereas a value of 5 means the maximum amount of fine-tuning is done. Increasing `finetune_depth` increases the time to generate predictions."
"`finetune_depth` takes values among `[1, 2, 3, 4, 5]`. By default, it is set to 1, which means a small amount of fine-tuning is applied, whereas a value of 5 means the maximum amount of fine-tuning is done. Increasing `finetune_depth` increases the time to generate predictions."
]
},
{
Expand Down
4 changes: 2 additions & 2 deletions nbs/src/nixtla_client.ipynb
Original file line number Diff line number Diff line change
Expand Up @@ -1392,8 +1392,8 @@
" Number of steps used to finetune TimeGPT in the\n",
" new data.\n",
" finetune_depth: int (default=1)\n",
" Depth of finetuning, on a scale from 1 to 5. 1 means only the last layer is finetuned\n",
" and 5 means the whole model if finetuned.\n",
" The depth of the finetuning. Uses a scale from 1 to 5, where 1 means little finetuning,\n",
" and 5 means that the entire model is finetuned. By default, the value is set to 1.\n",
" finetune_loss : str (default='default')\n",
" Loss function to use for finetuning. Options are: `default`, `mae`, `mse`, `rmse`, `mape`, and `smape`.\n",
" clean_ex_first : bool (default=True)\n",
Expand Down
4 changes: 2 additions & 2 deletions nixtla/nixtla_client.py
Original file line number Diff line number Diff line change
Expand Up @@ -1344,8 +1344,8 @@ def cross_validation(
Number of steps used to finetune TimeGPT in the
new data.
finetune_depth: int (default=1)
Depth of finetuning, on a scale from 1 to 5. 1 means only the last layer is finetuned
and 5 means the whole model if finetuned.
The depth of the finetuning. Uses a scale from 1 to 5, where 1 means little finetuning,
and 5 means that the entire model is finetuned. By default, the value is set to 1.
finetune_loss : str (default='default')
Loss function to use for finetuning. Options are: `default`, `mae`, `mse`, `rmse`, `mape`, and `smape`.
clean_ex_first : bool (default=True)
Expand Down
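The updated docstrings in this commit describe `finetune_depth` as an integer on a scale from 1 to 5, where 1 (the default) applies light finetuning and 5 finetunes the entire model. A minimal sketch of how a caller-side check of that contract might look; the helper name and the standalone validation are hypothetical illustrations, not code from the nixtla library:

```python
# Hypothetical sketch (not part of the nixtla library): validate the
# `finetune_depth` argument against the contract the docstring describes,
# i.e. an int on a scale from 1 to 5, defaulting to 1.
VALID_FINETUNE_DEPTHS = (1, 2, 3, 4, 5)

def check_finetune_depth(finetune_depth: int = 1) -> int:
    """Return finetune_depth unchanged if valid, else raise ValueError."""
    if finetune_depth not in VALID_FINETUNE_DEPTHS:
        raise ValueError(
            f"finetune_depth must be one of {VALID_FINETUNE_DEPTHS}, "
            f"got {finetune_depth!r}"
        )
    return finetune_depth
```

In actual use, the value is simply passed through to the client methods documented above, e.g. `NixtlaClient.forecast(..., finetune_steps=10, finetune_depth=2)`; per the notebook text, larger values of `finetune_depth` and `finetune_steps` increase the time to generate predictions.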
