From bdf89efd113944f0b00ca6733b8acb905022471e Mon Sep 17 00:00:00 2001 From: yinon Date: Fri, 4 Aug 2023 14:42:28 +0000 Subject: [PATCH] pytorch - improve docs --- docs/freqai-parameter-table.md | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/docs/freqai-parameter-table.md b/docs/freqai-parameter-table.md index de0b666ca..95687c7ab 100644 --- a/docs/freqai-parameter-table.md +++ b/docs/freqai-parameter-table.md @@ -100,12 +100,12 @@ Mandatory parameters are marked as **Required** and have to be set in one of the #### trainer_kwargs -| Parameter | Description | -|----------------------|-------------| -| | **Model training parameters within the `freqai.model_training_parameters.model_kwargs` sub dictionary** -| `n_epochs` | The `n_epochs` parameter is a crucial setting in the PyTorch training loop that determines the number of times the entire training dataset will be used to update the model's parameters. An epoch represents one full pass through the entire training dataset.
**Datatype:** int.
Default: `10`. -| `n_steps` | An alternative way of setting `n_epochs` - the number of training iterations to run. Iteration here refer to the number of times we call `optimizer.step()`. a simplified version of the function:

n_epochs = n_steps / (n_obs / batch_size)

The motivation here is that `n_steps` is easier to optimize and keep stable across different n_obs - the number of data points.

**Datatype:** int. optional.
Default: `None`. -| `batch_size` | The size of the batches to use during training..
**Datatype:** int.
Default: `64`. +| Parameter | Description | +|--------------|-------------| +| | **Model training parameters within the `freqai.model_training_parameters.model_kwargs` sub dictionary** +| `n_epochs` | The `n_epochs` parameter is a crucial setting in the PyTorch training loop that determines the number of times the entire training dataset will be used to update the model's parameters. An epoch represents one full pass through the entire training dataset. Overrides `n_steps`. Either `n_epochs` or `n_steps` must be set.

**Datatype:** int. Optional.<br>
Default: `10`. +| `n_steps`   | An alternative way of setting `n_epochs` - the number of training iterations to run. An iteration here refers to one call to `optimizer.step()`. Ignored if `n_epochs` is set. A simplified version of the relationship:<br>

n_epochs = n_steps / (n_obs / batch_size)

The motivation here is that `n_steps` is easier to optimize and keep stable across different values of `n_obs` (the number of data points).<br>

**Datatype:** int. Optional.<br>
Default: `None`. +| `batch_size` | The size of the batches to use during training.

**Datatype:** int.
Default: `64`. ### Additional parameters
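The `n_steps` / `n_epochs` relationship added in this patch can be sketched as a small Python helper. This is an illustrative example only, not freqtrade code: the function name `epochs_from_steps` and the sample numbers (`6400` data points, `500` steps) are assumptions chosen to demonstrate the formula from the table.

```python
import math

def epochs_from_steps(n_steps: int, n_obs: int, batch_size: int) -> int:
    """Approximate number of epochs implied by a fixed optimizer-step budget,
    per the simplified relation: n_epochs = n_steps / (n_obs / batch_size)."""
    steps_per_epoch = n_obs / batch_size  # optimizer.step() calls per full pass
    return max(1, math.ceil(n_steps / steps_per_epoch))

# With 6400 data points and the default batch_size of 64, a budget of
# 500 optimizer steps corresponds to 5 passes over the data:
print(epochs_from_steps(500, 6400, 64))  # -> 5
```

This also shows why `n_steps` is easier to keep stable: the step budget is fixed, while the implied epoch count adapts as `n_obs` changes between retrains.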