Deployed 2a0dd4c to develop in en with MkDocs 1.6.1 and mike 2.1.3

Author: github-actions[bot]
Date: 2025-05-09 07:53:41 +00:00
parent 6e806f4313
commit 5a7412daff
6 changed files with 122 additions and 127 deletions


@@ -2516,9 +2516,9 @@
<h1 id="hyperopt">Hyperopt<a class="headerlink" href="#hyperopt" title="Permanent link">&para;</a></h1>
<p>This page explains how to tune your strategy by finding the optimal
-parameters, a process called hyperparameter optimization. The bot uses algorithms included in the <code>scikit-optimize</code> package to accomplish this.
+parameters, a process called hyperparameter optimization. The bot uses algorithms included in the <code>optuna</code> package to accomplish this.
The search will burn all your CPU cores, make your laptop sound like a fighter jet and still take a long time.</p>
<p>In general, the search for best parameters starts with a few random combinations (see <a href="#reproducible-results">below</a> for more details) and then uses Bayesian search with a ML regressor algorithm (currently ExtraTreesRegressor) to quickly find a combination of parameters in the search hyperspace that minimizes the value of the <a href="#loss-functions">loss function</a>.</p>
<p>In general, the search for best parameters starts with a few random combinations (see <a href="#reproducible-results">below</a> for more details) and then uses one of optuna's sampler algorithms (currently NSGAIIISampler) to quickly find a combination of parameters in the search hyperspace that minimizes the value of the <a href="#loss-functions">loss function</a>.</p>
<p>Hyperopt requires historic data to be available, just as backtesting does (hyperopt runs backtesting many times with different parameters).
To learn how to get data for the pairs and exchange you're interested in, head over to the <a href="../data-download/">Data Downloading</a> section of the documentation.</p>
<div class="admonition bug">
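For context on the updated paragraph above that now names NSGAIIISampler: the following is a minimal, self-contained sketch of how optuna's NSGAIIISampler can drive a search that minimizes a loss over a parameter space. The parameter names and the objective function are hypothetical placeholders, not freqtrade's actual hyperopt integration or loss function.

# Sketch: random startup trials followed by NSGA-III-guided sampling,
# minimizing a stand-in loss. A real run would backtest each parameter
# combination and score the results instead.
import optuna


def objective(trial: optuna.Trial) -> float:
    # Hypothetical "hyperspace": two tunable parameters.
    buy_rsi = trial.suggest_int("buy_rsi", 10, 40)
    stoploss = trial.suggest_float("stoploss", -0.35, -0.02)
    # Stand-in loss value to be minimized.
    return (buy_rsi - 30) ** 2 + abs(stoploss + 0.10)


sampler = optuna.samplers.NSGAIIISampler(seed=42)  # fixed seed -> reproducible results
study = optuna.create_study(direction="minimize", sampler=sampler)
study.optimize(objective, n_trials=50)
print(study.best_params, study.best_value)

NSGA-III starts from a randomly initialized population of candidates, which matches the "few random combinations" described in the paragraph above, and fixing the sampler seed is what makes a run reproducible.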