mirror of
https://github.com/freqtrade/freqtrade.git
synced 2026-01-26 00:40:23 +00:00
Merge remote-tracking branch 'upstream/develop' into feature/fetch-public-trades
.github/workflows/ci.yml (vendored, 14 changes)
@@ -124,8 +124,11 @@ jobs:
     runs-on: ${{ matrix.os }}
     strategy:
       matrix:
-        os: [ "macos-latest", "macos-13" ]
+        os: [ "macos-latest", "macos-13", "macos-14" ]
         python-version: ["3.9", "3.10", "3.11", "3.12"]
+        exclude:
+          - os: "macos-14"
+            python-version: "3.9"

     steps:
     - uses: actions/checkout@v4
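The added `exclude` entry drops a single combination from the expanded job matrix (macOS 14 runners lack Python 3.9 here). As a rough sketch of the matrix-expansion semantics — the names below are illustrative, not part of the workflow itself:

```python
from itertools import product

# Hypothetical reconstruction of how a CI matrix expands:
# full cross product, minus combinations matched by an exclude rule.
os_list = ["macos-latest", "macos-13", "macos-14"]
python_versions = ["3.9", "3.10", "3.11", "3.12"]
excludes = [{"os": "macos-14", "python-version": "3.9"}]

jobs = [
    {"os": o, "python-version": p}
    for o, p in product(os_list, python_versions)
    if {"os": o, "python-version": p} not in excludes
]

print(len(jobs))  # 3 * 4 combinations minus 1 exclusion = 11
```

So adding `macos-14` to the matrix yields 4 new jobs, and the exclude trims that to 3.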
@@ -154,7 +157,7 @@ jobs:
       run: |
         cd build_helpers && ./install_ta-lib.sh ${HOME}/dependencies/; cd ..

-    - name: Installation - macOS
+    - name: Installation - macOS (Brew)
       run: |
         # brew update
         # TODO: Should be the brew upgrade
@@ -177,6 +180,9 @@ jobs:
         rm /usr/local/bin/python3.12-config || true

+        brew install hdf5 c-blosc libomp
+
     - name: Installation (python)
       run: |
         python -m pip install --upgrade pip wheel
         export LD_LIBRARY_PATH=${HOME}/dependencies/lib:$LD_LIBRARY_PATH
         export TA_LIBRARY_PATH=${HOME}/dependencies/lib
@@ -482,12 +488,12 @@ jobs:
         path: dist

     - name: Publish to PyPI (Test)
-      uses: pypa/gh-action-pypi-publish@v1.8.12
+      uses: pypa/gh-action-pypi-publish@v1.8.14
       with:
         repository-url: https://test.pypi.org/legacy/

     - name: Publish to PyPI
-      uses: pypa/gh-action-pypi-publish@v1.8.12
+      uses: pypa/gh-action-pypi-publish@v1.8.14


   deploy-docker:
@@ -16,9 +16,9 @@ repos:
         additional_dependencies:
           - types-cachetools==5.3.0.7
           - types-filelock==3.2.7
-          - types-requests==2.31.0.20240218
+          - types-requests==2.31.0.20240311
           - types-tabulate==0.9.0.20240106
-          - types-python-dateutil==2.8.19.20240106
+          - types-python-dateutil==2.8.19.20240311
           - SQLAlchemy==2.0.27
         # stages: [push]
@@ -31,7 +31,7 @@ repos:

   - repo: https://github.com/charliermarsh/ruff-pre-commit
     # Ruff version.
-    rev: 'v0.2.2'
+    rev: 'v0.3.0'
     hooks:
       - id: ruff
@@ -109,12 +109,12 @@ automatically accessible by including them on the indicator-list, and these incl
 - **open_date :** trade open datetime
 - **close_date :** trade close datetime
 - **min_rate :** minimum price seen throughout the position
-- **max_rate :** maxiumum price seen throughout the position
+- **max_rate :** maximum price seen throughout the position
 - **open :** signal candle open price
 - **close :** signal candle close price
 - **high :** signal candle high price
 - **low :** signal candle low price
-- **volume :** signal candle volumne
+- **volume :** signal candle volume
 - **profit_ratio :** trade profit ratio
 - **profit_abs :** absolute profit return of the trade
@@ -75,7 +75,7 @@ Mandatory parameters are marked as **Required** and have to be set in one of the
| `rl_config` | A dictionary containing the control parameters for a Reinforcement Learning model. <br> **Datatype:** Dictionary.
| `train_cycles` | Training time steps will be set based on `train_cycles` * number of training data points. <br> **Datatype:** Integer.
| `max_trade_duration_candles`| Guides the agent training to keep trades below desired length. Example usage shown in `prediction_models/ReinforcementLearner.py` within the customizable `calculate_reward()` function. <br> **Datatype:** int.
-| `model_type` | Model string from stable_baselines3 or SBcontrib. Available strings include: `'TRPO', 'ARS', 'RecurrentPPO', 'MaskablePPO', 'PPO', 'A2C', 'DQN'`. User should ensure that `model_training_parameters` match those available to the corresponding stable_baselines3 model by visiting their documentaiton. [PPO doc](https://stable-baselines3.readthedocs.io/en/master/modules/ppo.html) (external website) <br> **Datatype:** string.
+| `model_type` | Model string from stable_baselines3 or SBcontrib. Available strings include: `'TRPO', 'ARS', 'RecurrentPPO', 'MaskablePPO', 'PPO', 'A2C', 'DQN'`. User should ensure that `model_training_parameters` match those available to the corresponding stable_baselines3 model by visiting their documentation. [PPO doc](https://stable-baselines3.readthedocs.io/en/master/modules/ppo.html) (external website) <br> **Datatype:** string.
| `policy_type` | One of the available policy types from stable_baselines3 <br> **Datatype:** string.
| `max_training_drawdown_pct` | The maximum drawdown that the agent is allowed to experience during training. <br> **Datatype:** float. <br> Default: 0.8
| `cpu_count` | Number of threads/cpus to dedicate to the Reinforcement Learning training process (depending on if `ReinforcementLearning_multiproc` is selected or not). Recommended to leave this untouched; by default, this value is set to the total number of physical cores minus 1. <br> **Datatype:** int.
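The table above points at the customizable `calculate_reward()` in `prediction_models/ReinforcementLearner.py`, whose penalties work best when they scale smoothly with severity. As a toy illustration of a smoothly scaled trade-duration penalty — names and constants here are hypothetical, not freqtrade's API:

```python
import math

def duration_penalty(trade_duration_candles: int, max_duration: int = 100) -> float:
    """Toy reward term: a penalty that deepens smoothly (exponentially)
    with trade duration, instead of one large penalty at a hard cutoff."""
    return -0.01 * math.exp(trade_duration_candles / max_duration)

early = duration_penalty(10)
late = duration_penalty(90)
assert late < early < 0  # penalty grows continuously with duration
```

A continuous term like this is easier for the policy network to learn from than a single large spike at an arbitrary threshold.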
@@ -142,7 +142,7 @@ Parameter details can be found [here](freqai-parameter-table.md), but in general

As you begin to modify the strategy and the prediction model, you will quickly realize some important differences between the Reinforcement Learner and the Regressors/Classifiers. Firstly, the strategy does not set a target value (no labels!). Instead, you set the `calculate_reward()` function inside the `MyRLEnv` class (see below). A default `calculate_reward()` is provided inside `prediction_models/ReinforcementLearner.py` to demonstrate the necessary building blocks for creating rewards, but this is *not* designed for production. Users *must* create their own custom reinforcement learning model class or use a pre-built one from outside the Freqtrade source code and save it to `user_data/freqaimodels`. It is inside the `calculate_reward()` where creative theories about the market can be expressed. For example, you can reward your agent when it makes a winning trade, and penalize the agent when it makes a losing trade. Or perhaps, you wish to reward the agent for entering trades, and penalize the agent for sitting in trades too long. Below we show examples of how these rewards are all calculated:

!!! note "Hint"
-    The best reward functions are ones that are continuously differentiable, and well scaled. In other words, adding a single large negative penalty to a rare event is not a good idea, and the neural net will not be able to learn that function. Instead, it is better to add a small negative penalty to a common event. This will help the agent learn faster. Not only this, but you can help improve the continuity of your rewards/penalties by having them scale with severity according to some linear/exponential functions. In other words, you'd slowly scale the penalty as the duration of the trade increases. This is better than a single large penalty occuring at a single point in time.
+    The best reward functions are ones that are continuously differentiable, and well scaled. In other words, adding a single large negative penalty to a rare event is not a good idea, and the neural net will not be able to learn that function. Instead, it is better to add a small negative penalty to a common event. This will help the agent learn faster. Not only this, but you can help improve the continuity of your rewards/penalties by having them scale with severity according to some linear/exponential functions. In other words, you'd slowly scale the penalty as the duration of the trade increases. This is better than a single large penalty occurring at a single point in time.

```python
from freqtrade.freqai.prediction_models.ReinforcementLearner import ReinforcementLearner
@@ -1,6 +1,6 @@
markdown==3.5.2
mkdocs==1.5.3
-mkdocs-material==9.5.12
+mkdocs-material==9.5.13
mdx_truly_sane_lists==1.3
-pymdown-extensions==10.7
+pymdown-extensions==10.7.1
jinja2==3.1.3
@@ -11,34 +11,129 @@ The call sequence of the methods described here is covered under [bot execution

!!! Tip
    Start off with a strategy template containing all available callback methods by running `freqtrade new-strategy --strategy MyAwesomeStrategy --template advanced`

-## Storing information
+## Storing information (Persistent)

-Storing information can be accomplished by creating a new dictionary within the strategy class.
+Freqtrade allows storing/retrieving user custom information associated with a specific trade in the database.

-The name of the variable can be chosen at will, but should be prefixed with `custom_` to avoid naming collisions with predefined strategy variables.
+Using a trade object, information can be stored using `trade.set_custom_data(key='my_key', value=my_value)` and retrieved using `trade.get_custom_data(key='my_key')`. Each data entry is associated with a trade and a user supplied key (of type `string`). This means that this can only be used in callbacks that also provide a trade object.
+
+For the data to be able to be stored within the database, freqtrade must serialize the data. This is done by converting the data to a JSON formatted string.
+Freqtrade will attempt to reverse this action on retrieval, so from a strategy perspective, this should not be relevant.

```python
+from freqtrade.persistence import Trade
+from datetime import timedelta
+
class AwesomeStrategy(IStrategy):
-    # Create custom dictionary
-    custom_info = {}
-
-    def populate_indicators(self, dataframe: DataFrame, metadata: dict) -> DataFrame:
-        # Check if the entry already exists
-        if not metadata["pair"] in self.custom_info:
-            # Create empty entry for this pair
-            self.custom_info[metadata["pair"]] = {}
-
-        if "crosstime" in self.custom_info[metadata["pair"]]:
-            self.custom_info[metadata["pair"]]["crosstime"] += 1
-        else:
-            self.custom_info[metadata["pair"]]["crosstime"] = 1
+    def bot_loop_start(self, **kwargs) -> None:
+        for trade in Trade.get_open_order_trades():
+            fills = trade.select_filled_orders(trade.entry_side)
+            if trade.pair == 'ETH/USDT':
+                trade_entry_type = trade.get_custom_data(key='entry_type')
+                if trade_entry_type is None:
+                    trade_entry_type = 'breakout' if 'entry_1' in trade.enter_tag else 'dip'
+                elif fills > 1:
+                    trade_entry_type = 'buy_up'
+                trade.set_custom_data(key='entry_type', value=trade_entry_type)
+        return super().bot_loop_start(**kwargs)
+
+    def adjust_entry_price(self, trade: Trade, order: Optional[Order], pair: str,
+                           current_time: datetime, proposed_rate: float, current_order_rate: float,
+                           entry_tag: Optional[str], side: str, **kwargs) -> float:
+        # Limit orders to use and follow SMA200 as price target for the first 10 minutes since entry trigger for BTC/USDT pair.
+        if (
+            pair == 'BTC/USDT'
+            and entry_tag == 'long_sma200'
+            and side == 'long'
+            and (current_time - timedelta(minutes=10)) > trade.open_date_utc
+            and order.filled == 0.0
+        ):
+            dataframe, _ = self.dp.get_analyzed_dataframe(pair=pair, timeframe=self.timeframe)
+            current_candle = dataframe.iloc[-1].squeeze()
+            # store information about entry adjustment
+            existing_count = trade.get_custom_data('num_entry_adjustments', default=0)
+            if not existing_count:
+                existing_count = 1
+            else:
+                existing_count += 1
+            trade.set_custom_data(key='num_entry_adjustments', value=existing_count)
+
+            # adjust order price
+            return current_candle['sma_200']
+
+        # default: maintain existing order
+        return current_order_rate
+
+    def custom_exit(self, pair: str, trade: Trade, current_time: datetime, current_rate: float, current_profit: float, **kwargs):
+
+        entry_adjustment_count = trade.get_custom_data(key='num_entry_adjustments')
+        trade_entry_type = trade.get_custom_data(key='entry_type')
+        if entry_adjustment_count is None:
+            if current_profit > 0.01 and (current_time - timedelta(minutes=100) > trade.open_date_utc):
+                return True, 'exit_1'
+        else:
+            if entry_adjustment_count > 0 and current_profit > 0.05:
+                return True, 'exit_2'
+        if trade_entry_type == 'breakout' and current_profit > 0.1:
+            return True, 'exit_3'
+
+        return False, None
```

!!! Warning
-    The data is not persisted after a bot-restart (or config-reload). Also, the amount of data should be kept smallish (no DataFrames and such), otherwise the bot will start to consume a lot of memory and eventually run out of memory and crash.
+    The above is a simple example - there are simpler ways to retrieve trade data like entry-adjustments.

!!! Note
-    If the data is pair-specific, make sure to use pair as one of the keys in the dictionary.
+    It is recommended that simple data types are used `[bool, int, float, str]` to ensure no issues when serializing the data that needs to be stored.
+    Storing big chunks of data may lead to unintended side-effects, like a database becoming big (and as a consequence, also slow).
+
+!!! Warning "Non-serializable data"
+    If supplied data cannot be serialized a warning is logged and the entry for the specified `key` will contain `None` as data.
+
+??? Note "All attributes"
+    custom-data has the following accessors through the Trade object (assumed as `trade` below):
+
+    * `trade.get_custom_data(key='something', default=0)` - Returns the actual value given in the type provided.
+    * `trade.get_custom_data_entry(key='something')` - Returns the entry - including metadata. The value is accessible via the `.value` property.
+    * `trade.set_custom_data(key='something', value={'some': 'value'})` - set or update the corresponding key for this trade. Value must be serializable - and we recommend keeping the stored data relatively small.
+
+    "value" can be any type (both in setting and receiving) - but must be JSON serializable.
+
+## Storing information (Non-Persistent)
+
+!!! Warning "Deprecated"
+    This method of storing information is deprecated and we advise against using non-persistent storage.
+    Please use [Persistent Storage](#storing-information-persistent) instead.
+
+    Its content has therefore been collapsed.
+
+??? Abstract "Storing information"
+    Storing information can be accomplished by creating a new dictionary within the strategy class.
+
+    The name of the variable can be chosen at will, but should be prefixed with `custom_` to avoid naming collisions with predefined strategy variables.
+
+    ```python
+    class AwesomeStrategy(IStrategy):
+        # Create custom dictionary
+        custom_info = {}
+
+        def populate_indicators(self, dataframe: DataFrame, metadata: dict) -> DataFrame:
+            # Check if the entry already exists
+            if not metadata["pair"] in self.custom_info:
+                # Create empty entry for this pair
+                self.custom_info[metadata["pair"]] = {}
+
+            if "crosstime" in self.custom_info[metadata["pair"]]:
+                self.custom_info[metadata["pair"]]["crosstime"] += 1
+            else:
+                self.custom_info[metadata["pair"]]["crosstime"] = 1
+    ```
+
+    !!! Warning
+        The data is not persisted after a bot-restart (or config-reload). Also, the amount of data should be kept smallish (no DataFrames and such), otherwise the bot will start to consume a lot of memory and eventually run out of memory and crash.
+
+    !!! Note
+        If the data is pair-specific, make sure to use pair as one of the keys in the dictionary.

## Dataframe access
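The persistent storage documented in this hunk serializes values to a JSON formatted string before writing them to the database and reverses the conversion on retrieval. A minimal sketch of that round trip in plain Python (not the actual freqtrade internals):

```python
import json

def store(value):
    # freqtrade-style: convert the value to a JSON formatted string.
    return json.dumps(value)

def retrieve(raw):
    # ...and reverse the conversion on retrieval.
    return json.loads(raw)

stored = store({"entry_type": "breakout", "num_entry_adjustments": 2})
assert retrieve(stored) == {"entry_type": "breakout", "num_entry_adjustments": 2}

# Values json cannot handle raise TypeError - which is why the docs
# recommend sticking to simple types like bool/int/float/str.
try:
    store(object())
    raised = False
except TypeError:
    raised = True
assert raised
```

This is also why non-serializable data ends up stored as `None`: the wrapper catches the serialization failure and skips the write.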
@@ -19,7 +19,7 @@ from pathlib import Path
project_root = "somedir/freqtrade"
i=0
try:
-    os.chdirdir(project_root)
+    os.chdir(project_root)
    assert Path('LICENSE').is_file()
except:
    while i<4 and (not Path('LICENSE').is_file()):
@@ -59,7 +59,7 @@ For the Freqtrade configuration, you can then use the full value (including
    "chat_id": "-1001332619709"
    ```
!!! Warning "Using telegram groups"
-    When using telegram groups, you're giving every member of the telegram group access to your freqtrade bot and to all commands possible via telegram. Please make sure that you can trust everyone in the telegram group to avoid unpleasent surprises.
+    When using telegram groups, you're giving every member of the telegram group access to your freqtrade bot and to all commands possible via telegram. Please make sure that you can trust everyone in the telegram group to avoid unpleasant surprises.

## Control telegram noise
@@ -181,6 +181,7 @@ official commands. You can ask at any moment for help with `/help`.
| `/locks` | Show currently locked pairs.
| `/unlock <pair or lock_id>` | Remove the lock for this pair (or for this lock id).
| `/marketdir [long | short | even | none]` | Updates the user managed variable that represents the current market direction. If no direction is provided, the currently set direction will be displayed.
+| `/list_custom_data <trade_id> [key]` | List custom_data for Trade ID & Key combination. If no Key is supplied it will list all key-value pairs found for that Trade ID.
| **Modify Trade states** |
| `/forceexit <trade_id> | /fx <tradeid>` | Instantly exits the given trade (Ignoring `minimum_roi`).
| `/forceexit all | /fx all` | Instantly exits all open trades (Ignoring `minimum_roi`).
@@ -65,7 +65,7 @@ You can set the POST body format to Form-Encoded (default), JSON-Encoded, or raw

The result would be a POST request with e.g. `{"text":"Status: running"}` body and `Content-Type: application/json` header, which results in a `Status: running` message in the Mattermost channel.

-When using the Form-Encoded or JSON-Encoded configuration you can configure any number of payload values, and both the key and value will be ouput in the POST request. However, when using the raw data format you can only configure one value and it **must** be named `"data"`. In this instance the data key will not be output in the POST request, only the value. For example:
+When using the Form-Encoded or JSON-Encoded configuration you can configure any number of payload values, and both the key and value will be output in the POST request. However, when using the raw data format you can only configure one value and it **must** be named `"data"`. In this instance the data key will not be output in the POST request, only the value. For example:

```json
"webhook": {
@@ -680,7 +680,7 @@ class Exchange:

        candle_limit = self.ohlcv_candle_limit(
            timeframe, self._config['candle_type_def'],
-            int(date_minus_candles(timeframe, startup_candles).timestamp() * 1000)
+            dt_ts(date_minus_candles(timeframe, startup_candles))
            if timeframe else None)
        # Require one more candle - to account for the still open candle.
        candle_count = startup_candles + 1
@@ -2067,7 +2067,11 @@ class Exchange:

        if (not since_ms and (self._ft_has["ohlcv_require_since"] or not_all_data)):
            # Multiple calls for one pair - to get more history
-            since_ms = self.needed_candle_ms(timeframe, candle_type)
+            one_call = timeframe_to_msecs(timeframe) * self.ohlcv_candle_limit(
+                timeframe, candle_type, since_ms)
+            move_to = one_call * self.required_candle_call_count
+            now = timeframe_to_next_date(timeframe)
+            since_ms = dt_ts(now - timedelta(seconds=move_to // 1000))

        if since_ms:
            return self._async_get_historic_ohlcv(
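The inlined branch derives `since_ms` by walking back `required_candle_call_count` full calls' worth of candles from the next candle boundary. The arithmetic can be checked in isolation; the constants below are illustrative stand-ins for the values freqtrade's helpers would return:

```python
from datetime import datetime, timedelta, timezone

timeframe_ms = 5 * 60 * 1000          # e.g. a 5m timeframe, in milliseconds
candle_limit = 1000                   # exchange's per-call OHLCV limit
required_candle_call_count = 2        # how many calls of history are needed

one_call = timeframe_ms * candle_limit          # history covered by one call
move_to = one_call * required_candle_call_count # total history to fetch

now = datetime(2024, 3, 10, tzinfo=timezone.utc)
since = now - timedelta(seconds=move_to // 1000)
since_ms = int(since.timestamp() * 1000)

# 2 calls * 1000 candles * 5 minutes = 10000 minutes of history
assert (now - since) == timedelta(minutes=10_000)
```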
@@ -2669,7 +2673,7 @@ class Exchange:
        )

        if type(since) is datetime:
-            since = int(since.timestamp()) * 1000  # * 1000 for ms
+            since = dt_ts(since)

        try:
            funding_history = self._api.fetch_funding_history(
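This hunk, like the ones around it, replaces inline `int(dt.timestamp()) * 1000` conversions with `dt_ts()`. Assuming `dt_ts` is the thin datetime-to-millisecond wrapper that behavior implies, the equivalence for whole-second datetimes can be checked directly:

```python
from datetime import datetime, timezone

def dt_ts(dt: datetime) -> int:
    """Sketch of the assumed helper: datetime -> millisecond timestamp."""
    return int(dt.timestamp() * 1000)

since = datetime(2024, 3, 1, 12, 0, tzinfo=timezone.utc)
assert dt_ts(since) == int(since.timestamp()) * 1000
```

Centralizing the conversion removes the scattered `* 1000 for ms` comments and the risk of forgetting the factor at one call site.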
@@ -2999,7 +3003,7 @@ class Exchange:

        if not close_date:
            close_date = datetime.now(timezone.utc)
-        since_ms = int(timeframe_to_prev_date(timeframe, open_date).timestamp()) * 1000
+        since_ms = dt_ts(timeframe_to_prev_date(timeframe, open_date))

        mark_comb: PairWithTimeframe = (pair, timeframe, mark_price_type)
        funding_comb: PairWithTimeframe = (pair, timeframe_ff, CandleType.FUNDING_RATE)
@@ -962,7 +962,7 @@ class FreqtradeBot(LoggingMixin):
        # edge-case for now.
        min_stake_amount = self.exchange.get_min_pair_stake_amount(
            pair, enter_limit_requested,
-            self.strategy.stoploss if not mode != 'pos_adjust' else 0.0,
+            self.strategy.stoploss if not mode == 'pos_adjust' else 0.0,
            leverage)
        max_stake_amount = self.exchange.get_max_pair_stake_amount(
            pair, enter_limit_requested, leverage)
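The one-token fix above flips the condition: `not mode != 'pos_adjust'` is equivalent to `mode == 'pos_adjust'`, so the old code passed the real stoploss only during position adjustment, the opposite of the apparent intent. A quick truth check with a simplified stand-in:

```python
def old_stoploss_arg(mode, stoploss=-0.1):
    # Before the fix: `not mode != 'pos_adjust'` == `mode == 'pos_adjust'`
    return stoploss if not mode != 'pos_adjust' else 0.0

def new_stoploss_arg(mode, stoploss=-0.1):
    # After the fix: `not mode == 'pos_adjust'` == `mode != 'pos_adjust'`
    return stoploss if not mode == 'pos_adjust' else 0.0

# Old: the real stoploss was only used *during* position adjustment.
assert old_stoploss_arg('pos_adjust') == -0.1
assert old_stoploss_arg('initial') == 0.0
# New: position adjustment gets 0.0, every other mode the real stoploss.
assert new_stoploss_arg('pos_adjust') == 0.0
assert new_stoploss_arg('initial') == -0.1
```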
@@ -33,8 +33,8 @@ from freqtrade.optimize.optimize_reports import (generate_backtest_stats, genera
                                                 show_backtest_results,
                                                 store_backtest_analysis_results,
                                                 store_backtest_stats)
-from freqtrade.persistence import (LocalTrade, Order, PairLocks, Trade, disable_database_use,
-                                   enable_database_use)
+from freqtrade.persistence import (CustomDataWrapper, LocalTrade, Order, PairLocks, Trade,
+                                   disable_database_use, enable_database_use)
from freqtrade.plugins.pairlistmanager import PairListManager
from freqtrade.plugins.protectionmanager import ProtectionManager
from freqtrade.resolvers import ExchangeResolver, StrategyResolver
@@ -337,6 +337,7 @@ class Backtesting:
        self.disable_database_use()
        PairLocks.reset_locks()
        Trade.reset_trades()
+        CustomDataWrapper.reset_custom_data()
        self.rejected_trades = 0
        self.timedout_entry_orders = 0
        self.timedout_exit_orders = 0
@@ -1,5 +1,6 @@
# flake8: noqa: F401

+from freqtrade.persistence.custom_data import CustomDataWrapper
from freqtrade.persistence.key_value_store import KeyStoreKeys, KeyValueStore
from freqtrade.persistence.models import init_db
from freqtrade.persistence.pairlock_middleware import PairLocks
freqtrade/persistence/custom_data.py (new file, 174 lines)
@@ -0,0 +1,174 @@
import json
import logging
from datetime import datetime
from typing import Any, ClassVar, List, Optional, Sequence

from sqlalchemy import DateTime, ForeignKey, Integer, String, Text, UniqueConstraint, select
from sqlalchemy.orm import Mapped, mapped_column, relationship

from freqtrade.constants import DATETIME_PRINT_FORMAT
from freqtrade.persistence.base import ModelBase, SessionType
from freqtrade.util import dt_now


logger = logging.getLogger(__name__)


class _CustomData(ModelBase):
    """
    CustomData database model
    Keeps records of metadata as key/value store
    for trades or global persistent values
    One to many relationship with Trades:
    - One trade can have many metadata entries
    - One metadata entry can only be associated with one Trade
    """
    __tablename__ = 'trade_custom_data'
    __allow_unmapped__ = True
    session: ClassVar[SessionType]

    # Uniqueness is ensured over the combination of trade id and key.
    __table_args__ = (UniqueConstraint('ft_trade_id', 'cd_key', name="_trade_id_cd_key"),)

    id = mapped_column(Integer, primary_key=True)
    ft_trade_id = mapped_column(Integer, ForeignKey('trades.id'), index=True)

    trade = relationship("Trade", back_populates="custom_data")

    cd_key: Mapped[str] = mapped_column(String(255), nullable=False)
    cd_type: Mapped[str] = mapped_column(String(25), nullable=False)
    cd_value: Mapped[str] = mapped_column(Text, nullable=False)
    created_at: Mapped[datetime] = mapped_column(DateTime, nullable=False, default=dt_now)
    updated_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)

    # Empty container value - not persisted, but filled with cd_value on query
    value: Any = None

    def __repr__(self):
        create_time = (self.created_at.strftime(DATETIME_PRINT_FORMAT)
                       if self.created_at is not None else None)
        update_time = (self.updated_at.strftime(DATETIME_PRINT_FORMAT)
                       if self.updated_at is not None else None)
        return (f'CustomData(id={self.id}, key={self.cd_key}, type={self.cd_type}, ' +
                f'value={self.cd_value}, trade_id={self.ft_trade_id}, created={create_time}, ' +
                f'updated={update_time})')

    @classmethod
    def query_cd(cls, key: Optional[str] = None,
                 trade_id: Optional[int] = None) -> Sequence['_CustomData']:
        """
        Get all CustomData, if trade_id is not specified
        return will be for generic values not tied to a trade
        :param trade_id: id of the Trade
        """
        filters = []
        if trade_id is not None:
            filters.append(_CustomData.ft_trade_id == trade_id)
        if key is not None:
            filters.append(_CustomData.cd_key.ilike(key))

        return _CustomData.session.scalars(select(_CustomData).filter(*filters)).all()


class CustomDataWrapper:
    """
    CustomData middleware class
    Abstracts the database layer away so it becomes optional - which will be necessary to support
    backtesting and hyperopt in the future.
    """

    use_db = True
    custom_data: List[_CustomData] = []
    unserialized_types = ['bool', 'float', 'int', 'str']

    @staticmethod
    def _convert_custom_data(data: _CustomData) -> _CustomData:
        if data.cd_type in CustomDataWrapper.unserialized_types:
            data.value = data.cd_value
            if data.cd_type == 'bool':
                data.value = data.cd_value.lower() == 'true'
            elif data.cd_type == 'int':
                data.value = int(data.cd_value)
            elif data.cd_type == 'float':
                data.value = float(data.cd_value)
        else:
            data.value = json.loads(data.cd_value)
        return data

    @staticmethod
    def reset_custom_data() -> None:
        """
        Resets all key-value pairs. Only active for backtesting mode.
        """
        if not CustomDataWrapper.use_db:
            CustomDataWrapper.custom_data = []

    @staticmethod
    def delete_custom_data(trade_id: int) -> None:
        _CustomData.session.query(_CustomData).filter(_CustomData.ft_trade_id == trade_id).delete()
        _CustomData.session.commit()

    @staticmethod
    def get_custom_data(*, trade_id: int, key: Optional[str] = None) -> List[_CustomData]:

        if CustomDataWrapper.use_db:
            filters = [
                _CustomData.ft_trade_id == trade_id,
            ]
            if key is not None:
                filters.append(_CustomData.cd_key.ilike(key))
            filtered_custom_data = _CustomData.session.scalars(select(_CustomData).filter(
                *filters)).all()

        else:
            filtered_custom_data = [
                data_entry for data_entry in CustomDataWrapper.custom_data
                if (data_entry.ft_trade_id == trade_id)
            ]
            if key is not None:
                filtered_custom_data = [
                    data_entry for data_entry in filtered_custom_data
                    if (data_entry.cd_key.casefold() == key.casefold())
                ]
        return [CustomDataWrapper._convert_custom_data(d) for d in filtered_custom_data]

    @staticmethod
    def set_custom_data(trade_id: int, key: str, value: Any) -> None:

        value_type = type(value).__name__

        if value_type not in CustomDataWrapper.unserialized_types:
            try:
                value_db = json.dumps(value)
            except TypeError as e:
                logger.warning(f"could not serialize {key} value due to {e}")
                return
        else:
            value_db = str(value)

        if trade_id is None:
            trade_id = 0

        custom_data = CustomDataWrapper.get_custom_data(trade_id=trade_id, key=key)
        if custom_data:
            data_entry = custom_data[0]
            data_entry.cd_value = value_db
            data_entry.updated_at = dt_now()
        else:
            data_entry = _CustomData(
                ft_trade_id=trade_id,
                cd_key=key,
                cd_type=value_type,
                cd_value=value_db,
                created_at=dt_now(),
            )
        data_entry.value = value

        if CustomDataWrapper.use_db and value_db is not None:
            _CustomData.session.add(data_entry)
            _CustomData.session.commit()
        else:
            if not custom_data:
                CustomDataWrapper.custom_data.append(data_entry)
            # Existing data will have been updated in place.
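`CustomDataWrapper` tags each stored value with its Python type name and JSON-encodes only types outside `['bool', 'float', 'int', 'str']`; `_convert_custom_data` reverses the mapping. That encode/decode scheme can be exercised standalone (a sketch of the logic, not the class itself):

```python
import json

UNSERIALIZED_TYPES = ['bool', 'float', 'int', 'str']

def encode(value):
    # Simple types are stored via str(); everything else as JSON.
    cd_type = type(value).__name__
    cd_value = str(value) if cd_type in UNSERIALIZED_TYPES else json.dumps(value)
    return cd_type, cd_value

def decode(cd_type, cd_value):
    # Mirror of _convert_custom_data's type dispatch.
    if cd_type == 'bool':
        return cd_value.lower() == 'true'
    if cd_type == 'int':
        return int(cd_value)
    if cd_type == 'float':
        return float(cd_value)
    if cd_type == 'str':
        return cd_value
    return json.loads(cd_value)

for original in [True, 42, 3.14, 'breakout', {'a': [1, 2]}]:
    assert decode(*encode(original)) == original
```

Storing the type name alongside the value is what lets `get_custom_data` hand back an `int` as an `int` rather than the string the `Text` column actually holds.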
@@ -13,6 +13,7 @@ from sqlalchemy.pool import StaticPool

from freqtrade.exceptions import OperationalException
from freqtrade.persistence.base import ModelBase
+from freqtrade.persistence.custom_data import _CustomData
from freqtrade.persistence.key_value_store import _KeyValueStoreModel
from freqtrade.persistence.migrations import check_migrate
from freqtrade.persistence.pairlock import PairLock
@@ -78,6 +79,8 @@ def init_db(db_url: str) -> None:
    Order.session = Trade.session
    PairLock.session = Trade.session
    _KeyValueStoreModel.session = Trade.session
+    _CustomData.session = scoped_session(sessionmaker(bind=engine, autoflush=True),
+                                         scopefunc=get_request_or_thread_id)

    previous_tables = inspect(engine).get_table_names()
    ModelBase.metadata.create_all(engine)
@@ -23,6 +23,7 @@ from freqtrade.exchange import (ROUND_DOWN, ROUND_UP, amount_to_contract_precisi
from freqtrade.leverage import interest
from freqtrade.misc import safe_value_fallback
from freqtrade.persistence.base import ModelBase, SessionType
+from freqtrade.persistence.custom_data import CustomDataWrapper, _CustomData
from freqtrade.util import FtPrecise, dt_from_ts, dt_now, dt_ts, dt_ts_none
@@ -1214,6 +1215,40 @@ class LocalTrade:
            or (o.ft_is_open is True and o.status is not None)
        ]

+    def set_custom_data(self, key: str, value: Any) -> None:
+        """
+        Set custom data for this trade
+        :param key: key of the custom data
+        :param value: value of the custom data (must be JSON serializable)
+        """
+        CustomDataWrapper.set_custom_data(trade_id=self.id, key=key, value=value)
+
+    def get_custom_data(self, key: str, default: Any = None) -> Any:
+        """
+        Get custom data for this trade
+        :param key: key of the custom data
+        """
+        data = CustomDataWrapper.get_custom_data(trade_id=self.id, key=key)
+        if data:
+            return data[0].value
+        return default
+
+    def get_custom_data_entry(self, key: str) -> Optional[_CustomData]:
+        """
+        Get custom data for this trade
+        :param key: key of the custom data
+        """
+        data = CustomDataWrapper.get_custom_data(trade_id=self.id, key=key)
+        if data:
+            return data[0]
+        return None
+
+    def get_all_custom_data(self) -> List[_CustomData]:
+        """
+        Get all custom data for this trade
+        """
+        return CustomDataWrapper.get_custom_data(trade_id=self.id)
+
    @property
    def nr_of_successful_entries(self) -> int:
        """
@@ -1469,6 +1504,9 @@ class Trade(ModelBase, LocalTrade):
|
||||
orders: Mapped[List[Order]] = relationship(
|
||||
"Order", order_by="Order.id", cascade="all, delete-orphan", lazy="selectin",
|
||||
innerjoin=True) # type: ignore
|
||||
custom_data: Mapped[List[_CustomData]] = relationship(
|
||||
"_CustomData", cascade="all, delete-orphan",
|
||||
lazy="raise")
|
||||
|
||||
exchange: Mapped[str] = mapped_column(String(25), nullable=False) # type: ignore
|
||||
pair: Mapped[str] = mapped_column(String(25), nullable=False, index=True) # type: ignore
|
||||
@@ -1572,6 +1610,8 @@ class Trade(ModelBase, LocalTrade):
|
||||
for order in self.orders:
|
||||
Order.session.delete(order)
|
||||
|
||||
CustomDataWrapper.delete_custom_data(trade_id=self.id)
|
||||
|
||||
Trade.session.delete(self)
|
||||
Trade.commit()
|
||||
|
||||
|
||||
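The `set_custom_data`/`get_custom_data` pair above round-trips JSON-serializable values per trade. A minimal in-memory sketch of that round-trip (the class name and dict store are illustrative; the real methods delegate to `CustomDataWrapper` and the database):

```python
import json
from typing import Any, Dict


class CustomDataSketch:
    """Hypothetical in-memory stand-in for the trade custom-data accessors."""

    def __init__(self) -> None:
        self._store: Dict[str, str] = {}

    def set_custom_data(self, key: str, value: Any) -> None:
        # Values must be JSON serializable, as the docstring above requires.
        self._store[key] = json.dumps(value)

    def get_custom_data(self, key: str, default: Any = None) -> Any:
        # Missing keys fall back to the caller-supplied default.
        if key in self._store:
            return json.loads(self._store[key])
        return default


trade_data = CustomDataSketch()
trade_data.set_custom_data('test_dict', {'test': 'dict'})
print(trade_data.get_custom_data('test_dict'))   # → {'test': 'dict'}
print(trade_data.get_custom_data('missing', 0))  # → 0
```

Because values pass through JSON, types like `int`, `float`, `bool`, and `dict` survive the round-trip, which is exactly what the new tests below assert.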
@@ -1,4 +1,5 @@

from freqtrade.persistence.custom_data import CustomDataWrapper
from freqtrade.persistence.pairlock_middleware import PairLocks
from freqtrade.persistence.trade_model import Trade

@@ -11,6 +12,7 @@ def disable_database_use(timeframe: str) -> None:
    PairLocks.use_db = False
    PairLocks.timeframe = timeframe
    Trade.use_db = False
    CustomDataWrapper.use_db = False


def enable_database_use() -> None:
@@ -20,6 +22,7 @@ def enable_database_use() -> None:
    PairLocks.use_db = True
    PairLocks.timeframe = ''
    Trade.use_db = True
    CustomDataWrapper.use_db = True


class FtNoDBContext:

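`disable_database_use`/`enable_database_use` simply flip class-level `use_db` flags, and `FtNoDBContext` presumably wraps them as a context manager. A hypothetical sketch of that idea (names and attributes are illustrative, not the real implementation):

```python
class NoDBContextSketch:
    """Hypothetical sketch of the FtNoDBContext idea: flip the use_db
    flag off on entry and back on on exit."""

    def __init__(self, timeframe: str = '') -> None:
        self.timeframe = timeframe
        self.use_db = True

    def __enter__(self) -> "NoDBContextSketch":
        # Mirrors disable_database_use(timeframe)
        self.use_db = False
        return self

    def __exit__(self, exc_type, exc, tb) -> None:
        # Mirrors enable_database_use(), also runs if an exception occurred
        self.use_db = True


ctx = NoDBContextSketch('5m')
with ctx:
    assert ctx.use_db is False
assert ctx.use_db is True
```

Using a context manager guarantees the flags are restored even when a backtest aborts with an exception.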
@@ -559,3 +559,7 @@ class SysInfo(BaseModel):
class Health(BaseModel):
    last_process: Optional[datetime] = None
    last_process_ts: Optional[int] = None
    bot_start: Optional[datetime] = None
    bot_start_ts: Optional[int] = None
    bot_startup: Optional[datetime] = None
    bot_startup_ts: Optional[int] = None

@@ -291,6 +291,10 @@ class RPC:
                    profit_str += f" ({fiat_profit:.2f})"
                    fiat_profit_sum = fiat_profit if isnan(fiat_profit_sum) \
                        else fiat_profit_sum + fiat_profit
                else:
                    profit_str += f" ({trade_profit:.2f})"
                    fiat_profit_sum = trade_profit if isnan(fiat_profit_sum) \
                        else fiat_profit_sum + trade_profit

            active_attempt_side_symbols = [
                '*' if (oo and oo.ft_order_side == trade.entry_side) else '**'
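The `fiat_profit_sum` expressions above use NaN as a "no total yet" sentinel: the first real profit replaces it, later ones accumulate. A small sketch of that accumulation rule (the helper name is illustrative):

```python
from math import isnan, nan


def add_profit(fiat_profit_sum: float, profit: float) -> float:
    # NaN marks "no total yet"; the first real value replaces it,
    # later values accumulate normally (mirrors the RPC logic above).
    return profit if isnan(fiat_profit_sum) else fiat_profit_sum + profit


total = nan
for p in (1.5, -0.5):
    total = add_profit(total, p)
print(total)  # → 1.0
```

Starting from NaN rather than 0 lets the caller distinguish "no fiat conversion available" from "profits summed to zero", which the updated tests check with `isnan`/formatted comparisons.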
@@ -317,6 +321,8 @@ class RPC:
            profitcol = "Profit"
            if self._fiat_converter:
                profitcol += " (" + fiat_display_currency + ")"
            else:
                profitcol += " (" + stake_currency + ")"

            columns = [
                'ID L/S' if nonspot else 'ID',
@@ -927,6 +933,7 @@ class RPC:
            is_short=is_short,
            enter_tag=enter_tag,
            leverage_=leverage,
            mode='pos_adjust' if trade else 'initial'
        ):
            Trade.commit()
            trade = Trade.get_trades([Trade.is_open.is_(True), Trade.pair == pair]).first()
@@ -999,6 +1006,32 @@ class RPC:
            'cancel_order_count': c_count,
        }

    def _rpc_list_custom_data(self, trade_id: int, key: Optional[str]) -> List[Dict[str, Any]]:
        # Query for trade
        trade = Trade.get_trades(trade_filter=[Trade.id == trade_id]).first()
        if trade is None:
            return []
        # Query custom_data - use the entry accessor so the serialization
        # below sees a _CustomData object rather than its bare value.
        custom_data = []
        if key:
            data = trade.get_custom_data_entry(key=key)
            if data:
                custom_data = [data]
        else:
            custom_data = trade.get_all_custom_data()
        return [
            {
                'id': data_entry.id,
                'ft_trade_id': data_entry.ft_trade_id,
                'cd_key': data_entry.cd_key,
                'cd_type': data_entry.cd_type,
                'cd_value': data_entry.cd_value,
                'created_at': data_entry.created_at,
                'updated_at': data_entry.updated_at
            }
            for data_entry in custom_data
        ]

    def _rpc_performance(self) -> List[Dict[str, Any]]:
        """
        Handler for performance.
@@ -1333,19 +1366,40 @@ class RPC:

    def health(self) -> Dict[str, Optional[Union[str, int]]]:
        last_p = self._freqtrade.last_process
        if last_p is None:
            return {
                "last_process": None,
                "last_process_loc": None,
                "last_process_ts": None,
            }

        return {
            "last_process": str(last_p),
            "last_process_loc": format_date(last_p.astimezone(tzlocal())),
            "last_process_ts": int(last_p.timestamp()),
        res: Dict[str, Union[None, str, int]] = {
            "last_process": None,
            "last_process_loc": None,
            "last_process_ts": None,
            "bot_start": None,
            "bot_start_loc": None,
            "bot_start_ts": None,
            "bot_startup": None,
            "bot_startup_loc": None,
            "bot_startup_ts": None,
        }

        if last_p is not None:
            res.update({
                "last_process": str(last_p),
                "last_process_loc": format_date(last_p.astimezone(tzlocal())),
                "last_process_ts": int(last_p.timestamp()),
            })

        if (bot_start := KeyValueStore.get_datetime_value(KeyStoreKeys.BOT_START_TIME)):
            res.update({
                "bot_start": str(bot_start),
                "bot_start_loc": format_date(bot_start.astimezone(tzlocal())),
                "bot_start_ts": int(bot_start.timestamp()),
            })
        if (bot_startup := KeyValueStore.get_datetime_value(KeyStoreKeys.STARTUP_TIME)):
            res.update({
                "bot_startup": str(bot_startup),
                "bot_startup_loc": format_date(bot_startup.astimezone(tzlocal())),
                "bot_startup_ts": int(bot_startup.timestamp()),
            })

        return res

    def _update_market_direction(self, direction: MarketDirection) -> None:
        self._freqtrade.strategy.market_direction = direction

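The reworked `health()` builds an all-None payload first and fills in only the fields whose source value exists, so API consumers always see a stable set of keys. A reduced sketch of that pattern (function name and key subset are illustrative):

```python
from datetime import datetime, timezone
from typing import Dict, Optional, Union


def health_sketch(last_p: Optional[datetime]) -> Dict[str, Union[None, str, int]]:
    # Start from an all-None payload so every key is always present,
    # then fill in only the fields whose source value exists.
    res: Dict[str, Union[None, str, int]] = {
        "last_process": None,
        "last_process_ts": None,
    }
    if last_p is not None:
        res.update({
            "last_process": str(last_p),
            "last_process_ts": int(last_p.timestamp()),
        })
    return res


ts = datetime(2024, 3, 1, tzinfo=timezone.utc)
print(health_sketch(None)["last_process_ts"])  # → None
print(health_sketch(ts)["last_process_ts"] == int(ts.timestamp()))  # → True
```

This replaces the earlier early-return shape, which only ever exposed the `last_process*` keys and could not carry the new `bot_start*`/`bot_startup*` fields.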
@@ -33,7 +33,7 @@ from freqtrade.misc import chunks, plural
from freqtrade.persistence import Trade
from freqtrade.rpc import RPC, RPCException, RPCHandler
from freqtrade.rpc.rpc_types import RPCEntryMsg, RPCExitMsg, RPCOrderMsg, RPCSendMsg
from freqtrade.util import dt_humanize, fmt_coin, round_value
from freqtrade.util import dt_humanize, fmt_coin, format_date, round_value


MAX_MESSAGE_LENGTH = MessageLimit.MAX_TEXT_LENGTH
@@ -243,6 +243,7 @@ class Telegram(RPCHandler):
            CommandHandler('version', self._version),
            CommandHandler('marketdir', self._changemarketdir),
            CommandHandler('order', self._order),
            CommandHandler('list_custom_data', self._list_custom_data),
        ]
        callbacks = [
            CallbackQueryHandler(self._status_table, pattern='update_status_table'),
@@ -1667,6 +1668,8 @@ class Telegram(RPCHandler):
            "*/marketdir [long | short | even | none]:* `Updates the user managed variable "
            "that represents the current market direction. If no direction is provided `"
            "`the currently set market direction will be output.` \n"
            "*/list_custom_data <trade_id> <key>:* `List custom_data for Trade ID & Key combo.`\n"
            "`If no Key is supplied it will list all key-value pairs found for that Trade ID.`"

            "_Statistics_\n"
            "------------\n"
@@ -1689,7 +1692,7 @@ class Telegram(RPCHandler):
            "*/stats:* `Shows Wins / losses by Sell reason as well as "
            "Avg. holding durations for buys and sells.`\n"
            "*/help:* `This help message`\n"
            "*/version:* `Show version`"
            "*/version:* `Show version`\n"
        )

        await self._send_msg(message, parse_mode=ParseMode.MARKDOWN)
@@ -1701,7 +1704,9 @@ class Telegram(RPCHandler):
        Shows the last process timestamp
        """
        health = self._rpc.health()
        message = f"Last process: `{health['last_process_loc']}`"
        message = f"Last process: `{health['last_process_loc']}`\n"
        message += f"Initial bot start: `{health['bot_start_loc']}`\n"
        message += f"Last bot restart: `{health['bot_startup_loc']}`"
        await self._send_msg(message)

    @authorized_only
@@ -1766,6 +1771,53 @@ class Telegram(RPCHandler):
            f"*Current state:* `{val['state']}`"
        )

    @authorized_only
    async def _list_custom_data(self, update: Update, context: CallbackContext) -> None:
        """
        Handler for /list_custom_data <id> <key>.
        List custom_data for specified trade (and key if supplied).
        :param bot: telegram bot
        :param update: message update
        :return: None
        """
        try:
            if not context.args or len(context.args) == 0:
                raise RPCException("Trade-id not set.")
            trade_id = int(context.args[0])
            key = None if len(context.args) < 2 else str(context.args[1])

            results = self._rpc._rpc_list_custom_data(trade_id, key)
            messages = []
            if len(results) > 0:
                messages.append(
                    'Found custom-data entr' + ('ies: ' if len(results) > 1 else 'y: ')
                )
                for result in results:
                    lines = [
                        f"*Key:* `{result['cd_key']}`",
                        f"*ID:* `{result['id']}`",
                        f"*Trade ID:* `{result['ft_trade_id']}`",
                        f"*Type:* `{result['cd_type']}`",
                        f"*Value:* `{result['cd_value']}`",
                        f"*Create Date:* `{format_date(result['created_at'])}`",
                        f"*Update Date:* `{format_date(result['updated_at'])}`"
                    ]
                    # Filter empty lines using list-comprehension
                    messages.append("\n".join([line for line in lines if line]))
                for msg in messages:
                    if len(msg) > MAX_MESSAGE_LENGTH:
                        msg = "Message dropped because length exceeds "
                        msg += f"maximum allowed characters: {MAX_MESSAGE_LENGTH}"
                        logger.warning(msg)
                    await self._send_msg(msg)
            else:
                message = f"Didn't find any custom-data entries for Trade ID: `{trade_id}`"
                message += f" and Key: `{key}`." if key is not None else ""
                await self._send_msg(message)

        except RPCException as e:
            await self._send_msg(str(e))

    async def _update_msg(self, query: CallbackQuery, msg: str, callback_path: str = "",
                          reload_able: bool = False, parse_mode: str = ParseMode.MARKDOWN) -> None:
        if reload_able:

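The handler above drops oversized messages rather than truncating them mid-Markdown. A reduced sketch of that guard (the 4096 limit is an assumption standing in for `MessageLimit.MAX_TEXT_LENGTH`):

```python
MAX_MESSAGE_LENGTH = 4096  # assumed value of MessageLimit.MAX_TEXT_LENGTH


def guard_length(msg: str, limit: int = MAX_MESSAGE_LENGTH) -> str:
    # Mirrors the handler above: an oversized message is replaced by a
    # short notice instead of being truncated mid-Markdown, which could
    # otherwise leave unbalanced backticks in the Telegram output.
    if len(msg) > limit:
        return ("Message dropped because length exceeds "
                f"maximum allowed characters: {limit}")
    return msg


print(guard_length("short"))  # → short
print(guard_length("x" * 5000))  # → Message dropped because length exceeds maximum allowed characters: 4096
```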
@@ -35,7 +35,7 @@
"project_root = \"somedir/freqtrade\"\n",
"i=0\n",
"try:\n",
"    os.chdirdir(project_root)\n",
"    os.chdir(project_root)\n",
"    assert Path('LICENSE').is_file()\n",
"except:\n",
"    while i<4 and (not Path('LICENSE').is_file()):\n",
@@ -181,7 +181,7 @@
"\n",
"# if backtest_dir points to a directory, it'll automatically load the last backtest file.\n",
"backtest_dir = config[\"user_data_dir\"] / \"backtest_results\"\n",
"# backtest_dir can also point to a specific file \n",
"# backtest_dir can also point to a specific file\n",
"# backtest_dir = config[\"user_data_dir\"] / \"backtest_results/backtest-result-2020-07-01_20-04-22.json\""
]
},

@@ -8,10 +8,10 @@

coveralls==3.3.1
ruff==0.3.0
mypy==1.8.0
mypy==1.9.0
pre-commit==3.6.2
pytest==8.1.0
pytest-asyncio==0.23.5
pytest==8.1.1
pytest-asyncio==0.23.5.post1
pytest-cov==4.1.0
pytest-mock==3.12.0
pytest-random-order==1.1.1
@@ -21,11 +21,11 @@ isort==5.13.2
time-machine==2.14.0

# Convert jupyter notebooks to markdown documents
nbconvert==7.16.1
nbconvert==7.16.2

# mypy types
types-cachetools==5.3.0.7
types-filelock==3.2.7
types-requests==2.31.0.20240218
types-requests==2.31.0.20240311
types-tabulate==0.9.0.20240106
types-python-dateutil==2.8.19.20240106
types-python-dateutil==2.8.19.20240311

@@ -5,7 +5,7 @@
# Required for freqai
scikit-learn==1.4.1.post1
joblib==1.3.2
catboost==1.2.2; 'arm' not in platform_machine and python_version < '3.12'
catboost==1.2.3; 'arm' not in platform_machine
lightgbm==4.3.0
xgboost==2.0.3
tensorboard==2.16.2

@@ -2,11 +2,11 @@ numpy==1.26.4
pandas==2.2.1
pandas-ta==0.3.14b

ccxt==4.2.58
ccxt==4.2.66
cryptography==42.0.5
aiohttp==3.9.3
SQLAlchemy==2.0.27
python-telegram-bot==20.8
python-telegram-bot==21.0.1
# can't be hard-pinned due to telegram-bot pinning httpx with ~
httpx>=0.24.1
arrow==1.3.0
@@ -60,4 +60,4 @@ websockets==12.0
janus==1.0.0

ast-comments==1.2.1
packaging==23.2
packaging==24.0

2
setup.sh
@@ -161,7 +161,7 @@ function install_macos() {
        /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
    fi

    brew install gettext
    brew install gettext libomp

    #Gets number after decimal in python version
    version=$(egrep -o 3.\[0-9\]+ <<< $PYTHON | sed 's/3.//g')

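The `version=...` line above pulls the minor Python version out of a string like `python3.11`. A small sketch with conventional quoting (the original uses unusual backslash escaping; this variant is an assumption, not the script's exact form):

```shell
PYTHON="python3.11"
# Keep only the "3.NN" part, then strip the "3." prefix, leaving "11"
version=$(egrep -o '3\.[0-9]+' <<< "$PYTHON" | sed 's/3\.//g')
echo "$version"  # → 11
```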
@@ -25,10 +25,15 @@ def is_mac() -> bool:
    return "Darwin" in machine


def is_arm() -> bool:
    machine = platform.machine()
    return "arm" in machine or "aarch64" in machine


@pytest.fixture(autouse=True)
def patch_torch_initlogs(mocker) -> None:

    if is_mac():
    if is_mac() and not is_arm():
        # Mock torch import completely
        import sys
        import types

@@ -1,5 +1,4 @@
import logging
import platform
import shutil
from pathlib import Path
from unittest.mock import MagicMock
@@ -15,19 +14,14 @@ from freqtrade.optimize.backtesting import Backtesting
from freqtrade.persistence import Trade
from freqtrade.plugins.pairlistmanager import PairListManager
from tests.conftest import EXMS, create_mock_trades, get_patched_exchange, log_has_re
from tests.freqai.conftest import (get_patched_freqai_strategy, is_mac, is_py12, make_rl_config,
                                   mock_pytorch_mlp_model_training_parameters)


def is_arm() -> bool:
    machine = platform.machine()
    return "arm" in machine or "aarch64" in machine
from tests.freqai.conftest import (get_patched_freqai_strategy, is_arm, is_mac, is_py12,
                                   make_rl_config, mock_pytorch_mlp_model_training_parameters)


def can_run_model(model: str) -> None:
    is_pytorch_model = 'Reinforcement' in model or 'PyTorch' in model

    if is_py12() and ("Catboost" in model or is_pytorch_model):
    if is_py12() and is_pytorch_model:
        pytest.skip("Model not supported on python 3.12 yet.")

    if is_arm() and "Catboost" in model:
@@ -243,7 +237,7 @@ def test_extract_data_and_train_model_Classifiers(mocker, freqai_conf, model):
def test_start_backtesting(mocker, freqai_conf, model, num_files, strat, caplog):
    can_run_model(model)
    test_tb = True
    if is_mac():
    if is_mac() and not is_arm():
        test_tb = False

    freqai_conf.get("freqai", {}).update({"save_backtest_models": True})

@@ -2099,6 +2099,7 @@ def test_Trade_object_idem():
        'get_mix_tag_performance',
        'get_trading_volume',
        'validate_string_len',
        'custom_data'
    )
    EXCLUDES2 = ('trades', 'trades_open', 'bt_trades_open_pp', 'bt_open_open_trade_count',
                 'total_profit', 'from_json',)

160
tests/persistence/test_trade_custom_data.py
Normal file
@@ -0,0 +1,160 @@
from copy import deepcopy
from unittest.mock import MagicMock

import pytest

from freqtrade.data.history.history_utils import get_timerange
from freqtrade.optimize.backtesting import Backtesting
from freqtrade.persistence import Trade, disable_database_use, enable_database_use
from freqtrade.persistence.custom_data import CustomDataWrapper
from tests.conftest import (EXMS, create_mock_trades_usdt, generate_test_data,
                            get_patched_freqtradebot, patch_exchange)


@pytest.mark.usefixtures("init_persistence")
@pytest.mark.parametrize("use_db", [True, False])
def test_trade_custom_data(fee, use_db):
    if not use_db:
        disable_database_use('5m')
        Trade.reset_trades()
        CustomDataWrapper.reset_custom_data()

    create_mock_trades_usdt(fee, use_db=use_db)

    trade1 = Trade.get_trades_proxy()[0]
    if not use_db:
        trade1.id = 1

    assert trade1.get_all_custom_data() == []
    trade1.set_custom_data('test_str', 'test_value')
    trade1.set_custom_data('test_int', 1)
    trade1.set_custom_data('test_float', 1.55)
    trade1.set_custom_data('test_bool', True)
    trade1.set_custom_data('test_dict', {'test': 'dict'})

    assert len(trade1.get_all_custom_data()) == 5
    assert trade1.get_custom_data('test_str') == 'test_value'
    trade1.set_custom_data('test_str', 'test_value_updated')
    assert trade1.get_custom_data('test_str') == 'test_value_updated'

    assert trade1.get_custom_data('test_int') == 1
    assert isinstance(trade1.get_custom_data('test_int'), int)

    assert trade1.get_custom_data('test_float') == 1.55
    assert isinstance(trade1.get_custom_data('test_float'), float)

    assert trade1.get_custom_data('test_bool') is True
    assert isinstance(trade1.get_custom_data('test_bool'), bool)

    assert trade1.get_custom_data('test_dict') == {'test': 'dict'}
    assert isinstance(trade1.get_custom_data('test_dict'), dict)
    if not use_db:
        enable_database_use()


def test_trade_custom_data_strategy_compat(mocker, default_conf_usdt, fee):

    mocker.patch(f'{EXMS}.get_rate', return_value=0.50)
    mocker.patch('freqtrade.freqtradebot.FreqtradeBot.get_real_amount', return_value=None)
    default_conf_usdt["minimal_roi"] = {"0": 100}

    freqtrade = get_patched_freqtradebot(mocker, default_conf_usdt)
    create_mock_trades_usdt(fee)

    trade1 = Trade.get_trades_proxy(pair='ADA/USDT')[0]
    trade1.set_custom_data('test_str', 'test_value')
    trade1.set_custom_data('test_int', 1)

    def custom_exit(pair, trade, **kwargs):

        if pair == 'ADA/USDT':
            custom_val = trade.get_custom_data('test_str')
            custom_val_i = trade.get_custom_data('test_int')

            return f"{custom_val}_{custom_val_i}"

    freqtrade.strategy.custom_exit = custom_exit
    ff_spy = mocker.spy(freqtrade.strategy, 'custom_exit')
    trades = Trade.get_open_trades()
    freqtrade.exit_positions(trades)
    Trade.commit()

    trade_after = Trade.get_trades_proxy(pair='ADA/USDT')[0]
    assert trade_after.get_custom_data('test_str') == 'test_value'
    assert trade_after.get_custom_data('test_int') == 1
    # 2 open pairs eligible for exit
    assert ff_spy.call_count == 2

    assert trade_after.exit_reason == 'test_value_1'


def test_trade_custom_data_strategy_backtest_compat(mocker, default_conf_usdt, fee):

    mocker.patch(f'{EXMS}.get_fee', fee)
    mocker.patch(f"{EXMS}.get_min_pair_stake_amount", return_value=10)
    mocker.patch(f"{EXMS}.get_max_pair_stake_amount", return_value=float('inf'))
    mocker.patch(f"{EXMS}.get_max_leverage", return_value=10)
    mocker.patch(f"{EXMS}.get_maintenance_ratio_and_amt", return_value=(0.1, 0.1))
    mocker.patch('freqtrade.optimize.backtesting.Backtesting._run_funding_fees')

    patch_exchange(mocker)
    default_conf_usdt.update({
        "stake_amount": 100.0,
        "max_open_trades": 2,
        "dry_run_wallet": 1000.0,
        "strategy": "StrategyTestV3",
        "trading_mode": "futures",
        "margin_mode": "isolated",
        "stoploss": -2,
        "minimal_roi": {"0": 100},
    })
    default_conf_usdt['pairlists'] = [{'method': 'StaticPairList', 'allow_inactive': True}]
    backtesting = Backtesting(default_conf_usdt)

    df = generate_test_data(default_conf_usdt['timeframe'], 100, '2022-01-01 00:00:00+00:00')

    pair_exp = 'XRP/USDT:USDT'

    def custom_exit(pair, trade, **kwargs):
        custom_val = trade.get_custom_data('test_str')
        custom_val_i = trade.get_custom_data('test_int', 0)

        if pair == pair_exp:
            trade.set_custom_data('test_str', 'test_value')
            trade.set_custom_data('test_int', custom_val_i + 1)

        if custom_val_i >= 2:
            return f"{custom_val}_{custom_val_i}"

    backtesting._set_strategy(backtesting.strategylist[0])
    processed = backtesting.strategy.advise_all_indicators({
        pair_exp: df,
        'BTC/USDT:USDT': df,
    })

    def fun(dataframe, *args, **kwargs):
        dataframe.loc[dataframe.index == 50, 'enter_long'] = 1
        return dataframe

    backtesting.strategy.advise_entry = fun
    backtesting.strategy.leverage = MagicMock(return_value=1)
    backtesting.strategy.custom_exit = custom_exit
    ff_spy = mocker.spy(backtesting.strategy, 'custom_exit')

    min_date, max_date = get_timerange(processed)

    result = backtesting.backtest(
        processed=deepcopy(processed),
        start_date=min_date,
        end_date=max_date,
    )
    results = result['results']
    assert not results.empty
    assert len(results) == 2
    assert results['pair'][0] == pair_exp
    assert results['pair'][1] == 'BTC/USDT:USDT'
    assert results['exit_reason'][0] == 'test_value_2'
    assert results['exit_reason'][1] == 'exit_signal'

    assert ff_spy.call_count == 7
    Backtesting.cleanup()
@@ -10,6 +10,7 @@ from freqtrade.edge import PairInfo
from freqtrade.enums import SignalDirection, State, TradingMode
from freqtrade.exceptions import ExchangeError, InvalidOrderException, TemporaryError
from freqtrade.persistence import Order, Trade
from freqtrade.persistence.key_value_store import set_startup_time
from freqtrade.persistence.pairlock_middleware import PairLocks
from freqtrade.rpc import RPC, RPCException
from freqtrade.rpc.fiat_convert import CryptoToFiatConverter
@@ -223,8 +224,8 @@ def test_rpc_status_table(default_conf, ticker, fee, mocker) -> None:
    assert "Pair" in headers
    assert 'instantly' == result[0][2]
    assert 'ETH/BTC' in result[0][1]
    assert '0.00' == result[0][3]
    assert isnan(fiat_profit_sum)
    assert '0.00 (0.00)' == result[0][3]
    assert '0.00' == f'{fiat_profit_sum:.2f}'

    mocker.patch(f'{EXMS}._dry_is_price_crossed', return_value=True)
    freqtradebot.process()
@@ -234,8 +235,8 @@ def test_rpc_status_table(default_conf, ticker, fee, mocker) -> None:
    assert "Pair" in headers
    assert 'instantly' == result[0][2]
    assert 'ETH/BTC' in result[0][1]
    assert '-0.41%' == result[0][3]
    assert isnan(fiat_profit_sum)
    assert '-0.41% (-0.00)' == result[0][3]
    assert '-0.00' == f'{fiat_profit_sum:.2f}'

    # Test with fiat convert
    rpc._fiat_converter = CryptoToFiatConverter()
@@ -1298,6 +1299,7 @@ def test_rpc_health(mocker, default_conf) -> None:
    mocker.patch('freqtrade.rpc.telegram.Telegram', MagicMock())

    freqtradebot = get_patched_freqtradebot(mocker, default_conf)
    set_startup_time()
    rpc = RPC(freqtradebot)
    result = rpc.health()
    assert result['last_process'] is None

@@ -150,7 +150,7 @@ def test_telegram_init(default_conf, mocker, caplog) -> None:
        "['stopbuy', 'stopentry'], ['whitelist'], ['blacklist'], "
        "['bl_delete', 'blacklist_delete'], "
        "['logs'], ['edge'], ['health'], ['help'], ['version'], ['marketdir'], "
        "['order']]")
        "['order'], ['list_custom_data']]")

    assert log_has(message_str, caplog)

@@ -2657,3 +2657,49 @@ async def test_change_market_direction(default_conf, mocker, update) -> None:
    context.args = ["invalid"]
    await telegram._changemarketdir(update, context)
    assert telegram._rpc._freqtrade.strategy.market_direction == MarketDirection.LONG


async def test_telegram_list_custom_data(default_conf_usdt, update, ticker, fee, mocker) -> None:

    mocker.patch.multiple(
        EXMS,
        fetch_ticker=ticker,
        get_fee=fee,
    )
    telegram, _freqtradebot, msg_mock = get_telegram_testobject(mocker, default_conf_usdt)

    # Create some test data
    create_mock_trades_usdt(fee)
    # No trade id
    context = MagicMock()
    await telegram._list_custom_data(update=update, context=context)
    assert msg_mock.call_count == 1
    assert 'Trade-id not set.' in msg_mock.call_args_list[0][0][0]
    msg_mock.reset_mock()

    #
    context.args = ['1']
    await telegram._list_custom_data(update=update, context=context)
    assert msg_mock.call_count == 1
    assert (
        "Didn't find any custom-data entries for Trade ID: `1`" in msg_mock.call_args_list[0][0][0]
    )
    msg_mock.reset_mock()

    # Add some custom data
    trade1 = Trade.get_trades_proxy()[0]
    trade1.set_custom_data('test_int', 1)
    trade1.set_custom_data('test_dict', {'test': 'dict'})
    Trade.commit()
    context.args = [f"{trade1.id}"]
    await telegram._list_custom_data(update=update, context=context)
    assert msg_mock.call_count == 3
    assert "Found custom-data entries: " in msg_mock.call_args_list[0][0][0]
    assert (
        "*Key:* `test_int`\n*ID:* `1`\n*Trade ID:* `1`\n*Type:* `int`\n"
        "*Value:* `1`\n*Create Date:*") in msg_mock.call_args_list[1][0][0]
    assert (
        '*Key:* `test_dict`\n*ID:* `2`\n*Trade ID:* `1`\n*Type:* `dict`\n'
        '*Value:* `{"test": "dict"}`\n*Create Date:* `') in msg_mock.call_args_list[2][0][0]

    msg_mock.reset_mock()