Merge remote-tracking branch 'upstream/develop' into feature/fetch-public-trades

This commit is contained in:
Joe Schr
2024-03-04 16:27:53 +01:00
40 changed files with 490 additions and 421 deletions

View File

@@ -482,12 +482,12 @@ jobs:
path: dist
- name: Publish to PyPI (Test)
uses: pypa/gh-action-pypi-publish@v1.8.11
uses: pypa/gh-action-pypi-publish@v1.8.12
with:
repository-url: https://test.pypi.org/legacy/
- name: Publish to PyPI
uses: pypa/gh-action-pypi-publish@v1.8.11
uses: pypa/gh-action-pypi-publish@v1.8.12
deploy-docker:

View File

@@ -450,6 +450,8 @@ If the trading range over the last 10 days is <1% or >99%, remove the pair from
]
```
Adding `"sort_direction": "asc"` or `"sort_direction": "desc"` enables sorting for this pairlist.
!!! Tip
This filter can be used to automatically remove stablecoin pairs, which have a very low trading range and are therefore extremely difficult to trade profitably.
Additionally, it can be used to automatically remove pairs with extremely high or low variance over a given amount of time.
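For illustration, a `RangeStabilityFilter` entry with sorting enabled might look like the following (shown here as a Python dict; the parameter values are illustrative assumptions, not recommendations):

```python
# Illustrative RangeStabilityFilter pairlist entry (values are assumptions).
# "sort_direction" sorts the remaining pairs by their rate of change.
range_stability_filter = {
    "method": "RangeStabilityFilter",
    "lookback_days": 10,
    "min_rate_of_change": 0.01,
    "refresh_period": 1440,
    "sort_direction": "asc",
}
```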
@@ -460,7 +462,7 @@ Volatility is the degree of historical variation of a pairs over time, it is mea
This filter removes pairs if the average volatility over a `lookback_days` days is below `min_volatility` or above `max_volatility`. Since this is a filter that requires additional data, the results are cached for `refresh_period`.
This filter can be used to narrow down your pairs to a certain volatility or avoid very volatile pairs.
This filter can be used to narrow down your pairs to a certain volatility or avoid very volatile pairs.
In the below example:
If the volatility over the last 10 days is not in the range of 0.05-0.50, remove the pair from the whitelist. The filter is applied every 24h.
@@ -477,6 +479,8 @@ If the volatility over the last 10 days is not in the range of 0.05-0.50, remove
]
```
Adding `"sort_direction": "asc"` or `"sort_direction": "desc"` enables sorting mode for this pairlist.
### Full example of Pairlist Handlers
The below example blacklists `BNB/BTC`, uses `VolumePairList` with `20` assets, sorting pairs by `quoteVolume`, and applies [`PrecisionFilter`](#precisionfilter) and [`PriceFilter`](#pricefilter), filtering all assets where 1 price unit is > 1%. Then the [`SpreadFilter`](#spreadfilter) and [`VolatilityFilter`](#volatilityfilter) are applied, and pairs are finally shuffled with the random seed set to some predefined value.

View File

@@ -1,6 +1,6 @@
markdown==3.5.2
mkdocs==1.5.3
mkdocs-material==9.5.11
mkdocs-material==9.5.12
mdx_truly_sane_lists==1.3
pymdown-extensions==10.7
jinja2==3.1.3

View File

@@ -791,21 +791,21 @@ Returning a value more than the above (so remaining stake_amount would become ne
If you wish to place additional buy orders with DCA, make sure to leave enough funds in the wallet for that.
Using 'unlimited' stake amount with DCA orders requires you to also implement the `custom_stake_amount()` callback to avoid allocating all funds to the initial order.
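A minimal sketch of such a callback (shown standalone for illustration — in a strategy this would be a method, the signature is abbreviated with `**kwargs`, and the 50% split factor is an assumption):

```python
# Sketch of a custom_stake_amount() callback for use with "unlimited" stake:
# use only part of the proposed stake for the initial entry, keeping the rest
# available for later DCA (position-adjustment) orders.
# The 0.5 split factor is an illustrative assumption.
def custom_stake_amount(pair: str, current_rate: float, proposed_stake: float,
                        max_stake: float, **kwargs) -> float:
    # Reserve half of the proposed stake for additional entries.
    return min(proposed_stake * 0.5, max_stake)
```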
!!! Warning
!!! Warning "Stoploss calculation"
Stoploss is still calculated from the initial opening price, not averaged price.
Regular stoploss rules still apply (cannot move down).
While the `/stopentry` command stops the bot from entering new trades, the position adjustment feature will continue placing new orders on existing trades.
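As a numeric illustration of the stoploss rule above (all values are assumed, with equal stakes per entry):

```python
# Numeric illustration (values assumed): averaging down does not move the stop.
initial_open = 100.0                             # first entry price
dca_fill = 80.0                                  # second (DCA) entry price
average_price = (initial_open + dca_fill) / 2    # 90.0, assuming equal stakes
stoploss_pct = 0.15                              # -15% stoploss
# The stop stays anchored to the initial opening price, not the average:
stop_price = initial_open * (1 - stoploss_pct)   # 85.0, not 76.5
```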
!!! Danger "Performance with many position adjustments"
Position adjustments can be a good approach to increase a strategy's output - but it can also have drawbacks if using this feature extensively.
Each of the orders will be attached to the trade object for the duration of the trade - hence increasing memory usage.
Trades with long duration and 10s or even 100s of position adjustments are therefore not recommended, and should be closed at regular intervals to not affect performance.
!!! Warning "Backtesting"
During backtesting this callback is called for each candle in `timeframe` or `timeframe_detail`, so run-time performance will be affected.
This can also cause deviating results between live and backtesting, since backtesting can adjust the trade only once per candle, whereas live could adjust the trade multiple times per candle.
!!! Warning "Performance with many position adjustments"
Position adjustments can be a good approach to increase a strategy's output - but it can also have drawbacks if using this feature extensively.
Each of the orders will be attached to the trade object for the duration of the trade - hence increasing memory usage.
Trades with long duration and 10s or even 100s of position adjustments are therefore not recommended, and should be closed at regular intervals to not affect performance.
``` python
from freqtrade.persistence import Trade

View File

@@ -6,7 +6,7 @@ To update your freqtrade installation, please use one of the below methods, corr
Breaking changes / changed behavior will be documented in the changelog that is posted alongside every release.
For the develop branch, please follow PRs to avoid being surprised by changes.
## docker
## Docker
!!! Note "Legacy installations using the `master` image"
We're switching from `master` to `stable` for the release images - please adjust your docker-file and replace `freqtradeorg/freqtrade:master` with `freqtradeorg/freqtrade:stable`.

View File

@@ -219,207 +219,49 @@ optional arguments:
-a, --all Print all exchanges known to the ccxt library.
```
* Example: see exchanges available for the bot:
Example: see exchanges available for the bot:
```
$ freqtrade list-exchanges
Exchanges available for Freqtrade:
Exchange name Valid reason
--------------- ------- --------------------------------------------
aax True
ascendex True missing opt: fetchMyTrades
bequant True
bibox True
bigone True
binance True
binanceus True
bitbank True missing opt: fetchTickers
bitcoincom True
bitfinex True
bitforex True missing opt: fetchMyTrades, fetchTickers
bitget True
bithumb True missing opt: fetchMyTrades
bitkk True missing opt: fetchMyTrades
bitmart True
bitmax True missing opt: fetchMyTrades
bitpanda True
bitvavo True
bitz True missing opt: fetchMyTrades
btcalpha True missing opt: fetchTicker, fetchTickers
btcmarkets True missing opt: fetchTickers
buda True missing opt: fetchMyTrades, fetchTickers
bw True missing opt: fetchMyTrades, fetchL2OrderBook
bybit True
bytetrade True
cdax True
cex True missing opt: fetchMyTrades
coinbaseprime True missing opt: fetchTickers
coinbasepro True missing opt: fetchTickers
coinex True
crex24 True
deribit True
digifinex True
equos True missing opt: fetchTicker, fetchTickers
eterbase True
fcoin True missing opt: fetchMyTrades, fetchTickers
fcoinjp True missing opt: fetchMyTrades, fetchTickers
gateio True
gemini True
gopax True
hbtc True
hitbtc True
huobijp True
huobipro True
idex True
kraken True
kucoin True
lbank True missing opt: fetchMyTrades
mercado True missing opt: fetchTickers
ndax True missing opt: fetchTickers
novadax True
okcoin True
okex True
probit True
qtrade True
stex True
timex True
upbit True missing opt: fetchMyTrades
vcc True
zb True missing opt: fetchMyTrades
Exchange name Supported Markets Reason
------------------ ----------- ---------------------- ------------------------------------------------------------------------
binance Official spot, isolated futures
bitmart Official spot
bybit spot, isolated futures
gate Official spot, isolated futures
htx Official spot
huobi spot
kraken Official spot
okx Official spot, isolated futures
```
!!! info ""
Output reduced for clarity - supported and available exchanges may change over time.
!!! Note "missing opt exchanges"
Values with "missing opt:" might need special configuration (e.g. using orderbook if `fetchTickers` is missing) - but should in theory work (although we cannot guarantee they will).
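For exchanges where `fetchTickers` is listed as a missing optional method, pricing can be sourced from the orderbook instead. A sketch of the relevant configuration fragment (shown as a Python dict; key names follow the standard freqtrade pricing configuration, values are illustrative):

```python
# Sketch: price entries from the orderbook rather than tickers, e.g. for
# exchanges where "fetchTickers" is a missing optional method.
entry_pricing = {
    "price_side": "same",     # price on the same side of the spread
    "use_order_book": True,   # take prices from the orderbook ...
    "order_book_top": 1,      # ... using the best (top) entry
}
```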
* Example: see all exchanges supported by the ccxt library (including 'bad' ones, i.e. those that are known to not work with Freqtrade):
Example: see all exchanges supported by the ccxt library (including 'bad' ones, i.e. those that are known to not work with Freqtrade):
```
$ freqtrade list-exchanges -a
All exchanges supported by the ccxt library:
Exchange name Valid reason
------------------ ------- ---------------------------------------------------------------------------------------
aax True
aofex False missing: fetchOrder
ascendex True missing opt: fetchMyTrades
bequant True
bibox True
bigone True
binance True
binanceus True
bit2c False missing: fetchOrder, fetchOHLCV
bitbank True missing opt: fetchTickers
bitbay False missing: fetchOrder
bitcoincom True
bitfinex True
bitfinex2 False missing: fetchOrder
bitflyer False missing: fetchOrder, fetchOHLCV
bitforex True missing opt: fetchMyTrades, fetchTickers
bitget True
bithumb True missing opt: fetchMyTrades
bitkk True missing opt: fetchMyTrades
bitmart True
bitmax True missing opt: fetchMyTrades
bitmex False Various reasons.
bitpanda True
bitso False missing: fetchOHLCV
bitstamp True missing opt: fetchTickers
bitstamp1 False missing: fetchOrder, fetchOHLCV
bitvavo True
bitz True missing opt: fetchMyTrades
bl3p False missing: fetchOrder, fetchOHLCV
bleutrade False missing: fetchOrder
braziliex False missing: fetchOHLCV
btcalpha True missing opt: fetchTicker, fetchTickers
btcbox False missing: fetchOHLCV
btcmarkets True missing opt: fetchTickers
btctradeua False missing: fetchOrder, fetchOHLCV
btcturk False missing: fetchOrder
buda True missing opt: fetchMyTrades, fetchTickers
bw True missing opt: fetchMyTrades, fetchL2OrderBook
bybit True
bytetrade True
cdax True
cex True missing opt: fetchMyTrades
chilebit False missing: fetchOrder, fetchOHLCV
coinbase False missing: fetchOrder, cancelOrder, createOrder, fetchOHLCV
coinbaseprime True missing opt: fetchTickers
coinbasepro True missing opt: fetchTickers
coincheck False missing: fetchOrder, fetchOHLCV
coinegg False missing: fetchOHLCV
coinex True
coinfalcon False missing: fetchOHLCV
coinfloor False missing: fetchOrder, fetchOHLCV
coingi False missing: fetchOrder, fetchOHLCV
coinmarketcap False missing: fetchOrder, cancelOrder, createOrder, fetchBalance, fetchOHLCV
coinmate False missing: fetchOHLCV
coinone False missing: fetchOHLCV
coinspot False missing: fetchOrder, cancelOrder, fetchOHLCV
crex24 True
currencycom False missing: fetchOrder
delta False missing: fetchOrder
deribit True
digifinex True
equos True missing opt: fetchTicker, fetchTickers
eterbase True
exmo False missing: fetchOrder
exx False missing: fetchOHLCV
fcoin True missing opt: fetchMyTrades, fetchTickers
fcoinjp True missing opt: fetchMyTrades, fetchTickers
flowbtc False missing: fetchOrder, fetchOHLCV
foxbit False missing: fetchOrder, fetchOHLCV
gateio True
gemini True
gopax True
hbtc True
hitbtc True
hollaex False missing: fetchOrder
huobijp True
huobipro True
idex True
independentreserve False missing: fetchOHLCV
indodax False missing: fetchOHLCV
itbit False missing: fetchOHLCV
kraken True
kucoin True
kuna False missing: fetchOHLCV
lakebtc False missing: fetchOrder, fetchOHLCV
latoken False missing: fetchOrder, fetchOHLCV
lbank True missing opt: fetchMyTrades
liquid False missing: fetchOHLCV
luno False missing: fetchOHLCV
lykke False missing: fetchOHLCV
mercado True missing opt: fetchTickers
mixcoins False missing: fetchOrder, fetchOHLCV
ndax True missing opt: fetchTickers
novadax True
oceanex False missing: fetchOHLCV
okcoin True
okex True
paymium False missing: fetchOrder, fetchOHLCV
phemex False Does not provide history.
poloniex False missing: fetchOrder
probit True
qtrade True
rightbtc False missing: fetchOrder
ripio False missing: fetchOHLCV
southxchange False missing: fetchOrder, fetchOHLCV
stex True
surbitcoin False missing: fetchOrder, fetchOHLCV
therock False missing: fetchOHLCV
tidebit False missing: fetchOrder
tidex False missing: fetchOHLCV
timex True
upbit True missing opt: fetchMyTrades
vbtc False missing: fetchOrder, fetchOHLCV
vcc True
wavesexchange False missing: fetchOrder
whitebit False missing: fetchOrder, cancelOrder, createOrder, fetchBalance
xbtce False missing: fetchOrder, fetchOHLCV
xena False missing: fetchOrder
yobit False missing: fetchOHLCV
zaif False missing: fetchOrder, fetchOHLCV
zb True missing opt: fetchMyTrades
Exchange name Valid Supported Markets Reason
------------------ ------- ----------- ---------------------- ---------------------------------------------------------------------------------
binance True Official spot, isolated futures
bitflyer False spot missing: fetchOrder. missing opt: fetchTickers.
bitmart True Official spot
bybit True spot, isolated futures
gate True Official spot, isolated futures
htx True Official spot
kraken True Official spot
okx True Official spot, isolated futures
```
!!! info ""
Reduced output - supported and available exchanges may change over time.
## List Timeframes
Use the `list-timeframes` subcommand to see the list of timeframes available for the exchange.

View File

@@ -1,5 +1,5 @@
""" Freqtrade bot """
__version__ = '2024.2-dev'
__version__ = '2024.3-dev'
if 'dev' in __version__:
from pathlib import Path

View File

@@ -69,7 +69,8 @@ ARGS_CONVERT_DATA_TRADES = ["pairs", "format_from_trades", "format_to", "erase",
ARGS_CONVERT_DATA = ["pairs", "format_from", "format_to", "erase", "exchange"]
ARGS_CONVERT_DATA_OHLCV = ARGS_CONVERT_DATA + ["timeframes", "trading_mode", "candle_types"]
ARGS_CONVERT_TRADES = ["pairs", "timeframes", "exchange", "dataformat_ohlcv", "dataformat_trades"]
ARGS_CONVERT_TRADES = ["pairs", "timeframes", "exchange", "dataformat_ohlcv", "dataformat_trades",
"trading_mode"]
ARGS_LIST_DATA = ["exchange", "dataformat_ohlcv", "pairs", "trading_mode", "show_timerange"]

View File

@@ -8,9 +8,10 @@ from freqtrade.constants import DATETIME_PRINT_FORMAT, DL_DATA_TIMEFRAMES, Confi
from freqtrade.data.converter import (convert_ohlcv_format, convert_trades_format,
convert_trades_to_ohlcv)
from freqtrade.data.history import download_data_main
from freqtrade.enums import RunMode, TradingMode
from freqtrade.enums import CandleType, RunMode, TradingMode
from freqtrade.exceptions import OperationalException
from freqtrade.exchange import timeframe_to_minutes
from freqtrade.plugins.pairlist.pairlist_helpers import dynamic_expand_pairlist
from freqtrade.resolvers import ExchangeResolver
from freqtrade.util.migrations import migrate_data
@@ -62,13 +63,21 @@ def start_convert_trades(args: Dict[str, Any]) -> None:
for timeframe in config['timeframes']:
exchange.validate_timeframes(timeframe)
available_pairs = [
p for p in exchange.get_markets(
tradable_only=True, active_only=not config.get('include_inactive')
).keys()
]
expanded_pairs = dynamic_expand_pairlist(config, available_pairs)
# Convert downloaded trade data to different timeframes
convert_trades_to_ohlcv(
pairs=config.get('pairs', []), timeframes=config['timeframes'],
pairs=expanded_pairs, timeframes=config['timeframes'],
datadir=config['datadir'], timerange=timerange, erase=bool(config.get('erase')),
data_format_ohlcv=config['dataformat_ohlcv'],
data_format_trades=config['dataformat_trades'],
candle_type=config.get('candle_type_def', CandleType.SPOT)
)

View File

@@ -11,7 +11,7 @@ from pandas import DataFrame, to_datetime
from freqtrade.configuration import TimeRange
from freqtrade.constants import (DEFAULT_DATAFRAME_COLUMNS, DEFAULT_TRADES_COLUMNS, TRADES_DTYPES,
Config, TradeList)
from freqtrade.enums import CandleType
from freqtrade.enums import CandleType, TradingMode
from freqtrade.exceptions import OperationalException
@@ -88,10 +88,10 @@ def convert_trades_to_ohlcv(
timeframes: List[str],
datadir: Path,
timerange: TimeRange,
erase: bool = False,
data_format_ohlcv: str = 'feather',
data_format_trades: str = 'feather',
candle_type: CandleType = CandleType.SPOT
erase: bool,
data_format_ohlcv: str,
data_format_trades: str,
candle_type: CandleType,
) -> None:
"""
Convert stored trades data to ohlcv data
@@ -99,14 +99,12 @@ def convert_trades_to_ohlcv(
from freqtrade.data.history.idatahandler import get_datahandler
data_handler_trades = get_datahandler(datadir, data_format=data_format_trades)
data_handler_ohlcv = get_datahandler(datadir, data_format=data_format_ohlcv)
if not pairs:
pairs = data_handler_trades.trades_get_pairs(datadir)
logger.info(f"About to convert pairs: '{', '.join(pairs)}', "
f"intervals: '{', '.join(timeframes)}' to {datadir}")
trading_mode = TradingMode.FUTURES if candle_type != CandleType.SPOT else TradingMode.SPOT
for pair in pairs:
trades = data_handler_trades.trades_load(pair)
trades = data_handler_trades.trades_load(pair, trading_mode)
for timeframe in timeframes:
if erase:
if data_handler_ohlcv.ohlcv_purge(pair, timeframe, candle_type=candle_type):
@@ -116,7 +114,7 @@ def convert_trades_to_ohlcv(
# Store ohlcv
data_handler_ohlcv.ohlcv_store(pair, timeframe, data=ohlcv, candle_type=candle_type)
except ValueError:
logger.exception(f'Could not convert {pair} to OHLCV.')
logger.warning(f'Could not convert {pair} to OHLCV.')
def convert_trades_format(config: Config, convert_from: str, convert_to: str, erase: bool):
@@ -144,11 +142,12 @@ def convert_trades_format(config: Config, convert_from: str, convert_to: str, er
if 'pairs' not in config:
config['pairs'] = src.trades_get_pairs(config['datadir'])
logger.info(f"Converting trades for {config['pairs']}")
trading_mode: TradingMode = config.get('trading_mode', TradingMode.SPOT)
for pair in config['pairs']:
data = src.trades_load(pair=pair)
data = src.trades_load(pair, trading_mode)
logger.info(f"Converting {len(data)} trades for {pair}")
trg.trades_store(pair, data)
trg.trades_store(pair, data, trading_mode)
if erase and convert_from != convert_to:
logger.info(f"Deleting source Trade data for {pair}.")
src.trades_purge(pair=pair)
src.trades_purge(pair, trading_mode)

View File

@@ -7,6 +7,7 @@ from freqtrade.constants import DATETIME_PRINT_FORMAT, DEFAULT_TRADES_COLUMNS, C
from freqtrade.data.converter.trade_converter import (trades_convert_types,
trades_df_remove_duplicates)
from freqtrade.data.history.idatahandler import get_datahandler
from freqtrade.enums import TradingMode
from freqtrade.exceptions import OperationalException
from freqtrade.plugins.pairlist.pairlist_helpers import expand_pairlist
from freqtrade.resolvers import ExchangeResolver
@@ -79,4 +80,4 @@ def import_kraken_trades_from_csv(config: Config, convert_to: str):
f"{trades_df['date'].min():{DATETIME_PRINT_FORMAT}} to "
f"{trades_df['date'].max():{DATETIME_PRINT_FORMAT}}")
data_handler.trades_store(pair, trades_df)
data_handler.trades_store(pair, trades_df, TradingMode.SPOT)

View File

@@ -15,7 +15,7 @@ from freqtrade.configuration import TimeRange
from freqtrade.constants import (FULL_DATAFRAME_THRESHOLD, Config, ListPairsWithTimeframes,
PairWithTimeframe)
from freqtrade.data.history import get_datahandler, load_pair_history
from freqtrade.enums import CandleType, RPCMessageType, RunMode
from freqtrade.enums import CandleType, RPCMessageType, RunMode, TradingMode
from freqtrade.exceptions import ExchangeError, OperationalException
from freqtrade.exchange import Exchange, timeframe_to_prev_date, timeframe_to_seconds
from freqtrade.exchange.types import OrderBook
@@ -528,7 +528,7 @@ class DataProvider:
candle_type) if candle_type != '' else self._config['candle_type_def']
data_handler = get_datahandler(
self._config['datadir'], data_format=self._config['dataformat_trades'])
trades_df = data_handler.trades_load(pair)
trades_df = data_handler.trades_load(pair, TradingMode.FUTURES)
return trades_df
else:

View File

@@ -5,7 +5,7 @@ from pandas import DataFrame, read_feather, to_datetime
from freqtrade.configuration import TimeRange
from freqtrade.constants import DEFAULT_DATAFRAME_COLUMNS, DEFAULT_TRADES_COLUMNS
from freqtrade.enums import CandleType
from freqtrade.enums import CandleType, TradingMode
from .idatahandler import IDataHandler
@@ -82,14 +82,15 @@ class FeatherDataHandler(IDataHandler):
"""
raise NotImplementedError()
def _trades_store(self, pair: str, data: DataFrame) -> None:
def _trades_store(self, pair: str, data: DataFrame, trading_mode: TradingMode) -> None:
"""
Store trades data (list of Dicts) to file
:param pair: Pair - used for filename
:param data: Dataframe containing trades
column sequence as in DEFAULT_TRADES_COLUMNS
:param trading_mode: Trading mode to use (used to determine the filename)
"""
filename = self._pair_trades_filename(self._datadir, pair)
filename = self._pair_trades_filename(self._datadir, pair, trading_mode)
self.create_dir_if_needed(filename)
data.reset_index(drop=True).to_feather(filename, compression_level=9, compression='lz4')
@@ -102,15 +103,18 @@ class FeatherDataHandler(IDataHandler):
"""
raise NotImplementedError()
def _trades_load(self, pair: str, timerange: Optional[TimeRange] = None) -> DataFrame:
def _trades_load(
self, pair: str, trading_mode: TradingMode, timerange: Optional[TimeRange] = None
) -> DataFrame:
"""
Load trades data for a pair from file
# TODO: respect timerange ...
:param pair: Load trades for this pair
:param trading_mode: Trading mode to use (used to determine the filename)
:param timerange: Timerange to load trades for - currently not implemented
:return: Dataframe containing trades
"""
filename = self._pair_trades_filename(self._datadir, pair)
filename = self._pair_trades_filename(self._datadir, pair, trading_mode)
if not filename.exists():
return DataFrame(columns=DEFAULT_TRADES_COLUMNS)

View File

@@ -6,7 +6,7 @@ import pandas as pd
from freqtrade.configuration import TimeRange
from freqtrade.constants import DEFAULT_DATAFRAME_COLUMNS, DEFAULT_TRADES_COLUMNS
from freqtrade.enums import CandleType
from freqtrade.enums import CandleType, TradingMode
from .idatahandler import IDataHandler
@@ -35,7 +35,7 @@ class HDF5DataHandler(IDataHandler):
self.create_dir_if_needed(filename)
_data.loc[:, self._columns].to_hdf(
filename, key, mode='a', complevel=9, complib='blosc',
filename, key=key, mode='a', complevel=9, complib='blosc',
format='table', data_columns=['date']
)
@@ -100,17 +100,18 @@ class HDF5DataHandler(IDataHandler):
"""
raise NotImplementedError()
def _trades_store(self, pair: str, data: pd.DataFrame) -> None:
def _trades_store(self, pair: str, data: pd.DataFrame, trading_mode: TradingMode) -> None:
"""
Store trades data (list of Dicts) to file
:param pair: Pair - used for filename
:param data: Dataframe containing trades
column sequence as in DEFAULT_TRADES_COLUMNS
:param trading_mode: Trading mode to use (used to determine the filename)
"""
key = self._pair_trades_key(pair)
data.to_hdf(
self._pair_trades_filename(self._datadir, pair), key,
self._pair_trades_filename(self._datadir, pair, trading_mode), key=key,
mode='a', complevel=9, complib='blosc',
format='table', data_columns=['timestamp']
)
@@ -124,15 +125,18 @@ class HDF5DataHandler(IDataHandler):
"""
raise NotImplementedError()
def _trades_load(self, pair: str, timerange: Optional[TimeRange] = None) -> pd.DataFrame:
def _trades_load(
self, pair: str, trading_mode: TradingMode, timerange: Optional[TimeRange] = None
) -> pd.DataFrame:
"""
Load a pair from h5 file.
:param pair: Load trades for this pair
:param trading_mode: Trading mode to use (used to determine the filename)
:param timerange: Timerange to load trades for - currently not implemented
:return: Dataframe containing trades
"""
key = self._pair_trades_key(pair)
filename = self._pair_trades_filename(self._datadir, pair)
filename = self._pair_trades_filename(self._datadir, pair, trading_mode)
if not filename.exists():
return pd.DataFrame(columns=DEFAULT_TRADES_COLUMNS)

View File

@@ -13,7 +13,7 @@ from freqtrade.data.converter import (clean_ohlcv_dataframe, convert_trades_to_o
ohlcv_to_dataframe, trades_df_remove_duplicates,
trades_list_to_df)
from freqtrade.data.history.idatahandler import IDataHandler, get_datahandler
from freqtrade.enums import CandleType
from freqtrade.enums import CandleType, TradingMode
from freqtrade.exceptions import OperationalException
from freqtrade.exchange import Exchange
from freqtrade.plugins.pairlist.pairlist_helpers import dynamic_expand_pairlist
@@ -333,7 +333,8 @@ def _download_trades_history(exchange: Exchange,
pair: str, *,
new_pairs_days: int = 30,
timerange: Optional[TimeRange] = None,
data_handler: IDataHandler
data_handler: IDataHandler,
trading_mode: TradingMode,
) -> bool:
"""
Download trade history from the exchange.
@@ -349,7 +350,7 @@ def _download_trades_history(exchange: Exchange,
if timerange.stoptype == 'date':
until = timerange.stopts * 1000
trades = data_handler.trades_load(pair)
trades = data_handler.trades_load(pair, trading_mode)
# TradesList columns are defined in constants.DEFAULT_TRADES_COLUMNS
# DEFAULT_TRADES_COLUMNS: 0 -> timestamp
@@ -388,7 +389,7 @@ def _download_trades_history(exchange: Exchange,
trades = concat([trades, new_trades_df], axis=0)
# Remove duplicates to make sure we're not storing data we don't need
trades = trades_df_remove_duplicates(trades)
data_handler.trades_store(pair, data=trades)
data_handler.trades_store(pair, trades, trading_mode)
logger.debug("New Start: %s", 'None' if trades.empty else
f"{trades.iloc[0]['date']:{DATETIME_PRINT_FORMAT}}")
@@ -405,8 +406,10 @@ def _download_trades_history(exchange: Exchange,
def refresh_backtest_trades_data(exchange: Exchange, pairs: List[str], datadir: Path,
timerange: TimeRange, new_pairs_days: int = 30,
erase: bool = False, data_format: str = 'feather') -> List[str]:
timerange: TimeRange, trading_mode: TradingMode,
new_pairs_days: int = 30,
erase: bool = False, data_format: str = 'feather',
) -> List[str]:
"""
Refresh stored trades data for backtesting and hyperopt operations.
Used by freqtrade download-data subcommand.
@@ -421,7 +424,7 @@ def refresh_backtest_trades_data(exchange: Exchange, pairs: List[str], datadir:
continue
if erase:
if data_handler.trades_purge(pair):
if data_handler.trades_purge(pair, trading_mode):
logger.info(f'Deleting existing data for pair {pair}.')
logger.info(f'Downloading trades for pair {pair}.')
@@ -429,7 +432,8 @@ def refresh_backtest_trades_data(exchange: Exchange, pairs: List[str], datadir:
pair=pair,
new_pairs_days=new_pairs_days,
timerange=timerange,
data_handler=data_handler)
data_handler=data_handler,
trading_mode=trading_mode)
return pairs_not_available
@@ -519,7 +523,9 @@ def download_data_main(config: Config) -> None:
pairs_not_available = refresh_backtest_trades_data(
exchange, pairs=expanded_pairs, datadir=config['datadir'],
timerange=timerange, new_pairs_days=config['new_pairs_days'],
erase=bool(config.get('erase')), data_format=config['dataformat_trades'])
erase=bool(config.get('erase')), data_format=config['dataformat_trades'],
trading_mode=config.get('trading_mode', TradingMode.SPOT),
)
# Convert downloaded trade data to different timeframes
convert_trades_to_ohlcv(
@@ -527,6 +533,7 @@ def download_data_main(config: Config) -> None:
datadir=config['datadir'], timerange=timerange, erase=bool(config.get('erase')),
data_format_ohlcv=config['dataformat_ohlcv'],
data_format_trades=config['dataformat_trades'],
candle_type=config.get('candle_type_def', CandleType.SPOT),
)
else:
if not exchange.get_option('ohlcv_has_history', True):

View File

@@ -172,12 +172,13 @@ class IDataHandler(ABC):
return [cls.rebuild_pair_from_filename(match[0]) for match in _tmp if match]
@abstractmethod
def _trades_store(self, pair: str, data: DataFrame) -> None:
def _trades_store(self, pair: str, data: DataFrame, trading_mode: TradingMode) -> None:
"""
Store trades data (list of Dicts) to file
:param pair: Pair - used for filename
:param data: Dataframe containing trades
column sequence as in DEFAULT_TRADES_COLUMNS
:param trading_mode: Trading mode to use (used to determine the filename)
"""
@abstractmethod
@@ -190,45 +191,55 @@ class IDataHandler(ABC):
"""
@abstractmethod
def _trades_load(self, pair: str, timerange: Optional[TimeRange] = None) -> DataFrame:
def _trades_load(
self, pair: str, trading_mode: TradingMode, timerange: Optional[TimeRange] = None
) -> DataFrame:
"""
Load trades data for a pair from file
:param pair: Load trades for this pair
:param trading_mode: Trading mode to use (used to determine the filename)
:param timerange: Timerange to load trades for - currently not implemented
:return: Dataframe containing trades
"""
def trades_store(self, pair: str, data: DataFrame) -> None:
def trades_store(self, pair: str, data: DataFrame, trading_mode: TradingMode) -> None:
"""
Store trades data (list of Dicts) to file
:param pair: Pair - used for filename
:param data: Dataframe containing trades
column sequence as in DEFAULT_TRADES_COLUMNS
:param trading_mode: Trading mode to use (used to determine the filename)
"""
# Filter on expected columns (will remove the actual date column).
self._trades_store(pair, data[DEFAULT_TRADES_COLUMNS])
self._trades_store(pair, data[DEFAULT_TRADES_COLUMNS], trading_mode)
def trades_purge(self, pair: str) -> bool:
def trades_purge(self, pair: str, trading_mode: TradingMode) -> bool:
"""
Remove data for this pair
:param pair: Delete data for this pair.
:param trading_mode: Trading mode to use (used to determine the filename)
:return: True when deleted, false if file did not exist.
"""
filename = self._pair_trades_filename(self._datadir, pair)
filename = self._pair_trades_filename(self._datadir, pair, trading_mode)
if filename.exists():
filename.unlink()
return True
return False
def trades_load(self, pair: str, timerange: Optional[TimeRange] = None) -> DataFrame:
def trades_load(
self, pair: str, trading_mode: TradingMode, timerange: Optional[TimeRange] = None
) -> DataFrame:
"""
Load trades data for a pair from file.
Removes duplicates in the process.
:param pair: Load trades for this pair
:param trading_mode: Trading mode to use (used to determine the filename)
:param timerange: Timerange to load trades for - currently not implemented
:return: List of trades
"""
trades = trades_df_remove_duplicates(self._trades_load(pair, timerange=timerange))
trades = trades_df_remove_duplicates(
self._trades_load(pair, trading_mode, timerange=timerange)
)
trades = trades_convert_types(trades)
return trades
@@ -264,8 +275,12 @@ class IDataHandler(ABC):
return filename
@classmethod
def _pair_trades_filename(cls, datadir: Path, pair: str) -> Path:
def _pair_trades_filename(cls, datadir: Path, pair: str, trading_mode: TradingMode) -> Path:
pair_s = misc.pair_to_filename(pair)
if trading_mode == TradingMode.FUTURES:
# Futures pair ...
datadir = datadir.joinpath('futures')
filename = datadir.joinpath(f'{pair_s}-trades.{cls._get_file_extension()}')
return filename

View File

@@ -8,7 +8,7 @@ from freqtrade import misc
from freqtrade.configuration import TimeRange
from freqtrade.constants import DEFAULT_DATAFRAME_COLUMNS, DEFAULT_TRADES_COLUMNS
from freqtrade.data.converter import trades_dict_to_list, trades_list_to_df
from freqtrade.enums import CandleType
from freqtrade.enums import CandleType, TradingMode
from .idatahandler import IDataHandler
@@ -37,7 +37,7 @@ class JsonDataHandler(IDataHandler):
self.create_dir_if_needed(filename)
_data = data.copy()
# Convert date to int
_data['date'] = _data['date'].view(np.int64) // 1000 // 1000
_data['date'] = _data['date'].astype(np.int64) // 1000 // 1000
# Reset index, select only appropriate columns and save as json
_data.reset_index(drop=True).loc[:, self._columns].to_json(
@@ -94,14 +94,15 @@ class JsonDataHandler(IDataHandler):
"""
raise NotImplementedError()
def _trades_store(self, pair: str, data: DataFrame) -> None:
def _trades_store(self, pair: str, data: DataFrame, trading_mode: TradingMode) -> None:
"""
Store trades data (list of Dicts) to file
:param pair: Pair - used for filename
:param data: Dataframe containing trades
column sequence as in DEFAULT_TRADES_COLUMNS
:param trading_mode: Trading mode to use (used to determine the filename)
"""
filename = self._pair_trades_filename(self._datadir, pair)
filename = self._pair_trades_filename(self._datadir, pair, trading_mode)
trades = data.values.tolist()
misc.file_dump_json(filename, trades, is_zip=self._use_zip)
@@ -114,15 +115,18 @@ class JsonDataHandler(IDataHandler):
"""
raise NotImplementedError()
def _trades_load(self, pair: str, timerange: Optional[TimeRange] = None) -> DataFrame:
def _trades_load(
self, pair: str, trading_mode: TradingMode, timerange: Optional[TimeRange] = None
) -> DataFrame:
"""
Load a pair from file, either .json.gz or .json
# TODO: respect timerange ...
:param pair: Load trades for this pair
:param trading_mode: Trading mode to use (used to determine the filename)
:param timerange: Timerange to load trades for - currently not implemented
:return: Dataframe containing trades
"""
filename = self._pair_trades_filename(self._datadir, pair)
filename = self._pair_trades_filename(self._datadir, pair, trading_mode)
tradesdata = misc.file_load_json(filename)
if not tradesdata:

View File

@@ -4,8 +4,8 @@ from typing import Optional
from pandas import DataFrame, read_parquet, to_datetime
from freqtrade.configuration import TimeRange
from freqtrade.constants import DEFAULT_DATAFRAME_COLUMNS, DEFAULT_TRADES_COLUMNS, TradeList
from freqtrade.enums import CandleType
from freqtrade.constants import DEFAULT_DATAFRAME_COLUMNS, DEFAULT_TRADES_COLUMNS
from freqtrade.enums import CandleType, TradingMode
from .idatahandler import IDataHandler
@@ -81,14 +81,15 @@ class ParquetDataHandler(IDataHandler):
"""
raise NotImplementedError()
def _trades_store(self, pair: str, data: DataFrame) -> None:
def _trades_store(self, pair: str, data: DataFrame, trading_mode: TradingMode) -> None:
"""
Store trades data (list of Dicts) to file
:param pair: Pair - used for filename
:param data: Dataframe containing trades
column sequence as in DEFAULT_TRADES_COLUMNS
:param trading_mode: Trading mode to use (used to determine the filename)
"""
filename = self._pair_trades_filename(self._datadir, pair)
filename = self._pair_trades_filename(self._datadir, pair, trading_mode)
self.create_dir_if_needed(filename)
data.reset_index(drop=True).to_parquet(filename)
@@ -101,15 +102,18 @@ class ParquetDataHandler(IDataHandler):
"""
raise NotImplementedError()
def _trades_load(self, pair: str, timerange: Optional[TimeRange] = None) -> TradeList:
def _trades_load(
self, pair: str, trading_mode: TradingMode, timerange: Optional[TimeRange] = None
) -> DataFrame:
"""
Load a pair from file, either .json.gz or .json
# TODO: respect timerange ...
:param pair: Load trades for this pair
:param trading_mode: Trading mode to use (used to determine the filename)
:param timerange: Timerange to load trades for - currently not implemented
:return: List of trades
"""
filename = self._pair_trades_filename(self._datadir, pair)
filename = self._pair_trades_filename(self._datadir, pair, trading_mode)
if not filename.exists():
return DataFrame(columns=DEFAULT_TRADES_COLUMNS)

View File

@@ -8,7 +8,7 @@ import logging
import signal
from copy import deepcopy
from datetime import datetime, timedelta, timezone
from math import floor
from math import floor, isnan
from threading import Lock
from typing import Any, Coroutine, Dict, List, Literal, Optional, Tuple, Union
@@ -3053,7 +3053,7 @@ class Exchange:
else:
# Fill up missing funding_rate candles with fallback value
combined = mark_rates.merge(
funding_rates, on='date', how="outer", suffixes=["_mark", "_fund"]
funding_rates, on='date', how="left", suffixes=["_mark", "_fund"]
)
combined['open_fund'] = combined['open_fund'].fillna(futures_funding_rate)
return combined
@@ -3082,7 +3082,8 @@ class Exchange:
if not df.empty:
df1 = df[(df['date'] >= open_date) & (df['date'] <= close_date)]
fees = sum(df1['open_fund'] * df1['open_mark'] * amount)
if isnan(fees):
fees = 0.0
# Negate fees for longs as funding_fees expects it this way based on live endpoints.
return fees if is_short else -fees
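Two fixes appear above: funding and mark candles are now combined with a left join, so only dates backed by a mark candle survive, and a NaN fee sum falls back to `0.0`. A hypothetical miniature of both, with integer dates standing in for timestamps:

```python
from math import isnan

import pandas as pd

# A left join keeps exactly one row per mark candle; the previous outer
# join could add rows for funding timestamps with no matching mark candle.
mark_rates = pd.DataFrame({"date": [1, 2, 3], "open": [10.0, 11.0, 12.0]})
funding_rates = pd.DataFrame({"date": [2, 4], "open": [0.01, 0.02]})
combined = mark_rates.merge(funding_rates, on="date", how="left",
                            suffixes=["_mark", "_fund"])
combined["open_fund"] = combined["open_fund"].fillna(0.0001)
print(combined["open_fund"].tolist())  # row for date=4 is dropped

# The isnan guard protects the fee sum, e.g. when no rows fall
# into the trade's open/close window.
fees = sum(combined["open_fund"] * combined["open_mark"] * 30.0)
if isnan(fees):
    fees = 0.0
```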

View File

@@ -36,8 +36,15 @@ class XGBoostRegressor(BaseRegressionModel):
eval_set = None
eval_weights = None
else:
eval_set = [(data_dictionary["test_features"], data_dictionary["test_labels"])]
eval_weights = [data_dictionary['test_weights']]
eval_set = [
(data_dictionary["test_features"],
data_dictionary["test_labels"]),
(X, y)
]
eval_weights = [
data_dictionary['test_weights'],
data_dictionary['train_weights']
]
sample_weight = data_dictionary["train_weights"]

View File

@@ -43,13 +43,11 @@ class TensorBoardCallback(BaseTensorBoardCallback):
if not evals_log:
return False
for data, metric in evals_log.items():
for metric_name, log in metric.items():
evals = ["validation", "train"]
for metric, eval in zip(evals_log.items(), evals):
for metric_name, log in metric[1].items():
score = log[-1][0] if isinstance(log[-1], tuple) else log[-1]
if data == "train":
self.writer.add_scalar("train_loss", score, epoch)
else:
self.writer.add_scalar("valid_loss", score, epoch)
self.writer.add_scalar(f"{eval}-{metric_name}", score, epoch)
return False
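With the regressor change above passing the training set as a second eval set, xgboost's `evals_log` carries two entries in fit order, and the callback pairs them with readable names by position. A plain-Python sketch of that pairing (the `evals_log` contents here are made up):

```python
# xgboost logs eval sets in the order passed to fit(), so the first entry
# is the test/validation set and the second is the training set.
evals_log = {
    "validation_0": {"rmse": [0.5, 0.4]},
    "validation_1": {"rmse": [0.45, 0.3]},
}
evals = ["validation", "train"]
tags = []
for (data, metrics), eval_name in zip(evals_log.items(), evals):
    for metric_name, log in metrics.items():
        # log entries may be (score, std) tuples when cv-style logging is on
        score = log[-1][0] if isinstance(log[-1], tuple) else log[-1]
        tags.append((f"{eval_name}-{metric_name}", score))
print(tags)  # [('validation-rmse', 0.4), ('train-rmse', 0.3)]
```

This yields one TensorBoard scalar per eval set and metric, rather than the previous fixed `train_loss`/`valid_loss` pair.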

View File

@@ -107,9 +107,9 @@ class LookaheadAnalysisSubFunctions:
csv_df = add_or_update_row(csv_df, new_row_data)
# Fill NaN values with a default value (e.g., 0)
csv_df['total_signals'] = csv_df['total_signals'].fillna(0)
csv_df['biased_entry_signals'] = csv_df['biased_entry_signals'].fillna(0)
csv_df['biased_exit_signals'] = csv_df['biased_exit_signals'].fillna(0)
csv_df['total_signals'] = csv_df['total_signals'].astype(int).fillna(0)
csv_df['biased_entry_signals'] = csv_df['biased_entry_signals'].astype(int).fillna(0)
csv_df['biased_exit_signals'] = csv_df['biased_exit_signals'].astype(int).fillna(0)
# Convert columns to integers
csv_df['total_signals'] = csv_df['total_signals'].astype(int)

View File

@@ -215,7 +215,7 @@ def _get_resample_from_period(period: str) -> str:
# Weekly defaulting to Monday.
return '1W-MON'
if period == 'month':
return '1M'
return '1ME'
raise ValueError(f"Period {period} is not supported.")
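The `'1M'` → `'1ME'` change tracks pandas 2.2, which renamed the month-end frequency alias. A small sketch — the version shim is only there so the snippet also runs on older pandas:

```python
import pandas as pd

# pandas 2.2 renamed the month-end resample alias from 'M' to 'ME';
# the old alias now raises a deprecation warning on 2.2+.
major, minor = (int(x) for x in pd.__version__.split(".")[:2])
freq = "1ME" if (major, minor) >= (2, 2) else "1M"

s = pd.Series([1, 2, 3],
              index=pd.to_datetime(["2024-01-15", "2024-02-14", "2024-03-15"]))
monthly = s.resample(freq).sum()
print(monthly.index[-1].date())  # bins land on month-end: 2024-03-31
```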

View File

@@ -3,7 +3,6 @@ Volatility pairlist filter
"""
import logging
import sys
from copy import deepcopy
from datetime import timedelta
from typing import Any, Dict, List, Optional
@@ -37,6 +36,7 @@ class VolatilityFilter(IPairList):
self._max_volatility = pairlistconfig.get('max_volatility', sys.maxsize)
self._refresh_period = pairlistconfig.get('refresh_period', 1440)
self._def_candletype = self._config['candle_type_def']
self._sort_direction: Optional[str] = pairlistconfig.get('sort_direction', None)
self._pair_cache: TTLCache = TTLCache(maxsize=1000, ttl=self._refresh_period)
@@ -46,6 +46,9 @@ class VolatilityFilter(IPairList):
if self._days > candle_limit:
raise OperationalException("VolatilityFilter requires lookback_days to not "
f"exceed exchange max request size ({candle_limit})")
if self._sort_direction not in [None, 'asc', 'desc']:
raise OperationalException("VolatilityFilter requires sort_direction to be "
"either None (undefined), 'asc' or 'desc'")
@property
def needstickers(self) -> bool:
@@ -89,6 +92,13 @@ class VolatilityFilter(IPairList):
"description": "Maximum Volatility",
"help": "Maximum volatility a pair must have to be considered.",
},
"sort_direction": {
"type": "option",
"default": None,
"options": ["", "asc", "desc"],
"description": "Sort pairlist",
"help": "Sort Pairlist ascending or descending by volatility.",
},
**IPairList.refresh_period_parameter()
}
@@ -105,43 +115,61 @@ class VolatilityFilter(IPairList):
since_ms = dt_ts(dt_floor_day(dt_now()) - timedelta(days=self._days))
candles = self._exchange.refresh_ohlcv_with_cache(needed_pairs, since_ms=since_ms)
if self._enabled:
for p in deepcopy(pairlist):
daily_candles = candles[(p, '1d', self._def_candletype)] if (
p, '1d', self._def_candletype) in candles else None
if not self._validate_pair_loc(p, daily_candles):
pairlist.remove(p)
return pairlist
resulting_pairlist: List[str] = []
volatilitys: Dict[str, float] = {}
for p in pairlist:
daily_candles = candles.get((p, '1d', self._def_candletype), None)
def _validate_pair_loc(self, pair: str, daily_candles: Optional[DataFrame]) -> bool:
"""
Validate trading range
:param pair: Pair that's currently validated
:param daily_candles: Downloaded daily candles
:return: True if the pair can stay, false if it should be removed
"""
volatility_avg = self._calculate_volatility(p, daily_candles)
if volatility_avg is not None:
if self._validate_pair_loc(p, volatility_avg):
resulting_pairlist.append(p)
volatilitys[p] = (
volatility_avg if volatility_avg and not np.isnan(volatility_avg) else 0
)
else:
self.log_once(f"Removed {p} from whitelist, no candles found.", logger.info)
if self._sort_direction:
resulting_pairlist = sorted(resulting_pairlist,
key=lambda p: volatilitys[p],
reverse=self._sort_direction == 'desc')
return resulting_pairlist
def _calculate_volatility(self, pair: str, daily_candles: DataFrame) -> Optional[float]:
# Check symbol in cache
if (cached_res := self._pair_cache.get(pair, None)) is not None:
return cached_res
if (volatility_avg := self._pair_cache.get(pair, None)) is not None:
return volatility_avg
result = False
if daily_candles is not None and not daily_candles.empty:
returns = (np.log(daily_candles["close"].shift(1) / daily_candles["close"]))
returns.fillna(0, inplace=True)
volatility_series = returns.rolling(window=self._days).std() * np.sqrt(self._days)
volatility_avg = volatility_series.mean()
self._pair_cache[pair] = volatility_avg
if self._min_volatility <= volatility_avg <= self._max_volatility:
result = True
else:
self.log_once(f"Removed {pair} from whitelist, because volatility "
f"over {self._days} {plural(self._days, 'day')} "
f"is: {volatility_avg:.3f} "
f"which is not in the configured range of "
f"{self._min_volatility}-{self._max_volatility}.",
logger.info)
result = False
self._pair_cache[pair] = result
return volatility_avg
else:
return None
def _validate_pair_loc(self, pair: str, volatility_avg: float) -> bool:
"""
Validate average volatility
:param pair: Pair that's currently validated
:param volatility_avg: Average volatility
:return: True if the pair can stay, false if it should be removed
"""
if self._min_volatility <= volatility_avg <= self._max_volatility:
result = True
else:
self.log_once(f"Removed {pair} from whitelist, because volatility "
f"over {self._days} {plural(self._days, 'day')} "
f"is: {volatility_avg:.3f} "
f"which is not in the configured range of "
f"{self._min_volatility}-{self._max_volatility}.",
logger.info)
result = False
return result
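For reference, the volatility measure the filter now caches per pair, reproduced on synthetic closes (values are illustrative only):

```python
import numpy as np
import pandas as pd

# Log returns of daily closes, rolling standard deviation over the
# lookback window, scaled by sqrt(window) - mirroring _calculate_volatility.
days = 10
close = pd.Series([100.0, 101, 99, 102, 100, 103, 101, 104, 102, 105,
                   103, 106, 104, 107, 105])
returns = np.log(close.shift(1) / close).fillna(0)
volatility_series = returns.rolling(window=days).std() * np.sqrt(days)
volatility_avg = volatility_series.mean()  # NaN rows before the window fills are skipped
print(f"{volatility_avg:.4f}")
```

A pair then passes the filter when `min_volatility <= volatility_avg <= max_volatility`, and `sort_direction` orders the survivors by this value.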

View File

@@ -2,7 +2,6 @@
Rate of change pairlist filter
"""
import logging
from copy import deepcopy
from datetime import timedelta
from typing import Any, Dict, List, Optional
@@ -32,6 +31,7 @@ class RangeStabilityFilter(IPairList):
self._max_rate_of_change = pairlistconfig.get('max_rate_of_change')
self._refresh_period = pairlistconfig.get('refresh_period', 86400)
self._def_candletype = self._config['candle_type_def']
self._sort_direction: Optional[str] = pairlistconfig.get('sort_direction', None)
self._pair_cache: TTLCache = TTLCache(maxsize=1000, ttl=self._refresh_period)
@@ -41,6 +41,9 @@ class RangeStabilityFilter(IPairList):
if self._days > candle_limit:
raise OperationalException("RangeStabilityFilter requires lookback_days to not "
f"exceed exchange max request size ({candle_limit})")
if self._sort_direction not in [None, 'asc', 'desc']:
raise OperationalException("RangeStabilityFilter requires sort_direction to be "
"either None (undefined), 'asc' or 'desc'")
@property
def needstickers(self) -> bool:
@@ -87,6 +90,13 @@ class RangeStabilityFilter(IPairList):
"description": "Maximum Rate of Change",
"help": "Maximum rate of change to filter pairs.",
},
"sort_direction": {
"type": "option",
"default": None,
"options": ["", "asc", "desc"],
"description": "Sort pairlist",
"help": "Sort Pairlist ascending or descending by rate of change.",
},
**IPairList.refresh_period_parameter()
}
@@ -103,45 +113,62 @@ class RangeStabilityFilter(IPairList):
since_ms = dt_ts(dt_floor_day(dt_now()) - timedelta(days=self._days + 1))
candles = self._exchange.refresh_ohlcv_with_cache(needed_pairs, since_ms=since_ms)
if self._enabled:
for p in deepcopy(pairlist):
daily_candles = candles[(p, '1d', self._def_candletype)] if (
p, '1d', self._def_candletype) in candles else None
if not self._validate_pair_loc(p, daily_candles):
pairlist.remove(p)
return pairlist
resulting_pairlist: List[str] = []
pct_changes: Dict[str, float] = {}
def _validate_pair_loc(self, pair: str, daily_candles: Optional[DataFrame]) -> bool:
"""
Validate trading range
:param pair: Pair that's currently validated
:param daily_candles: Downloaded daily candles
:return: True if the pair can stay, false if it should be removed
"""
for p in pairlist:
daily_candles = candles.get((p, '1d', self._def_candletype), None)
pct_change = self._calculate_rate_of_change(p, daily_candles)
if pct_change is not None:
if self._validate_pair_loc(p, pct_change):
resulting_pairlist.append(p)
pct_changes[p] = pct_change
else:
self.log_once(f"Removed {p} from whitelist, no candles found.", logger.info)
if self._sort_direction:
resulting_pairlist = sorted(resulting_pairlist,
key=lambda p: pct_changes[p],
reverse=self._sort_direction == 'desc')
return resulting_pairlist
def _calculate_rate_of_change(self, pair: str, daily_candles: DataFrame) -> Optional[float]:
# Check symbol in cache
if (cached_res := self._pair_cache.get(pair, None)) is not None:
return cached_res
result = True
if (pct_change := self._pair_cache.get(pair, None)) is not None:
return pct_change
if daily_candles is not None and not daily_candles.empty:
highest_high = daily_candles['high'].max()
lowest_low = daily_candles['low'].min()
pct_change = ((highest_high - lowest_low) / lowest_low) if lowest_low > 0 else 0
if pct_change < self._min_rate_of_change:
self.log_once(f"Removed {pair} from whitelist, because rate of change "
f"over {self._days} {plural(self._days, 'day')} is {pct_change:.3f}, "
f"which is below the threshold of {self._min_rate_of_change}.",
logger.info)
result = False
if self._max_rate_of_change:
if pct_change > self._max_rate_of_change:
self.log_once(
f"Removed {pair} from whitelist, because rate of change "
f"over {self._days} {plural(self._days, 'day')} is {pct_change:.3f}, "
f"which is above the threshold of {self._max_rate_of_change}.",
logger.info)
result = False
self._pair_cache[pair] = result
self._pair_cache[pair] = pct_change
return pct_change
else:
self.log_once(f"Removed {pair} from whitelist, no candles found.", logger.info)
return None
def _validate_pair_loc(self, pair: str, pct_change: float) -> bool:
"""
Validate trading range
:param pair: Pair that's currently validated
:param pct_change: Rate of change
:return: True if the pair can stay, false if it should be removed
"""
result = True
if pct_change < self._min_rate_of_change:
self.log_once(f"Removed {pair} from whitelist, because rate of change "
f"over {self._days} {plural(self._days, 'day')} is {pct_change:.3f}, "
f"which is below the threshold of {self._min_rate_of_change}.",
logger.info)
result = False
if self._max_rate_of_change:
if pct_change > self._max_rate_of_change:
self.log_once(
f"Removed {pair} from whitelist, because rate of change "
f"over {self._days} {plural(self._days, 'day')} is {pct_change:.3f}, "
f"which is above the threshold of {self._max_rate_of_change}.",
logger.info)
result = False
return result
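Likewise for `RangeStabilityFilter`: the rate of change is the high-low range of the lookback window relative to the lowest low, and the new `sort_direction` simply orders the surviving pairs by that value. A sketch on hypothetical candles:

```python
import pandas as pd

# Hypothetical daily candles per pair.
candles = {
    "ETH/BTC": pd.DataFrame({"high": [10.5, 11.0, 10.8], "low": [9.8, 10.1, 10.2]}),
    "XRP/BTC": pd.DataFrame({"high": [1.2, 1.21, 1.22], "low": [1.19, 1.2, 1.21]}),
}


def rate_of_change(daily_candles: pd.DataFrame) -> float:
    highest_high = daily_candles["high"].max()
    lowest_low = daily_candles["low"].min()
    return (highest_high - lowest_low) / lowest_low if lowest_low > 0 else 0


pct_changes = {p: rate_of_change(df) for p, df in candles.items()}
# sort_direction == 'desc' puts the widest-ranging pairs first
result = sorted(pct_changes, key=lambda p: pct_changes[p], reverse=True)
print(result)  # ['ETH/BTC', 'XRP/BTC']
```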

View File

@@ -1155,7 +1155,7 @@ class RPC:
}
if has_content:
dataframe.loc[:, '__date_ts'] = dataframe.loc[:, 'date'].view(int64) // 1000 // 1000
dataframe.loc[:, '__date_ts'] = dataframe.loc[:, 'date'].astype(int64) // 1000 // 1000
# Move signal close to separate column when signal for easy plotting
for sig_type in signals.keys():
if sig_type in dataframe.columns:

View File

@@ -7,10 +7,10 @@
-r docs/requirements-docs.txt
coveralls==3.3.1
ruff==0.2.2
ruff==0.3.0
mypy==1.8.0
pre-commit==3.6.2
pytest==8.0.2
pytest==8.1.0
pytest-asyncio==0.23.5
pytest-cov==4.1.0
pytest-mock==3.12.0
@@ -18,7 +18,7 @@ pytest-random-order==1.1.1
pytest-xdist==3.5.0
isort==5.13.2
# For datetime mocking
time-machine==2.13.0
time-machine==2.14.0
# Convert jupyter notebooks to markdown documents
nbconvert==7.16.1

View File

@@ -1,8 +1,8 @@
numpy==1.26.4
pandas==2.1.4
pandas==2.2.1
pandas-ta==0.3.14b
ccxt==4.2.51
ccxt==4.2.58
cryptography==42.0.5
aiohttp==3.9.3
SQLAlchemy==2.0.27
@@ -10,7 +10,7 @@ python-telegram-bot==20.8
# can't be hard-pinned due to telegram-bot pinning httpx with ~
httpx>=0.24.1
arrow==1.3.0
cachetools==5.3.2
cachetools==5.3.3
requests==2.31.0
urllib3==2.2.1
jsonschema==4.21.1
@@ -21,14 +21,14 @@ pycoingecko==3.1.0
jinja2==3.1.3
tables==3.9.1
joblib==1.3.2
rich==13.7.0
rich==13.7.1
pyarrow==15.0.0; platform_machine != 'armv7l'
# find first, C search in arrays
py_find_1st==1.1.6
# Load ticker files 30% faster
python-rapidjson==1.14
python-rapidjson==1.16
# Properly format api responses
orjson==3.9.15
@@ -37,7 +37,7 @@ sdnotify==0.3.2
# API Server
fastapi==0.110.0
pydantic==2.6.2
pydantic==2.6.3
uvicorn==0.27.1
pyjwt==2.8.0
aiofiles==23.2.1
@@ -49,7 +49,7 @@ colorama==0.4.6
questionary==2.0.1
prompt-toolkit==3.0.36
# Extensions to datetime library
python-dateutil==2.8.2
python-dateutil==2.9.0.post0
pytz==2024.1
#Futures

View File

@@ -820,12 +820,6 @@ def test_download_data_trades(mocker):
"--trading-mode", "futures",
"--dl-trades"
]
pargs = get_args(args)
pargs['config'] = None
start_download_data(pargs)
assert dl_mock.call_args[1]['timerange'].starttype == "date"
assert dl_mock.call_count == 2
assert convert_mock.call_count == 2
def test_download_data_data_invalid(mocker):
@@ -843,10 +837,11 @@ def test_download_data_data_invalid(mocker):
start_download_data(pargs)
def test_start_convert_trades(mocker, caplog):
def test_start_convert_trades(mocker):
convert_mock = mocker.patch('freqtrade.commands.data_commands.convert_trades_to_ohlcv',
MagicMock(return_value=[]))
patch_exchange(mocker)
mocker.patch(f'{EXMS}.get_markets')
mocker.patch(f'{EXMS}.markets', PropertyMock(return_value={}))
args = [
"trades-to-ohlcv",

View File

@@ -142,8 +142,8 @@ def generate_trades_history(n_rows, start_date: Optional[datetime] = None, days=
return df
def generate_test_data(timeframe: str, size: int, start: str = '2020-07-05'):
np.random.seed(42)
def generate_test_data(timeframe: str, size: int, start: str = '2020-07-05', random_seed=42):
np.random.seed(random_seed)
base = np.random.normal(20, 2, size=size)
if timeframe == '1y':
@@ -174,10 +174,10 @@ def generate_test_data(timeframe: str, size: int, start: str = '2020-07-05'):
return df
def generate_test_data_raw(timeframe: str, size: int, start: str = '2020-07-05'):
def generate_test_data_raw(timeframe: str, size: int, start: str = '2020-07-05', random_seed=42):
""" Generates data in the ohlcv format used by ccxt """
df = generate_test_data(timeframe, size, start)
df['date'] = df.loc[:, 'date'].view(np.int64) // 1000 // 1000
df = generate_test_data(timeframe, size, start, random_seed)
df['date'] = df.loc[:, 'date'].astype(np.int64) // 1000 // 1000
return list(list(x) for x in zip(*(df[x].values.tolist() for x in df.columns)))
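The helper's final line transposes the dataframe's columns into ccxt-style row lists; a tiny illustration with a hypothetical frame:

```python
import pandas as pd

# Each column is turned into a list, then zip(*) regroups them row-wise,
# producing [date, open, close, ...] lists as ccxt returns them.
df = pd.DataFrame({"date": [1, 2], "open": [10.0, 11.0], "close": [10.5, 11.2]})
rows = list(list(x) for x in zip(*(df[c].values.tolist() for c in df.columns)))
print(rows)  # [[1, 10.0, 10.5], [2, 11.0, 11.2]]
```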

View File

@@ -542,7 +542,9 @@ def test_convert_trades_to_ohlcv(testdatadir, tmp_path, caplog):
convert_trades_to_ohlcv([pair], timeframes=['1m', '5m'],
data_format_trades='jsongz',
datadir=tmp_path, timerange=tr, erase=True)
datadir=tmp_path, timerange=tr, erase=True,
data_format_ohlcv='feather',
candle_type=CandleType.SPOT)
assert log_has("Deleting existing data for pair XRP/ETH, interval 1m.", caplog)
# Load new data
@@ -556,5 +558,7 @@ def test_convert_trades_to_ohlcv(testdatadir, tmp_path, caplog):
convert_trades_to_ohlcv(['NoDatapair'], timeframes=['1m', '5m'],
data_format_trades='jsongz',
datadir=tmp_path, timerange=tr, erase=True)
datadir=tmp_path, timerange=tr, erase=True,
data_format_ohlcv='feather',
candle_type=CandleType.SPOT)
assert log_has(msg, caplog)

View File

@@ -261,11 +261,11 @@ def test_datahandler_trades_not_supported(datahandler, testdatadir, ):
def test_jsondatahandler_trades_load(testdatadir, caplog):
dh = JsonGzDataHandler(testdatadir)
logmsg = "Old trades format detected - converting"
dh.trades_load('XRP/ETH')
dh.trades_load('XRP/ETH', TradingMode.SPOT)
assert not log_has(logmsg, caplog)
# Test conversion is happening
dh.trades_load('XRP/OLD')
dh.trades_load('XRP/OLD', TradingMode.SPOT)
assert log_has(logmsg, caplog)
@@ -300,16 +300,16 @@ def test_datahandler_trades_get_pairs(testdatadir, datahandler, expected):
def test_hdf5datahandler_trades_load(testdatadir):
dh = get_datahandler(testdatadir, 'hdf5')
trades = dh.trades_load('XRP/ETH')
trades = dh.trades_load('XRP/ETH', TradingMode.SPOT)
assert isinstance(trades, DataFrame)
trades1 = dh.trades_load('UNITTEST/NONEXIST')
trades1 = dh.trades_load('UNITTEST/NONEXIST', TradingMode.SPOT)
assert isinstance(trades1, DataFrame)
assert trades1.empty
# data goes from 2019-10-11 - 2019-10-13
timerange = TimeRange.parse_timerange('20191011-20191012')
trades2 = dh._trades_load('XRP/ETH', timerange)
trades2 = dh._trades_load('XRP/ETH', TradingMode.SPOT, timerange)
assert len(trades) > len(trades2)
# Check that ID is None (If it's nan, it's wrong)
assert trades2.iloc[0]['type'] is None
@@ -451,13 +451,13 @@ def test_hdf5datahandler_ohlcv_purge(mocker, testdatadir):
@pytest.mark.parametrize('datahandler', ['jsongz', 'hdf5', 'feather', 'parquet'])
def test_datahandler_trades_load(testdatadir, datahandler):
dh = get_datahandler(testdatadir, datahandler)
trades = dh.trades_load('XRP/ETH')
trades = dh.trades_load('XRP/ETH', TradingMode.SPOT)
assert isinstance(trades, DataFrame)
assert trades.iloc[0]['timestamp'] == 1570752011620
assert trades.iloc[0]['date'] == Timestamp('2019-10-11 00:00:11.620000+0000')
assert trades.iloc[-1]['cost'] == 0.1986231
trades1 = dh.trades_load('UNITTEST/NONEXIST')
trades1 = dh.trades_load('UNITTEST/NONEXIST', TradingMode.SPOT)
assert isinstance(trades, DataFrame)
assert trades1.empty
@@ -465,15 +465,15 @@ def test_datahandler_trades_load(testdatadir, datahandler):
@pytest.mark.parametrize('datahandler', ['jsongz', 'hdf5', 'feather', 'parquet'])
def test_datahandler_trades_store(testdatadir, tmp_path, datahandler):
dh = get_datahandler(testdatadir, datahandler)
trades = dh.trades_load('XRP/ETH')
trades = dh.trades_load('XRP/ETH', TradingMode.SPOT)
dh1 = get_datahandler(tmp_path, datahandler)
dh1.trades_store('XRP/NEW', trades)
dh1.trades_store('XRP/NEW', trades, TradingMode.SPOT)
file = tmp_path / f'XRP_NEW-trades.{dh1._get_file_extension()}'
assert file.is_file()
# Load trades back
trades_new = dh1.trades_load('XRP/NEW')
trades_new = dh1.trades_load('XRP/NEW', TradingMode.SPOT)
assert_frame_equal(trades, trades_new, check_exact=True)
assert len(trades_new) == len(trades)
@@ -483,11 +483,11 @@ def test_datahandler_trades_purge(mocker, testdatadir, datahandler):
mocker.patch.object(Path, "exists", MagicMock(return_value=False))
unlinkmock = mocker.patch.object(Path, "unlink", MagicMock())
dh = get_datahandler(testdatadir, datahandler)
assert not dh.trades_purge('UNITTEST/NONEXIST')
assert not dh.trades_purge('UNITTEST/NONEXIST', TradingMode.SPOT)
assert unlinkmock.call_count == 0
mocker.patch.object(Path, "exists", MagicMock(return_value=True))
assert dh.trades_purge('UNITTEST/NONEXIST')
assert dh.trades_purge('UNITTEST/NONEXIST', TradingMode.SPOT)
assert unlinkmock.call_count == 1

View File

@@ -78,11 +78,6 @@ def test_download_data_main_trades(mocker):
"trading_mode": "futures",
})
download_data_main(config)
assert dl_mock.call_args[1]['timerange'].starttype == "date"
assert dl_mock.call_count == 2
assert convert_mock.call_count == 2
def test_download_data_main_data_invalid(mocker):
patch_exchange(mocker, id="kraken")

View File

@@ -23,7 +23,7 @@ from freqtrade.data.history.history_utils import (_download_pair_history, _downl
validate_backtest_data)
from freqtrade.data.history.idatahandler import get_datahandler
from freqtrade.data.history.jsondatahandler import JsonDataHandler, JsonGzDataHandler
from freqtrade.enums import CandleType
from freqtrade.enums import CandleType, TradingMode
from freqtrade.exchange import timeframe_to_minutes
from freqtrade.misc import file_dump_json
from freqtrade.resolvers import StrategyResolver
@@ -168,20 +168,21 @@ def test_json_pair_data_filename(pair, timeframe, expected_result, candle_type):
assert fn == Path(expected_result + '.gz')
@pytest.mark.parametrize("pair,expected_result", [
("ETH/BTC", 'freqtrade/hello/world/ETH_BTC-trades.json'),
("Fabric Token/ETH", 'freqtrade/hello/world/Fabric_Token_ETH-trades.json'),
("ETHH20", 'freqtrade/hello/world/ETHH20-trades.json'),
(".XBTBON2H", 'freqtrade/hello/world/_XBTBON2H-trades.json'),
("ETHUSD.d", 'freqtrade/hello/world/ETHUSD_d-trades.json'),
("ACC_OLD_BTC", 'freqtrade/hello/world/ACC_OLD_BTC-trades.json'),
@pytest.mark.parametrize("pair,trading_mode,expected_result", [
("ETH/BTC", '', 'freqtrade/hello/world/ETH_BTC-trades.json'),
("ETH/USDT:USDT", 'futures', 'freqtrade/hello/world/futures/ETH_USDT_USDT-trades.json'),
("Fabric Token/ETH", '', 'freqtrade/hello/world/Fabric_Token_ETH-trades.json'),
("ETHH20", '', 'freqtrade/hello/world/ETHH20-trades.json'),
(".XBTBON2H", '', 'freqtrade/hello/world/_XBTBON2H-trades.json'),
("ETHUSD.d", '', 'freqtrade/hello/world/ETHUSD_d-trades.json'),
("ACC_OLD_BTC", '', 'freqtrade/hello/world/ACC_OLD_BTC-trades.json'),
])
def test_json_pair_trades_filename(pair, expected_result):
fn = JsonDataHandler._pair_trades_filename(Path('freqtrade/hello/world'), pair)
def test_json_pair_trades_filename(pair, trading_mode, expected_result):
fn = JsonDataHandler._pair_trades_filename(Path('freqtrade/hello/world'), pair, trading_mode)
assert isinstance(fn, Path)
assert fn == Path(expected_result)
fn = JsonGzDataHandler._pair_trades_filename(Path('freqtrade/hello/world'), pair)
fn = JsonGzDataHandler._pair_trades_filename(Path('freqtrade/hello/world'), pair, trading_mode)
assert isinstance(fn, Path)
assert fn == Path(expected_result + '.gz')
@@ -559,7 +560,8 @@ def test_refresh_backtest_trades_data(mocker, default_conf, markets, caplog, tes
unavailable_pairs = refresh_backtest_trades_data(exchange=ex,
pairs=["ETH/BTC", "XRP/BTC", "XRP/ETH"],
datadir=testdatadir,
timerange=timerange, erase=True
timerange=timerange, erase=True,
trading_mode=TradingMode.SPOT,
)
assert dl_mock.call_count == 2
@@ -584,7 +586,7 @@ def test_download_trades_history(trades_history, mocker, default_conf, testdatad
assert not file1.is_file()
assert _download_trades_history(data_handler=data_handler, exchange=exchange,
pair='ETH/BTC')
pair='ETH/BTC', trading_mode=TradingMode.SPOT)
assert log_has("Current Amount of trades: 0", caplog)
assert log_has("New Amount of trades: 6", caplog)
assert ght_mock.call_count == 1
@@ -597,8 +599,9 @@ def test_download_trades_history(trades_history, mocker, default_conf, testdatad
since_time = int(trades_history[-3][0] // 1000)
since_time2 = int(trades_history[-1][0] // 1000)
timerange = TimeRange('date', None, since_time, 0)
assert _download_trades_history(data_handler=data_handler, exchange=exchange,
pair='ETH/BTC', timerange=timerange)
assert _download_trades_history(
data_handler=data_handler, exchange=exchange, pair='ETH/BTC',
timerange=timerange, trading_mode=TradingMode.SPOT)
assert ght_mock.call_count == 1
# Check this in seconds - since we had to convert to seconds above too.
@@ -611,7 +614,7 @@ def test_download_trades_history(trades_history, mocker, default_conf, testdatad
caplog.clear()
assert not _download_trades_history(data_handler=data_handler, exchange=exchange,
pair='ETH/BTC')
pair='ETH/BTC', trading_mode=TradingMode.SPOT)
assert log_has_re('Failed to download and store historic trades for pair: "ETH/BTC".*', caplog)
file2 = tmp_path / 'XRP_ETH-trades.json.gz'
@@ -623,8 +626,9 @@ def test_download_trades_history(trades_history, mocker, default_conf, testdatad
since_time = int(trades_history[0][0] // 1000) - 500
timerange = TimeRange('date', None, since_time, 0)
assert _download_trades_history(data_handler=data_handler, exchange=exchange,
pair='XRP/ETH', timerange=timerange)
assert _download_trades_history(
data_handler=data_handler, exchange=exchange, pair='XRP/ETH',
timerange=timerange, trading_mode=TradingMode.SPOT)
assert ght_mock.call_count == 1

View File

@@ -6,6 +6,7 @@ import pytest
from freqtrade.data.converter.trade_converter_kraken import import_kraken_trades_from_csv
from freqtrade.data.history.idatahandler import get_datahandler
from freqtrade.enums import TradingMode
from freqtrade.exceptions import OperationalException
from tests.conftest import EXMS, log_has, log_has_re, patch_exchange
@@ -40,7 +41,7 @@ def test_import_kraken_trades_from_csv(testdatadir, tmp_path, caplog, default_co
assert dstfile.is_file()
dh = get_datahandler(tmp_path, 'feather')
trades = dh.trades_load('BCH_EUR')
trades = dh.trades_load('BCH_EUR', TradingMode.SPOT)
assert len(trades) == 340
assert trades['date'].min().to_pydatetime() == datetime(2023, 1, 1, 0, 3, 56,

View File

@@ -7,6 +7,7 @@ from unittest.mock import MagicMock, Mock, PropertyMock, patch
import ccxt
import pytest
from numpy import NaN
from pandas import DataFrame
from freqtrade.enums import CandleType, MarginMode, RunMode, TradingMode
@@ -4203,6 +4204,7 @@ def test_get_max_leverage_from_margin(default_conf, mocker, pair, nominal_value,
(10, 0.0001, 2.0, 1.0, 0.002, 0.002),
(10, 0.0002, 2.0, 0.01, 0.004, 0.00004),
(10, 0.0002, 2.5, None, 0.005, None),
(10, 0.0002, NaN, None, 0.0, None),
])
def test_calculate_funding_fees(
default_conf,
@@ -4312,8 +4314,8 @@ def test_combine_funding_and_mark(
assert len(df) == 1
# Empty funding rates
funding_rates = DataFrame([], columns=['date', 'open'])
df = exchange.combine_funding_and_mark(funding_rates, mark_rates, futures_funding_rate)
funding_rates2 = DataFrame([], columns=['date', 'open'])
df = exchange.combine_funding_and_mark(funding_rates2, mark_rates, futures_funding_rate)
if futures_funding_rate is not None:
assert len(df) == 3
assert df.iloc[0]['open_fund'] == futures_funding_rate
@@ -4322,6 +4324,12 @@ def test_combine_funding_and_mark(
else:
assert len(df) == 0
# Empty mark candles
mark_candles = DataFrame([], columns=['date', 'open'])
df = exchange.combine_funding_and_mark(funding_rates, mark_candles, futures_funding_rate)
assert len(df) == 0
@pytest.mark.parametrize('exchange,rate_start,rate_end,d1,d2,amount,expected_fees', [
('binance', 0, 2, "2021-09-01 01:00:00", "2021-09-01 04:00:00", 30.0, 0.0),

View File

@@ -57,28 +57,30 @@ def test_backtest_position_adjustment(default_conf, fee, mocker, testdatadir) ->
),
'close_date': pd.to_datetime([dt_utc(2018, 1, 29, 22, 00, 0),
dt_utc(2018, 1, 30, 4, 10, 0)], utc=True),
'open_rate': [0.10401764894444211, 0.10302485],
'close_rate': [0.10453904066847439, 0.103541],
'open_rate': [0.10401764891917063, 0.10302485],
'close_rate': [0.10453904064307624, 0.10354126528822055],
'fee_open': [0.0025, 0.0025],
'fee_close': [0.0025, 0.0025],
'trade_duration': [200, 40],
'profit_ratio': [0.0, 0.0],
'profit_abs': [0.0, 0.0],
'exit_reason': [ExitType.ROI.value, ExitType.ROI.value],
'initial_stop_loss_abs': [0.0940005, 0.09272236],
'initial_stop_loss_abs': [0.0940005, 0.092722365],
'initial_stop_loss_ratio': [-0.1, -0.1],
'stop_loss_abs': [0.0940005, 0.09272236],
'stop_loss_abs': [0.0940005, 0.092722365],
'stop_loss_ratio': [-0.1, -0.1],
'min_rate': [0.10370188, 0.10300000000000001],
'max_rate': [0.10481985, 0.1038888],
'max_rate': [0.10481985, 0.10388887000000001],
'is_open': [False, False],
'enter_tag': ['', ''],
'leverage': [1.0, 1.0],
'is_short': [False, False],
'open_timestamp': [1517251200000, 1517283000000],
'close_timestamp': [1517265300000, 1517285400000],
'close_timestamp': [1517263200000, 1517285400000],
})
pd.testing.assert_frame_equal(results.drop(columns=['orders']), expected)
results_no = results.drop(columns=['orders'])
pd.testing.assert_frame_equal(results_no, expected, check_exact=True)
data_pair = processed[pair]
assert len(results.iloc[0]['orders']) == 6
assert len(results.iloc[1]['orders']) == 2

View File

@@ -498,7 +498,7 @@ def test__get_resample_from_period():
assert _get_resample_from_period('day') == '1d'
assert _get_resample_from_period('week') == '1W-MON'
assert _get_resample_from_period('month') == '1M'
assert _get_resample_from_period('month') == '1ME'
with pytest.raises(ValueError, match=r"Period noooo is not supported."):
_get_resample_from_period('noooo')
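The `'1M'` to `'1ME'` change tracks pandas 2.2, which made `"ME"` (month-end) the preferred offset alias and deprecated `"M"`. A minimal sketch of a period-to-alias mapping in the spirit of `_get_resample_from_period` (the function body here is illustrative, not freqtrade's implementation):

```python
def get_resample_from_period(period: str) -> str:
    """Map a human-readable period name to a pandas resample rule."""
    mapping = {
        'day': '1d',
        'week': '1W-MON',  # weekly buckets anchored on Monday
        'month': '1ME',    # pandas >= 2.2 month-end alias (previously "1M")
    }
    if period not in mapping:
        raise ValueError(f"Period {period} is not supported.")
    return mapping[period]
```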

View File

@@ -19,7 +19,7 @@ from freqtrade.plugins.pairlist.pairlist_helpers import dynamic_expand_pairlist,
from freqtrade.plugins.pairlistmanager import PairListManager
from freqtrade.resolvers import PairListResolver
from freqtrade.util.datetime_helpers import dt_now
- from tests.conftest import (EXMS, create_mock_trades_usdt, get_patched_exchange,
+ from tests.conftest import (EXMS, create_mock_trades_usdt, generate_test_data, get_patched_exchange,
get_patched_freqtradebot, log_has, log_has_re, num_log_has)
@@ -748,6 +748,104 @@ def test_PerformanceFilter_error(mocker, whitelist_conf, caplog) -> None:
assert log_has("PerformanceFilter is not available in this mode.", caplog)
def test_VolatilityFilter_error(mocker, whitelist_conf) -> None:
volatility_filter = {"method": "VolatilityFilter", "lookback_days": -1}
whitelist_conf['pairlists'] = [{"method": "StaticPairList"}, volatility_filter]
mocker.patch(f'{EXMS}.exchange_has', MagicMock(return_value=True))
exchange_mock = MagicMock()
exchange_mock.ohlcv_candle_limit = MagicMock(return_value=1000)
with pytest.raises(OperationalException,
match=r"VolatilityFilter requires lookback_days to be >= 1*"):
PairListManager(exchange_mock, whitelist_conf, MagicMock())
volatility_filter = {"method": "VolatilityFilter", "lookback_days": 2000}
whitelist_conf['pairlists'] = [{"method": "StaticPairList"}, volatility_filter]
with pytest.raises(OperationalException,
match=r"VolatilityFilter requires lookback_days to not exceed exchange max"):
PairListManager(exchange_mock, whitelist_conf, MagicMock())
volatility_filter = {"method": "VolatilityFilter", "sort_direction": "Random"}
whitelist_conf['pairlists'] = [{"method": "StaticPairList"}, volatility_filter]
with pytest.raises(OperationalException,
match=r"VolatilityFilter requires sort_direction to be either "
r"None .*'asc'.*'desc'"):
PairListManager(exchange_mock, whitelist_conf, MagicMock())
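The three `pytest.raises` cases above all exercise config validation at pairlist construction time. A hedged sketch of the `sort_direction` check they target (class and message are illustrative of the pattern, not copied from freqtrade's source):

```python
class OperationalException(Exception):
    """Stand-in for freqtrade's configuration error type."""


def validate_sort_direction(sort_direction):
    # Only None (no sorting), 'asc' or 'desc' are acceptable values.
    if sort_direction not in (None, 'asc', 'desc'):
        raise OperationalException(
            "VolatilityFilter requires sort_direction to be either "
            f"None, 'asc' or 'desc' - got '{sort_direction}'.")
    return sort_direction
```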
@pytest.mark.parametrize('pairlist,expected_pairlist', [
({"method": "VolatilityFilter", "sort_direction": "asc"},
['XRP/BTC', 'ETH/BTC', 'LTC/BTC', 'TKN/BTC']),
({"method": "VolatilityFilter", "sort_direction": "desc"},
['TKN/BTC', 'LTC/BTC', 'ETH/BTC', 'XRP/BTC']),
({"method": "VolatilityFilter", "sort_direction": "desc", 'min_volatility': 0.4},
['TKN/BTC', 'LTC/BTC', 'ETH/BTC']),
({"method": "VolatilityFilter", "sort_direction": "asc", 'min_volatility': 0.4},
['ETH/BTC', 'LTC/BTC', 'TKN/BTC']),
({"method": "VolatilityFilter", "sort_direction": "desc", 'max_volatility': 0.5},
['LTC/BTC', 'ETH/BTC', 'XRP/BTC']),
({"method": "VolatilityFilter", "sort_direction": "asc", 'max_volatility': 0.5},
['XRP/BTC', 'ETH/BTC', 'LTC/BTC']),
({"method": "RangeStabilityFilter", "sort_direction": "asc"},
['ETH/BTC', 'XRP/BTC', 'LTC/BTC', 'TKN/BTC']),
({"method": "RangeStabilityFilter", "sort_direction": "desc"},
['TKN/BTC', 'LTC/BTC', 'XRP/BTC', 'ETH/BTC']),
({"method": "RangeStabilityFilter", "sort_direction": "asc", 'min_rate_of_change': 0.4},
['XRP/BTC', 'LTC/BTC', 'TKN/BTC']),
({"method": "RangeStabilityFilter", "sort_direction": "desc", 'min_rate_of_change': 0.4},
['TKN/BTC', 'LTC/BTC', 'XRP/BTC']),
])
def test_VolatilityFilter_RangeStabilityFilter_sort(
mocker, whitelist_conf, tickers, time_machine, pairlist, expected_pairlist) -> None:
whitelist_conf['pairlists'] = [
{'method': 'VolumePairList', 'number_assets': 10},
pairlist
]
df1 = generate_test_data('1d', 10, '2022-01-05 00:00:00+00:00', random_seed=42)
df2 = generate_test_data('1d', 10, '2022-01-05 00:00:00+00:00', random_seed=2)
df3 = generate_test_data('1d', 10, '2022-01-05 00:00:00+00:00', random_seed=3)
df4 = generate_test_data('1d', 10, '2022-01-05 00:00:00+00:00', random_seed=4)
df5 = generate_test_data('1d', 10, '2022-01-05 00:00:00+00:00', random_seed=5)
df6 = generate_test_data('1d', 10, '2022-01-05 00:00:00+00:00', random_seed=6)
assert not df1.equals(df2)
time_machine.move_to('2022-01-15 00:00:00+00:00')
ohlcv_data = {
('ETH/BTC', '1d', CandleType.SPOT): df1,
('TKN/BTC', '1d', CandleType.SPOT): df2,
('LTC/BTC', '1d', CandleType.SPOT): df3,
('XRP/BTC', '1d', CandleType.SPOT): df4,
('HOT/BTC', '1d', CandleType.SPOT): df5,
('BLK/BTC', '1d', CandleType.SPOT): df6,
}
ohlcv_mock = MagicMock(return_value=ohlcv_data)
mocker.patch.multiple(
EXMS,
exchange_has=MagicMock(return_value=True),
refresh_latest_ohlcv=ohlcv_mock,
get_tickers=tickers
)
exchange = get_patched_exchange(mocker, whitelist_conf)
exchange.ohlcv_candle_limit = MagicMock(return_value=1000)
plm = PairListManager(exchange, whitelist_conf, MagicMock())
assert exchange.ohlcv_candle_limit.call_count == 2
plm.refresh_pairlist()
assert ohlcv_mock.call_count == 1
assert exchange.ohlcv_candle_limit.call_count == 2
assert plm.whitelist == expected_pairlist
plm.refresh_pairlist()
assert exchange.ohlcv_candle_limit.call_count == 2
assert ohlcv_mock.call_count == 1
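For context, a hypothetical `pairlists` configuration like the one this test builds, showing where the new `sort_direction` option sits (values mirror the documented example and are not a recommended setup):

```python
# Hypothetical freqtrade pairlists config enabling descending volatility sort.
pairlists = [
    {"method": "VolumePairList", "number_assets": 10},
    {
        "method": "VolatilityFilter",
        "lookback_days": 10,
        "min_volatility": 0.05,
        "max_volatility": 0.50,
        "refresh_period": 86400,
        "sort_direction": "desc",  # None (default, no sorting), "asc" or "desc"
    },
]
```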
def test_ShuffleFilter_init(mocker, whitelist_conf, caplog) -> None:
whitelist_conf['pairlists'] = [
{"method": "StaticPairList"},
@@ -1095,6 +1193,13 @@ def test_rangestabilityfilter_checks(mocker, default_conf, markets, tickers):
match='RangeStabilityFilter requires lookback_days to be >= 1'):
get_patched_freqtradebot(mocker, default_conf)
default_conf['pairlists'] = [{'method': 'VolumePairList', 'number_assets': 10},
{'method': 'RangeStabilityFilter', 'sort_direction': 'something'}]
with pytest.raises(OperationalException,
match='RangeStabilityFilter requires sort_direction to be either None.*'):
get_patched_freqtradebot(mocker, default_conf)
@pytest.mark.parametrize('min_rate_of_change,max_rate_of_change,expected_length', [
(0.01, 0.99, 5),

View File

@@ -1022,22 +1022,22 @@ def test_auto_hyperopt_interface_loadparams(default_conf, mocker, caplog):
 @pytest.mark.parametrize('function,raises', [
-    ('populate_entry_trend', True),
+    ('populate_entry_trend', False),
     ('advise_entry', False),
-    ('populate_exit_trend', True),
+    ('populate_exit_trend', False),
     ('advise_exit', False),
 ])
-def test_pandas_warning_direct(ohlcv_history, function, raises):
+def test_pandas_warning_direct(ohlcv_history, function, raises, recwarn):
     df = _STRATEGY.populate_indicators(ohlcv_history, {'pair': 'ETH/BTC'})
     if raises:
-        with pytest.warns(FutureWarning):
-            # Test for Future warning
-            # FutureWarning: Setting an item of incompatible dtype is
-            # deprecated and will raise in a future error of pandas
-            # https://github.com/pandas-dev/pandas/issues/56503
-            getattr(_STRATEGY, function)(df, {'pair': 'ETH/BTC'})
+        assert len(recwarn) == 1
+        # https://github.com/pandas-dev/pandas/issues/56503
+        # Fixed in 2.2.x
+        getattr(_STRATEGY, function)(df, {'pair': 'ETH/BTC'})
     else:
+        assert len(recwarn) == 0
         getattr(_STRATEGY, function)(df, {'pair': 'ETH/BTC'})
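The hunk above swaps `pytest.warns` (assert a warning *is* raised) for the `recwarn` fixture (record everything and count). A stdlib-only sketch of the same record-and-count idea, with illustrative helper names:

```python
import warnings


def emits_future_warning():
    warnings.warn("will change in a future pandas release", FutureWarning)


def capture_warnings(func):
    """Run func and return the list of warnings it emitted."""
    with warnings.catch_warnings(record=True) as recorded:
        warnings.simplefilter("always")
        func()
    return recorded
```

With this helper, "the code warns" becomes `len(capture_warnings(f)) == 1` and "the code is clean" becomes `capture_warnings(f) == []`, mirroring the two branches of the test.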