From: Jérôme Benoit Date: Sun, 4 Jan 2026 23:02:21 +0000 (+0100) Subject: feat(quickadapter): add combined extrema weighting strategy with multi-metric aggregation X-Git-Url: https://git.piment-noir.org/?a=commitdiff_plain;h=292783a5f9179f129fa464fca8958c9df547c16d;p=freqai-strategies.git feat(quickadapter): add combined extrema weighting strategy with multi-metric aggregation Add new 'combined' strategy to extrema weighting that aggregates multiple metrics (amplitude, amplitude_threshold_ratio, volume_rate, speed, efficiency_ratio, volume_weighted_efficiency_ratio) using configurable coefficients and aggregation methods. Features: - New strategy type 'combined' with per-metric coefficient weighting - Support for weighted_average and geometric_mean aggregation methods - Normalize all metrics to [0,1] range for consistent aggregation: * amplitude: x/(1+x) * amplitude_threshold_ratio: x/(x+median) * volume_rate: x/(x+median) * speed: x/(1+x) - Deterministic metric iteration order via COMBINED_METRICS constant - Centralized validation in get_extrema_weighting_config() - Comprehensive logging of new parameters Configuration: - metric_coefficients: dict mapping metric names to positive weights - aggregation: 'weighted_average' (default) or 'geometric_mean' - Empty coefficients dict defaults to equal weights (1.0) for all metrics Documentation: - README updated with new strategy and parameters - Mathematical formulas for aggregation methods - Style aligned with existing documentation conventions Bump version: 3.10.1 -> 3.10.2 --- diff --git a/README.md b/README.md index e29e0dc..c6fb8c5 100644 --- a/README.md +++ b/README.md @@ -37,85 +37,87 @@ docker compose up -d --build ### Configuration tunables -| Path | Default | Type / Range | Description | -| -------------------------------------------------------------- | ----------------------------- | 
--------------------------------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -| _Protections_ | | | | -| custom_protections.trade_duration_candles | 72 | int >= 1 | Estimated trade duration in candles. Scales protections stop duration candles and trade limit. | -| custom_protections.lookback_period_fraction | 0.5 | float (0,1] | Fraction of `fit_live_predictions_candles` used to calculate `lookback_period_candles` for _MaxDrawdown_ and _StoplossGuard_ protections. | -| custom_protections.cooldown.enabled | true | bool | Enable/disable _CooldownPeriod_ protection. | -| custom_protections.cooldown.stop_duration_candles | 4 | int >= 1 | Number of candles to wait before allowing new trades after a trade is closed. | -| custom_protections.drawdown.enabled | true | bool | Enable/disable _MaxDrawdown_ protection. | -| custom_protections.drawdown.max_allowed_drawdown | 0.2 | float (0,1) | Maximum allowed drawdown. | -| custom_protections.stoploss.enabled | true | bool | Enable/disable _StoplossGuard_ protection. | -| _Leverage_ | | | | -| leverage | `proposed_leverage` | float [1.0, max_leverage] | Leverage. Fallback to `proposed_leverage` for the pair. | -| _Exit pricing_ | | | | -| exit_pricing.trade_price_target_method | `moving_average` | enum {`moving_average`,`quantile_interpolation`,`weighted_average`} | Trade NATR computation method. (Deprecated alias: `exit_pricing.trade_price_target`) | -| exit_pricing.thresholds_calibration.decline_quantile | 0.75 | float (0,1) | PnL decline quantile threshold. 
| -| _Reversal confirmation_ | | | | -| reversal_confirmation.lookback_period_candles | 0 | int >= 0 | Prior confirming candles; 0 = none. (Deprecated alias: `reversal_confirmation.lookback_period`) | -| reversal_confirmation.decay_fraction | 0.5 | float (0,1] | Geometric per-candle volatility adjusted reversal threshold relaxation factor. (Deprecated alias: `reversal_confirmation.decay_ratio`) | -| reversal_confirmation.min_natr_multiplier_fraction | 0.0095 | float [0,1] | Lower bound fraction for volatility adjusted reversal threshold. (Deprecated alias: `reversal_confirmation.min_natr_ratio_percent`) | -| reversal_confirmation.max_natr_multiplier_fraction | 0.075 | float [0,1] | Upper bound fraction (>= lower bound) for volatility adjusted reversal threshold. (Deprecated alias: `reversal_confirmation.max_natr_ratio_percent`) | -| _Regressor model_ | | | | -| freqai.regressor | `xgboost` | enum {`xgboost`,`lightgbm`,`histgradientboostingregressor`} | Machine learning regressor algorithm. | -| _Extrema smoothing_ | | | | -| freqai.extrema_smoothing.method | `gaussian` | enum {`gaussian`,`kaiser`,`triang`,`smm`,`sma`,`savgol`,`gaussian_filter1d`} | Extrema smoothing method (`smm`=median, `sma`=mean, `savgol`=Savitzky–Golay). | -| freqai.extrema_smoothing.window_candles | 5 | int >= 3 | Smoothing window length (candles). (Deprecated alias: `freqai.extrema_smoothing.window`) | -| freqai.extrema_smoothing.beta | 8.0 | float > 0 | Shape parameter for `kaiser` kernel. | -| freqai.extrema_smoothing.polyorder | 3 | int >= 1 | Polynomial order for `savgol` smoothing. | -| freqai.extrema_smoothing.mode | `mirror` | enum {`mirror`,`constant`,`nearest`,`wrap`,`interp`} | Boundary mode for `savgol` and `gaussian_filter1d`. | -| freqai.extrema_smoothing.sigma | 1.0 | float > 0 | Gaussian `sigma` for `gaussian_filter1d` smoothing. 
| -| _Extrema weighting_ | | | | -| freqai.extrema_weighting.strategy | `none` | enum {`none`,`amplitude`,`amplitude_threshold_ratio`,`volume_rate`,`speed`,`efficiency_ratio`,`volume_weighted_efficiency_ratio`} | Extrema weighting source: unweighted (`none`), swing amplitude (`amplitude`), swing amplitude / median volatility-threshold ratio (`amplitude_threshold_ratio`), swing volume per candle (`volume_rate`), swing speed (`speed`), swing efficiency ratio (`efficiency_ratio`), or swing volume-weighted efficiency ratio (`volume_weighted_efficiency_ratio`). | -| freqai.extrema_weighting.standardization | `none` | enum {`none`,`zscore`,`robust`,`mmad`,`power_yj`} | Standardization method applied to smoothed weighted extrema before normalization. `none`=w, `zscore`=(w-μ)/σ, `robust`=(w-median)/IQR, `mmad`=(w-median)/(MAD·k), `power_yj`=YJ(w). | -| freqai.extrema_weighting.robust_quantiles | [0.25, 0.75] | list[float] where 0 <= Q1 < Q3 <= 1 | Quantile range for robust standardization, Q1 and Q3. | -| freqai.extrema_weighting.mmad_scaling_factor | 1.4826 | float > 0 | Scaling factor for MMAD standardization. | -| freqai.extrema_weighting.normalization | `maxabs` | enum {`maxabs`,`minmax`,`sigmoid`,`none`} | Normalization method applied to smoothed weighted extrema. `maxabs`=w/max(\|w\|), `minmax`=low+(w-min)/(max-min)·(high-low), `sigmoid`=2·σ(scale·w)-1, `none`=w. | -| freqai.extrema_weighting.minmax_range | [-1.0, 1.0] | list[float] | Target range for `minmax` normalization, min and max. | -| freqai.extrema_weighting.sigmoid_scale | 1.0 | float > 0 | Scale parameter for `sigmoid` normalization, controls steepness. | -| freqai.extrema_weighting.gamma | 1.0 | float (0,10] | Contrast exponent applied to smoothed weighted extrema after normalization: >1 emphasizes extrema, values between 0 and 1 soften. | -| _Feature parameters_ | | | | -| freqai.feature_parameters.label_period_candles | min/max midpoint | int >= 1 | Zigzag labeling NATR horizon. 
| -| freqai.feature_parameters.min_label_period_candles | 12 | int >= 1 | Minimum labeling NATR horizon used for reversals labeling HPO. | -| freqai.feature_parameters.max_label_period_candles | 24 | int >= 1 | Maximum labeling NATR horizon used for reversals labeling HPO. | -| freqai.feature_parameters.label_natr_multiplier | min/max midpoint | float > 0 | Zigzag labeling NATR multiplier. (Deprecated alias: `freqai.feature_parameters.label_natr_ratio`) | -| freqai.feature_parameters.min_label_natr_multiplier | 9.0 | float > 0 | Minimum labeling NATR multiplier used for reversals labeling HPO. (Deprecated alias: `freqai.feature_parameters.min_label_natr_ratio`) | -| freqai.feature_parameters.max_label_natr_multiplier | 12.0 | float > 0 | Maximum labeling NATR multiplier used for reversals labeling HPO. (Deprecated alias: `freqai.feature_parameters.max_label_natr_ratio`) | -| freqai.feature_parameters.label_frequency_candles | `auto` | int >= 2 \| `auto` | Reversals labeling frequency. `auto` = max(2, 2 \* number of whitelisted pairs). | -| freqai.feature_parameters.label_weights | [1/7,1/7,1/7,1/7,1/7,1/7,1/7] | list[float] | Per-objective weights used in distance calculations to ideal point. Objectives: (1) number of detected reversals, (2) median swing amplitude, (3) median (swing amplitude / median volatility-threshold ratio), (4) median swing volume per candle, (5) median swing speed, (6) median swing efficiency ratio, (7) median swing volume-weighted efficiency ratio. | -| freqai.feature_parameters.label_p_order | `None` | float \| None | p-order parameter for distance metrics. Used by minkowski (default 2.0) and power_mean (default 1.0). Ignored by other metrics. | -| freqai.feature_parameters.label_method | `compromise_programming` | enum {`compromise_programming`,`topsis`,`kmeans`,`kmeans2`,`kmedoids`,`knn`,`medoid`} | HPO `label` Pareto front trial selection method. 
| -| freqai.feature_parameters.label_distance_metric | `euclidean` | string | Distance metric for `compromise_programming` and `topsis` methods. | -| freqai.feature_parameters.label_cluster_metric | `euclidean` | string | Distance metric for `kmeans`, `kmeans2`, and `kmedoids` methods. | -| freqai.feature_parameters.label_cluster_selection_method | `topsis` | enum {`compromise_programming`,`topsis`} | Cluster selection method for clustering-based label methods. | -| freqai.feature_parameters.label_cluster_trial_selection_method | `topsis` | enum {`compromise_programming`,`topsis`} | Best cluster trial selection method for clustering-based label methods. | -| freqai.feature_parameters.label_density_metric | method-dependent | string | Distance metric for `knn` and `medoid` methods. | -| freqai.feature_parameters.label_density_aggregation | `power_mean` | enum {`power_mean`,`quantile`,`min`,`max`} | Aggregation method for KNN neighbor distances. | -| freqai.feature_parameters.label_density_n_neighbors | 5 | int >= 1 | Number of neighbors for KNN. | -| freqai.feature_parameters.label_density_aggregation_param | aggregation-dependent | float \| None | Tunable for KNN neighbor distance aggregation: p-order (`power_mean`) or quantile value (`quantile`). | -| _Predictions extrema_ | | | | -| freqai.predictions_extrema.selection_method | `rank_extrema` | enum {`rank_extrema`,`rank_peaks`,`partition`} | Extrema selection method. `rank_extrema` ranks extrema values, `rank_peaks` ranks detected peak values, `partition` uses sign-based partitioning. | -| freqai.predictions_extrema.threshold_smoothing_method | `mean` | enum {`mean`,`isodata`,`li`,`minimum`,`otsu`,`triangle`,`yen`,`median`,`soft_extremum`} | Thresholding method for prediction thresholds smoothing. (Deprecated alias: `freqai.predictions_extrema.thresholds_smoothing`) | -| freqai.predictions_extrema.soft_extremum_alpha | 12.0 | float >= 0 | Alpha for `soft_extremum` thresholds smoothing. 
(Deprecated alias: `freqai.predictions_extrema.thresholds_alpha`) | -| freqai.predictions_extrema.outlier_threshold_quantile | 0.999 | float (0,1) | Quantile threshold for predictions outlier filtering. (Deprecated alias: `freqai.predictions_extrema.threshold_outlier`) | -| freqai.predictions_extrema.keep_extrema_fraction | 1.0 | float (0,1] | Fraction of extrema used for thresholds. `1.0` uses all, lower values keep only most significant. Applies to `rank_extrema` and `rank_peaks`; ignored for `partition`. (Deprecated alias: `freqai.predictions_extrema.extrema_fraction`) | -| _Optuna / HPO_ | | | | -| freqai.optuna_hyperopt.enabled | false | bool | Enables HPO. | -| freqai.optuna_hyperopt.sampler | `tpe` | enum {`tpe`,`auto`} | HPO sampler algorithm for `hp` namespace. `tpe` uses [TPESampler](https://optuna.readthedocs.io/en/stable/reference/samplers/generated/optuna.samplers.TPESampler.html) with multivariate and group, `auto` uses [AutoSampler](https://hub.optuna.org/samplers/auto_sampler). | -| freqai.optuna_hyperopt.label_sampler | `auto` | enum {`auto`,`tpe`,`nsgaii`,`nsgaiii`} | HPO sampler algorithm for multi-objective `label` namespace. `nsgaii` uses [NSGAIISampler](https://optuna.readthedocs.io/en/stable/reference/samplers/generated/optuna.samplers.NSGAIISampler.html), `nsgaiii` uses [NSGAIIISampler](https://optuna.readthedocs.io/en/stable/reference/samplers/generated/optuna.samplers.NSGAIIISampler.html). | -| freqai.optuna_hyperopt.storage | `file` | enum {`file`,`sqlite`} | HPO storage backend. | -| freqai.optuna_hyperopt.continuous | true | bool | Continuous HPO. | -| freqai.optuna_hyperopt.warm_start | true | bool | Warm start HPO with previous best value(s). | -| freqai.optuna_hyperopt.n_startup_trials | 15 | int >= 0 | HPO startup trials. | -| freqai.optuna_hyperopt.n_trials | 50 | int >= 1 | Maximum HPO trials. | -| freqai.optuna_hyperopt.n_jobs | CPU threads / 4 | int >= 1 | Parallel HPO workers. 
| -| freqai.optuna_hyperopt.timeout | 7200 | int >= 0 | HPO wall-clock timeout in seconds. | -| freqai.optuna_hyperopt.label_candles_step | 1 | int >= 1 | Step for Zigzag NATR horizon `label` search space. | -| freqai.optuna_hyperopt.space_reduction | false | bool | Enable/disable `hp` search space reduction based on previous best parameters. | -| freqai.optuna_hyperopt.space_fraction | 0.4 | float [0,1] | Fraction of the `hp` search space to use with `space_reduction`. Lower values create narrower search ranges around the best parameters. (Deprecated alias: `freqai.optuna_hyperopt.expansion_ratio`) | -| freqai.optuna_hyperopt.min_resource | 3 | int >= 1 | Minimum resource per [HyperbandPruner](https://optuna.readthedocs.io/en/stable/reference/generated/optuna.pruners.HyperbandPruner.html) rung. | -| freqai.optuna_hyperopt.seed | 1 | int >= 0 | HPO RNG seed. | +| Path | Default | Type / Range | Description | +| -------------------------------------------------------------- | ----------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| _Protections_ | | | | +| custom_protections.trade_duration_candles | 72 | int >= 1 | Estimated trade duration in candles. Scales protections stop duration candles and trade limit. | +| custom_protections.lookback_period_fraction | 0.5 | float (0,1] | Fraction of `fit_live_predictions_candles` used to calculate `lookback_period_candles` for _MaxDrawdown_ and _StoplossGuard_ protections. 
| +| custom_protections.cooldown.enabled | true | bool | Enable/disable _CooldownPeriod_ protection. | +| custom_protections.cooldown.stop_duration_candles | 4 | int >= 1 | Number of candles to wait before allowing new trades after a trade is closed. | +| custom_protections.drawdown.enabled | true | bool | Enable/disable _MaxDrawdown_ protection. | +| custom_protections.drawdown.max_allowed_drawdown | 0.2 | float (0,1) | Maximum allowed drawdown. | +| custom_protections.stoploss.enabled | true | bool | Enable/disable _StoplossGuard_ protection. | +| _Leverage_ | | | | +| leverage | `proposed_leverage` | float [1.0, max_leverage] | Leverage. Fallback to `proposed_leverage` for the pair. | +| _Exit pricing_ | | | | +| exit_pricing.trade_price_target_method | `moving_average` | enum {`moving_average`,`quantile_interpolation`,`weighted_average`} | Trade NATR computation method. (Deprecated alias: `exit_pricing.trade_price_target`) | +| exit_pricing.thresholds_calibration.decline_quantile | 0.75 | float (0,1) | PnL decline quantile threshold. | +| _Reversal confirmation_ | | | | +| reversal_confirmation.lookback_period_candles | 0 | int >= 0 | Prior confirming candles; 0 = none. (Deprecated alias: `reversal_confirmation.lookback_period`) | +| reversal_confirmation.decay_fraction | 0.5 | float (0,1] | Geometric per-candle volatility adjusted reversal threshold relaxation factor. (Deprecated alias: `reversal_confirmation.decay_ratio`) | +| reversal_confirmation.min_natr_multiplier_fraction | 0.0095 | float [0,1] | Lower bound fraction for volatility adjusted reversal threshold. (Deprecated alias: `reversal_confirmation.min_natr_ratio_percent`) | +| reversal_confirmation.max_natr_multiplier_fraction | 0.075 | float [0,1] | Upper bound fraction (>= lower bound) for volatility adjusted reversal threshold. 
(Deprecated alias: `reversal_confirmation.max_natr_ratio_percent`) | +| _Regressor model_ | | | | +| freqai.regressor | `xgboost` | enum {`xgboost`,`lightgbm`,`histgradientboostingregressor`} | Machine learning regressor algorithm. | +| _Extrema smoothing_ | | | | +| freqai.extrema_smoothing.method | `gaussian` | enum {`gaussian`,`kaiser`,`triang`,`smm`,`sma`,`savgol`,`gaussian_filter1d`} | Extrema smoothing method (`smm`=median, `sma`=mean, `savgol`=Savitzky–Golay). | +| freqai.extrema_smoothing.window_candles | 5 | int >= 3 | Smoothing window length (candles). (Deprecated alias: `freqai.extrema_smoothing.window`) | +| freqai.extrema_smoothing.beta | 8.0 | float > 0 | Shape parameter for `kaiser` kernel. | +| freqai.extrema_smoothing.polyorder | 3 | int >= 1 | Polynomial order for `savgol` smoothing. | +| freqai.extrema_smoothing.mode | `mirror` | enum {`mirror`,`constant`,`nearest`,`wrap`,`interp`} | Boundary mode for `savgol` and `gaussian_filter1d`. | +| freqai.extrema_smoothing.sigma | 1.0 | float > 0 | Gaussian `sigma` for `gaussian_filter1d` smoothing. | +| _Extrema weighting_ | | | | +| freqai.extrema_weighting.strategy | `none` | enum {`none`,`amplitude`,`amplitude_threshold_ratio`,`volume_rate`,`speed`,`efficiency_ratio`,`volume_weighted_efficiency_ratio`,`combined`} | Extrema weighting source: unweighted (`none`), swing amplitude (`amplitude`), swing amplitude / median volatility-threshold ratio (`amplitude_threshold_ratio`), swing volume per candle (`volume_rate`), swing speed (`speed`), swing efficiency ratio (`efficiency_ratio`), swing volume-weighted efficiency ratio (`volume_weighted_efficiency_ratio`), or combined metrics aggregation (`combined`). | +| freqai.extrema_weighting.metric_coefficients | {} | dict[str, float] | Per-metric coefficients for `combined` strategy. Keys: `amplitude`, `amplitude_threshold_ratio`, `volume_rate`, `speed`, `efficiency_ratio`, `volume_weighted_efficiency_ratio`. 
| +| freqai.extrema_weighting.aggregation | `weighted_average` | enum {`weighted_average`,`geometric_mean`} | Metric aggregation method for `combined` strategy. `weighted_average`=Σ(coef·metric)/Σ(coef), `geometric_mean`=∏(metric^coef)^(1/Σcoef). | +| freqai.extrema_weighting.standardization | `none` | enum {`none`,`zscore`,`robust`,`mmad`,`power_yj`} | Standardization method applied to smoothed weighted extrema before normalization. `none`=w, `zscore`=(w-μ)/σ, `robust`=(w-median)/IQR, `mmad`=(w-median)/(MAD·k), `power_yj`=YJ(w). | +| freqai.extrema_weighting.robust_quantiles | [0.25, 0.75] | list[float] where 0 <= Q1 < Q3 <= 1 | Quantile range for robust standardization, Q1 and Q3. | +| freqai.extrema_weighting.mmad_scaling_factor | 1.4826 | float > 0 | Scaling factor for MMAD standardization. | +| freqai.extrema_weighting.normalization | `maxabs` | enum {`maxabs`,`minmax`,`sigmoid`,`none`} | Normalization method applied to smoothed weighted extrema. `maxabs`=w/max(\|w\|), `minmax`=low+(w-min)/(max-min)·(high-low), `sigmoid`=2·σ(scale·w)-1, `none`=w. | +| freqai.extrema_weighting.minmax_range | [-1.0, 1.0] | list[float] | Target range for `minmax` normalization, min and max. | +| freqai.extrema_weighting.sigmoid_scale | 1.0 | float > 0 | Scale parameter for `sigmoid` normalization, controls steepness. | +| freqai.extrema_weighting.gamma | 1.0 | float (0,10] | Contrast exponent applied to smoothed weighted extrema after normalization: >1 emphasizes extrema, values between 0 and 1 soften. | +| _Feature parameters_ | | | | +| freqai.feature_parameters.label_period_candles | min/max midpoint | int >= 1 | Zigzag labeling NATR horizon. | +| freqai.feature_parameters.min_label_period_candles | 12 | int >= 1 | Minimum labeling NATR horizon used for reversals labeling HPO. | +| freqai.feature_parameters.max_label_period_candles | 24 | int >= 1 | Maximum labeling NATR horizon used for reversals labeling HPO. 
| +| freqai.feature_parameters.label_natr_multiplier | min/max midpoint | float > 0 | Zigzag labeling NATR multiplier. (Deprecated alias: `freqai.feature_parameters.label_natr_ratio`) | +| freqai.feature_parameters.min_label_natr_multiplier | 9.0 | float > 0 | Minimum labeling NATR multiplier used for reversals labeling HPO. (Deprecated alias: `freqai.feature_parameters.min_label_natr_ratio`) | +| freqai.feature_parameters.max_label_natr_multiplier | 12.0 | float > 0 | Maximum labeling NATR multiplier used for reversals labeling HPO. (Deprecated alias: `freqai.feature_parameters.max_label_natr_ratio`) | +| freqai.feature_parameters.label_frequency_candles | `auto` | int >= 2 \| `auto` | Reversals labeling frequency. `auto` = max(2, 2 \* number of whitelisted pairs). | +| freqai.feature_parameters.label_weights | [1/7,1/7,1/7,1/7,1/7,1/7,1/7] | list[float] | Per-objective weights used in distance calculations to ideal point. Objectives: (1) number of detected reversals, (2) median swing amplitude, (3) median (swing amplitude / median volatility-threshold ratio), (4) median swing volume per candle, (5) median swing speed, (6) median swing efficiency ratio, (7) median swing volume-weighted efficiency ratio. | +| freqai.feature_parameters.label_p_order | `None` | float \| None | p-order parameter for distance metrics. Used by minkowski (default 2.0) and power_mean (default 1.0). Ignored by other metrics. | +| freqai.feature_parameters.label_method | `compromise_programming` | enum {`compromise_programming`,`topsis`,`kmeans`,`kmeans2`,`kmedoids`,`knn`,`medoid`} | HPO `label` Pareto front trial selection method. | +| freqai.feature_parameters.label_distance_metric | `euclidean` | string | Distance metric for `compromise_programming` and `topsis` methods. | +| freqai.feature_parameters.label_cluster_metric | `euclidean` | string | Distance metric for `kmeans`, `kmeans2`, and `kmedoids` methods. 
| +| freqai.feature_parameters.label_cluster_selection_method | `topsis` | enum {`compromise_programming`,`topsis`} | Cluster selection method for clustering-based label methods. | +| freqai.feature_parameters.label_cluster_trial_selection_method | `topsis` | enum {`compromise_programming`,`topsis`} | Best cluster trial selection method for clustering-based label methods. | +| freqai.feature_parameters.label_density_metric | method-dependent | string | Distance metric for `knn` and `medoid` methods. | +| freqai.feature_parameters.label_density_aggregation | `power_mean` | enum {`power_mean`,`quantile`,`min`,`max`} | Aggregation method for KNN neighbor distances. | +| freqai.feature_parameters.label_density_n_neighbors | 5 | int >= 1 | Number of neighbors for KNN. | +| freqai.feature_parameters.label_density_aggregation_param | aggregation-dependent | float \| None | Tunable for KNN neighbor distance aggregation: p-order (`power_mean`) or quantile value (`quantile`). | +| _Predictions extrema_ | | | | +| freqai.predictions_extrema.selection_method | `rank_extrema` | enum {`rank_extrema`,`rank_peaks`,`partition`} | Extrema selection method. `rank_extrema` ranks extrema values, `rank_peaks` ranks detected peak values, `partition` uses sign-based partitioning. | +| freqai.predictions_extrema.threshold_smoothing_method | `mean` | enum {`mean`,`isodata`,`li`,`minimum`,`otsu`,`triangle`,`yen`,`median`,`soft_extremum`} | Thresholding method for prediction thresholds smoothing. (Deprecated alias: `freqai.predictions_extrema.thresholds_smoothing`) | +| freqai.predictions_extrema.soft_extremum_alpha | 12.0 | float >= 0 | Alpha for `soft_extremum` thresholds smoothing. (Deprecated alias: `freqai.predictions_extrema.thresholds_alpha`) | +| freqai.predictions_extrema.outlier_threshold_quantile | 0.999 | float (0,1) | Quantile threshold for predictions outlier filtering. 
(Deprecated alias: `freqai.predictions_extrema.threshold_outlier`) | +| freqai.predictions_extrema.keep_extrema_fraction | 1.0 | float (0,1] | Fraction of extrema used for thresholds. `1.0` uses all, lower values keep only most significant. Applies to `rank_extrema` and `rank_peaks`; ignored for `partition`. (Deprecated alias: `freqai.predictions_extrema.extrema_fraction`) | +| _Optuna / HPO_ | | | | +| freqai.optuna_hyperopt.enabled | false | bool | Enables HPO. | +| freqai.optuna_hyperopt.sampler | `tpe` | enum {`tpe`,`auto`} | HPO sampler algorithm for `hp` namespace. `tpe` uses [TPESampler](https://optuna.readthedocs.io/en/stable/reference/samplers/generated/optuna.samplers.TPESampler.html) with multivariate and group, `auto` uses [AutoSampler](https://hub.optuna.org/samplers/auto_sampler). | +| freqai.optuna_hyperopt.label_sampler | `auto` | enum {`auto`,`tpe`,`nsgaii`,`nsgaiii`} | HPO sampler algorithm for multi-objective `label` namespace. `nsgaii` uses [NSGAIISampler](https://optuna.readthedocs.io/en/stable/reference/samplers/generated/optuna.samplers.NSGAIISampler.html), `nsgaiii` uses [NSGAIIISampler](https://optuna.readthedocs.io/en/stable/reference/samplers/generated/optuna.samplers.NSGAIIISampler.html). | +| freqai.optuna_hyperopt.storage | `file` | enum {`file`,`sqlite`} | HPO storage backend. | +| freqai.optuna_hyperopt.continuous | true | bool | Continuous HPO. | +| freqai.optuna_hyperopt.warm_start | true | bool | Warm start HPO with previous best value(s). | +| freqai.optuna_hyperopt.n_startup_trials | 15 | int >= 0 | HPO startup trials. | +| freqai.optuna_hyperopt.n_trials | 50 | int >= 1 | Maximum HPO trials. | +| freqai.optuna_hyperopt.n_jobs | CPU threads / 4 | int >= 1 | Parallel HPO workers. | +| freqai.optuna_hyperopt.timeout | 7200 | int >= 0 | HPO wall-clock timeout in seconds. | +| freqai.optuna_hyperopt.label_candles_step | 1 | int >= 1 | Step for Zigzag NATR horizon `label` search space. 
| +| freqai.optuna_hyperopt.space_reduction | false | bool | Enable/disable `hp` search space reduction based on previous best parameters. | +| freqai.optuna_hyperopt.space_fraction | 0.4 | float [0,1] | Fraction of the `hp` search space to use with `space_reduction`. Lower values create narrower search ranges around the best parameters. (Deprecated alias: `freqai.optuna_hyperopt.expansion_ratio`) | +| freqai.optuna_hyperopt.min_resource | 3 | int >= 1 | Minimum resource per [HyperbandPruner](https://optuna.readthedocs.io/en/stable/reference/generated/optuna.pruners.HyperbandPruner.html) rung. | +| freqai.optuna_hyperopt.seed | 1 | int >= 0 | HPO RNG seed. | ## ReforceXY diff --git a/quickadapter/user_data/freqaimodels/QuickAdapterRegressorV3.py b/quickadapter/user_data/freqaimodels/QuickAdapterRegressorV3.py index de41cbd..34cf60c 100644 --- a/quickadapter/user_data/freqaimodels/QuickAdapterRegressorV3.py +++ b/quickadapter/user_data/freqaimodels/QuickAdapterRegressorV3.py @@ -81,7 +81,7 @@ class QuickAdapterRegressorV3(BaseRegressionModel): https://github.com/sponsors/robcaulk """ - version = "3.10.1" + version = "3.10.2" _TEST_SIZE: Final[float] = 0.1 diff --git a/quickadapter/user_data/strategies/ExtremaWeightingTransformer.py b/quickadapter/user_data/strategies/ExtremaWeightingTransformer.py index ec02507..c4ba318 100644 --- a/quickadapter/user_data/strategies/ExtremaWeightingTransformer.py +++ b/quickadapter/user_data/strategies/ExtremaWeightingTransformer.py @@ -24,7 +24,32 @@ WeightStrategy = Literal[ "speed", "efficiency_ratio", "volume_weighted_efficiency_ratio", + "combined", ] + +CombinedMetric = Literal[ + "amplitude", + "amplitude_threshold_ratio", + "volume_rate", + "speed", + "efficiency_ratio", + "volume_weighted_efficiency_ratio", +] +COMBINED_METRICS: Final[tuple[CombinedMetric, ...]] = ( + "amplitude", + "amplitude_threshold_ratio", + "volume_rate", + "speed", + "efficiency_ratio", + "volume_weighted_efficiency_ratio", +) + +CombinedAggregation 
= Literal["weighted_average", "geometric_mean"] +COMBINED_AGGREGATIONS: Final[tuple[CombinedAggregation, ...]] = ( + "weighted_average", + "geometric_mean", +) + WEIGHT_STRATEGIES: Final[tuple[WeightStrategy, ...]] = ( "none", "amplitude", @@ -33,6 +58,7 @@ WEIGHT_STRATEGIES: Final[tuple[WeightStrategy, ...]] = ( "speed", "efficiency_ratio", "volume_weighted_efficiency_ratio", + "combined", ) StandardizationType = Literal["none", "zscore", "robust", "mmad", "power_yj"] @@ -54,6 +80,8 @@ NORMALIZATION_TYPES: Final[tuple[NormalizationType, ...]] = ( DEFAULTS_EXTREMA_WEIGHTING: Final[dict[str, Any]] = { "strategy": WEIGHT_STRATEGIES[0], # "none" + "metric_coefficients": {}, + "aggregation": COMBINED_AGGREGATIONS[0], # "weighted_average" # Phase 1: Standardization "standardization": STANDARDIZATION_TYPES[0], # "none" "robust_quantiles": (0.25, 0.75), diff --git a/quickadapter/user_data/strategies/QuickAdapterV3.py b/quickadapter/user_data/strategies/QuickAdapterV3.py index 3713dc5..83b0dd6 100644 --- a/quickadapter/user_data/strategies/QuickAdapterV3.py +++ b/quickadapter/user_data/strategies/QuickAdapterV3.py @@ -106,7 +106,7 @@ class QuickAdapterV3(IStrategy): _PLOT_EXTREMA_MIN_EPS: Final[float] = 0.01 def version(self) -> str: - return "3.10.1" + return "3.10.2" timeframe = "5m" timeframe_minutes = timeframe_to_minutes(timeframe) @@ -545,6 +545,10 @@ class QuickAdapterV3(IStrategy): logger.info("Extrema Weighting:") logger.info(f" strategy: {self.extrema_weighting['strategy']}") + logger.info( + f" metric_coefficients: {self.extrema_weighting['metric_coefficients']}" + ) + logger.info(f" aggregation: {self.extrema_weighting['aggregation']}") logger.info(f" standardization: {self.extrema_weighting['standardization']}") logger.info( f" robust_quantiles: ({format_number(self.extrema_weighting['robust_quantiles'][0])}, {format_number(self.extrema_weighting['robust_quantiles'][1])})" @@ -909,7 +913,7 @@ class QuickAdapterV3(IStrategy): speeds=pivots_speeds, 
efficiency_ratios=pivots_efficiency_ratios, volume_weighted_efficiency_ratios=pivots_volume_weighted_efficiency_ratios, - strategy=self.extrema_weighting["strategy"], + extrema_weighting=self.extrema_weighting, ) plot_eps = weighted_extrema.abs().where(weighted_extrema.ne(0.0)).min() diff --git a/quickadapter/user_data/strategies/Utils.py b/quickadapter/user_data/strategies/Utils.py index 845161c..c342359 100644 --- a/quickadapter/user_data/strategies/Utils.py +++ b/quickadapter/user_data/strategies/Utils.py @@ -22,11 +22,14 @@ import pandas as pd import scipy as sp import talib.abstract as ta from ExtremaWeightingTransformer import ( + COMBINED_AGGREGATIONS, + COMBINED_METRICS, DEFAULTS_EXTREMA_WEIGHTING, NORMALIZATION_TYPES, STANDARDIZATION_TYPES, WEIGHT_STRATEGIES, - WeightStrategy, + CombinedAggregation, + CombinedMetric, ) from numpy.typing import NDArray from scipy.ndimage import gaussian_filter1d @@ -51,6 +54,7 @@ SMOOTHING_KERNELS: Final[tuple[SmoothingKernel, ...]] = ( "kaiser", "triang", ) + SmoothingMethod = Union[ SmoothingKernel, Literal["smm", "sma", "savgol", "gaussian_filter1d"] ] @@ -107,6 +111,30 @@ def get_extrema_weighting_config( f"Invalid extrema_weighting strategy {strategy!r}, supported: {', '.join(WEIGHT_STRATEGIES)}, using default {WEIGHT_STRATEGIES[0]!r}" ) strategy = WEIGHT_STRATEGIES[0] + metric_coefficients = extrema_weighting.get( + "metric_coefficients", DEFAULTS_EXTREMA_WEIGHTING["metric_coefficients"] + ) + if not isinstance(metric_coefficients, dict): + logger.warning( + f"Invalid extrema_weighting metric_coefficients {metric_coefficients!r}: must be a mapping, using default {DEFAULTS_EXTREMA_WEIGHTING['metric_coefficients']!r}" + ) + metric_coefficients = DEFAULTS_EXTREMA_WEIGHTING["metric_coefficients"] + elif invalid_keys := set(metric_coefficients.keys()) - set(COMBINED_METRICS): + logger.warning( + f"Invalid extrema_weighting metric_coefficients keys {sorted(invalid_keys)!r}, valid keys: {', '.join(COMBINED_METRICS)}" + ) + 
metric_coefficients = { + k: v for k, v in metric_coefficients.items() if k in set(COMBINED_METRICS) + } + + aggregation: CombinedAggregation = extrema_weighting.get( + "aggregation", DEFAULTS_EXTREMA_WEIGHTING["aggregation"] + ) + if aggregation not in set(COMBINED_AGGREGATIONS): + logger.warning( + f"Invalid extrema_weighting aggregation {aggregation!r}, supported: {', '.join(COMBINED_AGGREGATIONS)}, using default {DEFAULTS_EXTREMA_WEIGHTING['aggregation']!r}" + ) + aggregation = DEFAULTS_EXTREMA_WEIGHTING["aggregation"] # Phase 1: Standardization standardization = extrema_weighting.get( @@ -220,6 +248,8 @@ def get_extrema_weighting_config( return { "strategy": strategy, + "metric_coefficients": metric_coefficients, + "aggregation": aggregation, # Phase 1: Standardization "standardization": standardization, "robust_quantiles": robust_quantiles, @@ -472,6 +502,92 @@ def _build_weights_array( return weights_array +def _parse_metric_coefficients( + metric_coefficients: dict[str, Any], +) -> dict[CombinedMetric, float]: + out: dict[CombinedMetric, float] = {} + for metric in COMBINED_METRICS: + value = metric_coefficients.get(metric) + if not isinstance(value, (int, float)): + continue + if not np.isfinite(value) or value <= 0: + continue + out[metric] = float(value) + + return out + + +def _aggregate_metrics( + stacked_metrics: NDArray[np.floating], + coefficients: NDArray[np.floating], + aggregation: CombinedAggregation, +) -> NDArray[np.floating]: + if aggregation == COMBINED_AGGREGATIONS[0]: # "weighted_average" + return np.average(stacked_metrics, axis=0, weights=coefficients) + elif aggregation == COMBINED_AGGREGATIONS[1]: # "geometric_mean" + return np.asarray( + sp.stats.gmean(stacked_metrics.T, weights=coefficients, axis=1), + dtype=float, + ) + else: + raise ValueError( + f"Invalid aggregation {aggregation!r}. 
Supported: {', '.join(COMBINED_AGGREGATIONS)}" + ) + + +def _compute_combined_weights( + indices: list[int], + amplitudes: list[float], + amplitude_threshold_ratios: list[float], + volume_rates: list[float], + speeds: list[float], + efficiency_ratios: list[float], + volume_weighted_efficiency_ratios: list[float], + metric_coefficients: dict[str, Any], + aggregation: CombinedAggregation, +) -> NDArray[np.floating]: + if len(indices) == 0: + return np.asarray([], dtype=float) + + coefficients = _parse_metric_coefficients(metric_coefficients) + if len(coefficients) == 0: + coefficients = dict.fromkeys(COMBINED_METRICS, DEFAULT_EXTREMA_WEIGHT) + + metrics: dict[CombinedMetric, NDArray[np.floating]] = { + "amplitude": np.asarray(amplitudes, dtype=float), + "amplitude_threshold_ratio": np.asarray( + amplitude_threshold_ratios, dtype=float + ), + "volume_rate": np.asarray(volume_rates, dtype=float), + "speed": np.asarray(speeds, dtype=float), + "efficiency_ratio": np.asarray(efficiency_ratios, dtype=float), + "volume_weighted_efficiency_ratio": np.asarray( + volume_weighted_efficiency_ratios, dtype=float + ), + } + + imputed_metrics: list[NDArray[np.floating]] = [] + coefficients_list: list[float] = [] + + for metric_name in COMBINED_METRICS: + if metric_name not in coefficients: + continue + coefficient = coefficients[metric_name] + metric_values = metrics[metric_name] + if metric_values.size == 0: + continue + imputed_metrics.append(_impute_weights(weights=metric_values)) + coefficients_list.append(float(coefficient)) + + if len(imputed_metrics) == 0: + return np.asarray([], dtype=float) + + stacked_metrics = np.vstack(imputed_metrics) + coefficients_array = np.asarray(coefficients_list, dtype=float) + + return _aggregate_metrics(stacked_metrics, coefficients_array, aggregation) + + def compute_extrema_weights( n_extrema: int, indices: list[int], @@ -481,60 +597,56 @@ def compute_extrema_weights( speeds: list[float], efficiency_ratios: list[float], 
volume_weighted_efficiency_ratios: list[float], - strategy: WeightStrategy = DEFAULTS_EXTREMA_WEIGHTING["strategy"], + extrema_weighting: dict[str, Any], ) -> NDArray[np.floating]: + extrema_weighting = {**DEFAULTS_EXTREMA_WEIGHTING, **extrema_weighting} + strategy = extrema_weighting["strategy"] + if len(indices) == 0 or strategy == WEIGHT_STRATEGIES[0]: # "none" return np.full(n_extrema, DEFAULT_EXTREMA_WEIGHT, dtype=float) weights: Optional[NDArray[np.floating]] = None - if ( - strategy - in { - WEIGHT_STRATEGIES[1], - WEIGHT_STRATEGIES[2], - WEIGHT_STRATEGIES[3], - WEIGHT_STRATEGIES[4], - WEIGHT_STRATEGIES[5], - WEIGHT_STRATEGIES[6], - } - ): # "amplitude" / "amplitude_threshold_ratio" / "volume_rate" / "speed" / "efficiency_ratio" / "volume_weighted_efficiency_ratio" - if strategy == WEIGHT_STRATEGIES[1]: # "amplitude" - weights = np.asarray(amplitudes, dtype=float) - elif strategy == WEIGHT_STRATEGIES[2]: # "amplitude_threshold_ratio" - weights = np.asarray(amplitude_threshold_ratios, dtype=float) - elif strategy == WEIGHT_STRATEGIES[3]: # "volume_rate" - weights = np.asarray(volume_rates, dtype=float) - elif strategy == WEIGHT_STRATEGIES[4]: # "speed" - weights = np.asarray(speeds, dtype=float) - elif strategy == WEIGHT_STRATEGIES[5]: # "efficiency_ratio" - weights = np.asarray(efficiency_ratios, dtype=float) - elif strategy == WEIGHT_STRATEGIES[6]: # "volume_weighted_efficiency_ratio" - weights = np.asarray(volume_weighted_efficiency_ratios, dtype=float) - else: - weights = np.asarray([], dtype=float) - - if weights.size == 0: - return np.full(n_extrema, DEFAULT_EXTREMA_WEIGHT, dtype=float) - - weights = _impute_weights( - weights=weights, + if strategy == WEIGHT_STRATEGIES[1]: # "amplitude" + weights = np.asarray(amplitudes, dtype=float) + elif strategy == WEIGHT_STRATEGIES[2]: # "amplitude_threshold_ratio" + weights = np.asarray(amplitude_threshold_ratios, dtype=float) + elif strategy == WEIGHT_STRATEGIES[3]: # "volume_rate" + weights = 
np.asarray(volume_rates, dtype=float) + elif strategy == WEIGHT_STRATEGIES[4]: # "speed" + weights = np.asarray(speeds, dtype=float) + elif strategy == WEIGHT_STRATEGIES[5]: # "efficiency_ratio" + weights = np.asarray(efficiency_ratios, dtype=float) + elif strategy == WEIGHT_STRATEGIES[6]: # "volume_weighted_efficiency_ratio" + weights = np.asarray(volume_weighted_efficiency_ratios, dtype=float) + elif strategy == WEIGHT_STRATEGIES[7]: # "combined" + weights = _compute_combined_weights( + indices=indices, + amplitudes=amplitudes, + amplitude_threshold_ratios=amplitude_threshold_ratios, + volume_rates=volume_rates, + speeds=speeds, + efficiency_ratios=efficiency_ratios, + volume_weighted_efficiency_ratios=volume_weighted_efficiency_ratios, + metric_coefficients=extrema_weighting["metric_coefficients"], + aggregation=extrema_weighting["aggregation"], ) - if weights is not None: - if weights.size == 0: - return np.full(n_extrema, DEFAULT_EXTREMA_WEIGHT, dtype=float) - - return _build_weights_array( - n_extrema=n_extrema, - indices=indices, - weights=weights, - default_weight=np.nanmedian(weights), + else: + raise ValueError( + f"Invalid extrema weighting strategy {strategy!r}. " + f"Supported: {', '.join(WEIGHT_STRATEGIES)}" ) - raise ValueError( - f"Invalid extrema weighting strategy {strategy!r}. 
" - f"Supported: {', '.join(WEIGHT_STRATEGIES)}" + weights = _impute_weights( + weights=weights, + ) + + return _build_weights_array( + n_extrema=n_extrema, + indices=indices, + weights=weights, + default_weight=float(np.nanmedian(weights)), ) @@ -565,7 +677,7 @@ def get_weighted_extrema( speeds: list[float], efficiency_ratios: list[float], volume_weighted_efficiency_ratios: list[float], - strategy: WeightStrategy = DEFAULTS_EXTREMA_WEIGHTING["strategy"], + extrema_weighting: dict[str, Any], ) -> tuple[pd.Series, pd.Series]: extrema_values = extrema.to_numpy(dtype=float) extrema_index = extrema.index @@ -580,7 +692,7 @@ def get_weighted_extrema( speeds=speeds, efficiency_ratios=efficiency_ratios, volume_weighted_efficiency_ratios=volume_weighted_efficiency_ratios, - strategy=strategy, + extrema_weighting=extrema_weighting, ) return pd.Series( @@ -1101,21 +1213,20 @@ def zigzag( return np.nan, np.nan amplitude = abs(current_value - previous_value) / abs(previous_value) + if not (np.isfinite(amplitude) and amplitude >= 0): + return np.nan, np.nan start_pos = min(previous_pos, current_pos) end_pos = max(previous_pos, current_pos) + 1 median_threshold = np.nanmedian(thresholds[start_pos:end_pos]) - if ( - np.isfinite(median_threshold) - and median_threshold > 0 - and np.isfinite(amplitude) - ): - amplitude_threshold_ratio = amplitude / median_threshold - else: - amplitude_threshold_ratio = np.nan + amplitude_threshold_ratio = ( + amplitude / (amplitude + median_threshold) + if np.isfinite(median_threshold) and median_threshold > 0 + else np.nan + ) - return amplitude, amplitude_threshold_ratio + return amplitude / (1.0 + amplitude), amplitude_threshold_ratio def calculate_pivot_duration( *, @@ -1143,37 +1254,50 @@ def zigzag( previous_pos=previous_pos, current_pos=current_pos, ) - if not np.isfinite(duration) or duration == 0: return np.nan start_pos = min(previous_pos, current_pos) end_pos = max(previous_pos, current_pos) + 1 - total_volume = 
np.nansum(volumes[start_pos:end_pos]) - return total_volume / duration + avg_volume_per_candle = np.nansum(volumes[start_pos:end_pos]) / duration + median_volume = np.nanmedian(volumes[start_pos:end_pos]) + if ( + np.isfinite(avg_volume_per_candle) + and avg_volume_per_candle >= 0 + and np.isfinite(median_volume) + and median_volume > 0 + ): + return avg_volume_per_candle / (avg_volume_per_candle + median_volume) + return np.nan def calculate_pivot_speed( *, previous_pos: int, + previous_value: float, current_pos: int, - amplitude: float, + current_value: float, ) -> float: if previous_pos < 0 or current_pos < 0: return np.nan if previous_pos >= n or current_pos >= n: return np.nan - if not np.isfinite(amplitude): + + if np.isclose(previous_value, 0.0): return np.nan duration = calculate_pivot_duration( previous_pos=previous_pos, current_pos=current_pos, ) - if not np.isfinite(duration) or duration == 0: return np.nan - return amplitude / duration + amplitude = abs(current_value - previous_value) / abs(previous_value) + if not (np.isfinite(amplitude) and amplitude >= 0): + return np.nan + + speed = amplitude / duration + return speed / (1.0 + speed) if np.isfinite(speed) and speed >= 0 else np.nan def calculate_pivot_efficiency_ratio( *, @@ -1259,8 +1383,9 @@ def zigzag( ) speed = calculate_pivot_speed( previous_pos=last_pivot_pos, + previous_value=pivots_values[-1], current_pos=pos, - amplitude=amplitude, + current_value=value, ) efficiency_ratio = calculate_pivot_efficiency_ratio( previous_pos=last_pivot_pos,
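For reference, the normalization and aggregation math this commit introduces can be sketched standalone. The snippet below is illustrative only, not part of the patch: the helper names (`normalize_unbounded`, `normalize_vs_median`, `aggregate`) are made up for this sketch, and the weighted geometric mean is written out directly rather than via `scipy.stats.gmean` as `_aggregate_metrics` does. It shows how raw metrics are mapped into [0,1) (x/(1+x) for amplitude and speed, x/(x+median) for amplitude_threshold_ratio and volume_rate) before being combined per extremum under per-metric coefficients:

```python
import numpy as np


def normalize_unbounded(x: np.ndarray) -> np.ndarray:
    # x / (1 + x): maps [0, inf) into [0, 1), as the patch does for
    # amplitude and speed.
    return x / (1.0 + x)


def normalize_vs_median(x: np.ndarray, median: float) -> np.ndarray:
    # x / (x + median): maps into [0, 1) relative to a reference median,
    # as the patch does for amplitude_threshold_ratio and volume_rate.
    return x / (x + median)


def aggregate(stacked: np.ndarray, coeffs: np.ndarray, method: str) -> np.ndarray:
    # stacked: (n_metrics, n_extrema); coeffs: (n_metrics,) positive weights.
    if method == "weighted_average":
        return np.average(stacked, axis=0, weights=coeffs)
    if method == "geometric_mean":
        # Weighted geometric mean: exp(sum_i(w_i * log(m_i)) / sum_i(w_i)).
        # Requires strictly positive inputs, which the normalizations above
        # guarantee for positive raw metrics.
        return np.exp(np.sum(coeffs[:, None] * np.log(stacked), axis=0) / np.sum(coeffs))
    raise ValueError(f"unsupported aggregation {method!r}")


# Toy example: two metrics over three extrema, amplitude weighted twice as much.
amplitude = normalize_unbounded(np.array([0.5, 1.0, 3.0]))
volume_rate = normalize_vs_median(np.array([2.0, 4.0, 8.0]), median=4.0)
stacked = np.vstack([amplitude, volume_rate])
coeffs = np.array([2.0, 1.0])

wa = aggregate(stacked, coeffs, "weighted_average")
gm = aggregate(stacked, coeffs, "geometric_mean")
```

Both normalizations are monotone, so metric rankings are preserved while all inputs land on a common [0,1) scale; by the weighted AM-GM inequality the geometric mean is never larger than the weighted average, making it the more conservative aggregation when any single metric is near zero.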