### Configuration tunables
-| Path | Default | Type / Range | Description |
-| ---------------------------------------------------- | ------------------------- | -------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| _Protections_ | | | |
-| custom_protections.trade_duration_candles | 72 | int >= 1 | Estimated trade duration in candles. Scales protections stop duration candles and trade limit. |
-| custom_protections.lookback_period_fraction | 0.5 | float (0,1] | Fraction of `fit_live_predictions_candles` used to calculate `lookback_period_candles` for _MaxDrawdown_ and _StoplossGuard_ protections. |
-| custom_protections.cooldown.enabled | true | bool | Enable/disable _CooldownPeriod_ protection. |
-| custom_protections.cooldown.stop_duration_candles | 4 | int >= 1 | Number of candles to wait before allowing new trades after a trade is closed. |
-| custom_protections.drawdown.enabled | true | bool | Enable/disable _MaxDrawdown_ protection. |
-| custom_protections.drawdown.max_allowed_drawdown | 0.2 | float (0,1) | Maximum allowed drawdown. |
-| custom_protections.stoploss.enabled | true | bool | Enable/disable _StoplossGuard_ protection. |
-| _Leverage_ | | | |
-| leverage | `proposed_leverage` | float [1.0, max_leverage] | Leverage. Fallback to `proposed_leverage` for the pair. |
-| _Exit pricing_ | | | |
-| exit_pricing.trade_price_target | `moving_average` | enum {`moving_average`,`interpolation`,`weighted_interpolation`} | Trade NATR computation method. |
-| exit_pricing.thresholds_calibration.decline_quantile | 0.90 | float (0,1) | PnL decline quantile threshold. |
-| _Reversal confirmation_ | | | |
-| reversal_confirmation.lookback_period | 0 | int >= 0 | Prior confirming candles; 0 = none. |
-| reversal_confirmation.decay_ratio | 0.5 | float (0,1] | Geometric per-candle volatility adjusted reversal threshold relaxation factor. |
-| reversal_confirmation.min_natr_ratio_percent | 0.0095 | float [0,1] | Lower bound fraction for volatility adjusted reversal threshold. |
-| reversal_confirmation.max_natr_ratio_percent | 0.075 | float [0,1] | Upper bound fraction (>= lower bound) for volatility adjusted reversal threshold. |
-| _Regressor model_ | | | |
-| freqai.regressor | `xgboost` | enum {`xgboost`,`lightgbm`} | Machine learning regressor algorithm. |
-| _Extrema smoothing_ | | | |
-| freqai.extrema_smoothing.method | `gaussian` | enum {`gaussian`,`kaiser`,`triang`,`smm`,`sma`,`savgol`,`nadaraya_watson`} | Extrema smoothing method (`smm`=median, `sma`=mean, `savgol`=Savitzky–Golay, `nadaraya_watson`=Gaussian kernel regression). |
-| freqai.extrema_smoothing.window | 5 | int >= 3 | Smoothing window length (candles). |
-| freqai.extrema_smoothing.beta | 8.0 | float > 0 | Shape parameter for `kaiser` kernel. |
-| freqai.extrema_smoothing.polyorder | 3 | int >= 1 | Polynomial order for `savgol` smoothing. |
-| freqai.extrema_smoothing.mode | `mirror` | enum {`mirror`,`constant`,`nearest`,`wrap`,`interp`} | Boundary mode for `savgol` and `nadaraya_watson`. |
-| freqai.extrema_smoothing.bandwidth | 1.0 | float > 0 | Gaussian bandwidth for `nadaraya_watson` smoothing. |
-| _Extrema weighting_ | | | |
-| freqai.extrema_weighting.strategy | `none` | enum {`none`,`amplitude`,`amplitude_threshold_ratio`,`volume`,`speed`,`efficiency_ratio`,`hybrid`} | Extrema weighting source: unweighted (`none`), swing amplitude (`amplitude`), swing amplitude / median volatility-threshold ratio (`amplitude_threshold_ratio`), swing volume (`volume`), swing speed (`speed`), swing efficiency ratio (`efficiency_ratio`), or `hybrid`. |
-| freqai.extrema_weighting.source_weights | `{}` | dict[str, float] | Weights on extrema extrema weighting sources for `hybrid`. |
-| freqai.extrema_weighting.aggregation | `weighted_sum` | enum {`weighted_sum`,`geometric_mean`} | Aggregation method applied to weighted extrema weighting sources for `hybrid`. |
-| freqai.extrema_weighting.aggregation_normalization | `none` | enum {`minmax`,`sigmoid`,`softmax`,`l1`,`l2`,`rank`,`none`} | Normalization method applied to the aggregated extrema weighting source for `hybrid`. |
-| freqai.extrema_weighting.standardization | `none` | enum {`none`,`zscore`,`robust`,`mmad`} | Standardization method applied to weights before normalization. `none`=no standardization, `zscore`=(w-μ)/σ, `robust`=(w-median)/IQR, `mmad`=(w-median)/MAD. |
-| freqai.extrema_weighting.robust_quantiles | [0.25, 0.75] | list[float] where 0 <= Q1 < Q3 <= 1 | Quantile range for robust standardization, Q1 and Q3. |
-| freqai.extrema_weighting.mmad_scaling_factor | 1.4826 | float > 0 | Scaling factor for MMAD standardization. |
-| freqai.extrema_weighting.normalization | `minmax` | enum {`minmax`,`sigmoid`,`softmax`,`l1`,`l2`,`rank`,`none`} | Normalization method applied to weights. |
-| freqai.extrema_weighting.minmax_range | [0.0, 1.0] | list[float] | Target range for `minmax` normalization, min and max. |
-| freqai.extrema_weighting.sigmoid_scale | 1.0 | float > 0 | Scale parameter for `sigmoid` normalization, controls steepness. |
-| freqai.extrema_weighting.softmax_temperature | 1.0 | float > 0 | Temperature parameter for `softmax` normalization: lower values sharpen distribution, higher values flatten it. |
-| freqai.extrema_weighting.rank_method | `average` | enum {`average`,`min`,`max`,`dense`,`ordinal`} | Ranking method for `rank` normalization. |
-| freqai.extrema_weighting.gamma | 1.0 | float (0,10] | Contrast exponent applied after normalization: >1 emphasizes extrema, values between 0 and 1 soften. |
-| _Feature parameters_ | | | |
-| freqai.feature_parameters.label_period_candles | min/max midpoint | int >= 1 | Zigzag labeling NATR horizon. |
-| freqai.feature_parameters.min_label_period_candles | 12 | int >= 1 | Minimum labeling NATR horizon used for reversals labeling HPO. |
-| freqai.feature_parameters.max_label_period_candles | 24 | int >= 1 | Maximum labeling NATR horizon used for reversals labeling HPO. |
-| freqai.feature_parameters.label_natr_ratio | min/max midpoint | float > 0 | Zigzag labeling NATR ratio. |
-| freqai.feature_parameters.min_label_natr_ratio | 9.0 | float > 0 | Minimum labeling NATR ratio used for reversals labeling HPO. |
-| freqai.feature_parameters.max_label_natr_ratio | 12.0 | float > 0 | Maximum labeling NATR ratio used for reversals labeling HPO. |
-| freqai.feature_parameters.label_frequency_candles | `auto` | int >= 2 \| `auto` | Reversals labeling frequency. `auto` = max(2, 2 \* number of whitelisted pairs). |
-| freqai.feature_parameters.label_metric | `euclidean` | string (supported: `euclidean`,`minkowski`,`cityblock`,`chebyshev`,`mahalanobis`,`seuclidean`,`jensenshannon`,`sqeuclidean`,...) | Metric used in distance calculations to ideal point. |
-| freqai.feature_parameters.label_weights | [1/6,1/6,1/6,1/6,1/6,1/6] | list[float] | Per-objective weights used in distance calculations to ideal point. Objectives: (1) number of detected reversals, (2) median swing amplitude, (3) median swing amplitude / median volatility-threshold ratio, (4) median swing volume, (5) median swing speed, (6) median swing efficiency ratio. |
-| freqai.feature_parameters.label_p_order | `None` | float \| None | p-order used by `minkowski` / `power_mean` (optional). |
-| freqai.feature_parameters.label_medoid_metric | `euclidean` | string | Metric used with `medoid`. |
-| freqai.feature_parameters.label_kmeans_metric | `euclidean` | string | Metric used for k-means clustering. |
-| freqai.feature_parameters.label_kmeans_selection | `min` | enum {`min`,`medoid`} | Strategy to select trial in the best kmeans cluster. |
-| freqai.feature_parameters.label_kmedoids_metric | `euclidean` | string | Metric used for k-medoids clustering. |
-| freqai.feature_parameters.label_kmedoids_selection | `min` | enum {`min`,`medoid`} | Strategy to select trial in the best k-medoids cluster. |
-| freqai.feature_parameters.label_knn_metric | `minkowski` | string | Distance metric for KNN. |
-| freqai.feature_parameters.label_knn_p_order | `None` | float \| None | Tunable for KNN neighbor distances aggregation methods: p-order (`knn_power_mean`, default: 1.0) or quantile (`knn_quantile`, default: 0.5). (optional) |
-| freqai.feature_parameters.label_knn_n_neighbors | 5 | int >= 1 | Number of neighbors for KNN. |
-| _Predictions extrema_ | | | |
-| freqai.predictions_extrema.selection_method | `rank` | enum {`rank`,`values`,`partition`} | Extrema selection method. `rank` uses ranked extrema values, `values` uses reversal values, `partition` uses sign-based partitioning. |
-| freqai.predictions_extrema.thresholds_smoothing | `mean` | enum {`mean`,`isodata`,`li`,`minimum`,`otsu`,`triangle`,`yen`,`median`,`soft_extremum`} | Thresholding method for prediction thresholds smoothing. |
-| freqai.predictions_extrema.thresholds_alpha | 12.0 | float > 0 | Alpha for `soft_extremum` thresholds smoothing. |
-| freqai.predictions_extrema.threshold_outlier | 0.999 | float (0,1) | Quantile threshold for predictions outlier filtering. |
-| freqai.predictions_extrema.extrema_fraction | 1.0 | float (0,1] | Fraction of extrema used for thresholds. `1.0` uses all, lower values keep only most significant. Applies to `rank` and `values`; ignored for `partition`. |
-| _Optuna / HPO_ | | | |
-| freqai.optuna_hyperopt.enabled | true | bool | Enables HPO. |
-| freqai.optuna_hyperopt.sampler | `tpe` | enum {`tpe`,`auto`} | HPO sampler algorithm. `tpe` uses [TPESampler](https://optuna.readthedocs.io/en/stable/reference/samplers/generated/optuna.samplers.TPESampler.html) with multivariate and group, `auto` uses [AutoSampler](https://hub.optuna.org/samplers/auto_sampler). |
-| freqai.optuna_hyperopt.storage | `file` | enum {`file`,`sqlite`} | HPO storage backend. |
-| freqai.optuna_hyperopt.continuous | true | bool | Continuous HPO. |
-| freqai.optuna_hyperopt.warm_start | true | bool | Warm start HPO with previous best value(s). |
-| freqai.optuna_hyperopt.n_startup_trials | 15 | int >= 0 | HPO startup trials. |
-| freqai.optuna_hyperopt.n_trials | 50 | int >= 1 | Maximum HPO trials. |
-| freqai.optuna_hyperopt.n_jobs | CPU threads / 4 | int >= 1 | Parallel HPO workers. |
-| freqai.optuna_hyperopt.timeout | 7200 | int >= 0 | HPO wall-clock timeout in seconds. |
-| freqai.optuna_hyperopt.label_candles_step | 1 | int >= 1 | Step for Zigzag NATR horizon search space. |
-| freqai.optuna_hyperopt.train_candles_step | 10 | int >= 1 | Step for training sets size search space. |
-| freqai.optuna_hyperopt.space_reduction | false | bool | Enable/disable HPO search space reduction based on previous best parameters. |
-| freqai.optuna_hyperopt.expansion_ratio | 0.4 | float [0,1] | HPO search space expansion ratio. |
-| freqai.optuna_hyperopt.min_resource | 3 | int >= 1 | Minimum resource per Hyperband pruner rung. |
-| freqai.optuna_hyperopt.seed | 1 | int >= 0 | HPO RNG seed. |
+| Path | Default | Type / Range | Description |
+| ---------------------------------------------------- | ----------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------ | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| _Protections_ | | | |
+| custom_protections.trade_duration_candles | 72 | int >= 1 | Estimated trade duration in candles. Scales protections stop duration candles and trade limit. |
+| custom_protections.lookback_period_fraction | 0.5 | float (0,1] | Fraction of `fit_live_predictions_candles` used to calculate `lookback_period_candles` for _MaxDrawdown_ and _StoplossGuard_ protections. |
+| custom_protections.cooldown.enabled | true | bool | Enable/disable _CooldownPeriod_ protection. |
+| custom_protections.cooldown.stop_duration_candles | 4 | int >= 1 | Number of candles to wait before allowing new trades after a trade is closed. |
+| custom_protections.drawdown.enabled | true | bool | Enable/disable _MaxDrawdown_ protection. |
+| custom_protections.drawdown.max_allowed_drawdown | 0.2 | float (0,1) | Maximum allowed drawdown. |
+| custom_protections.stoploss.enabled | true | bool | Enable/disable _StoplossGuard_ protection. |
+| _Leverage_ | | | |
+| leverage | `proposed_leverage` | float [1.0, max_leverage] | Leverage. Fallback to `proposed_leverage` for the pair. |
+| _Exit pricing_ | | | |
+| exit_pricing.trade_price_target | `moving_average` | enum {`moving_average`,`interpolation`,`weighted_interpolation`} | Trade NATR computation method. |
+| exit_pricing.thresholds_calibration.decline_quantile | 0.90 | float (0,1) | PnL decline quantile threshold. |
+| _Reversal confirmation_ | | | |
+| reversal_confirmation.lookback_period | 0 | int >= 0 | Prior confirming candles; 0 = none. |
+| reversal_confirmation.decay_ratio | 0.5 | float (0,1] | Geometric per-candle volatility adjusted reversal threshold relaxation factor. |
+| reversal_confirmation.min_natr_ratio_percent | 0.0095 | float [0,1] | Lower bound fraction for volatility adjusted reversal threshold. |
+| reversal_confirmation.max_natr_ratio_percent | 0.075 | float [0,1] | Upper bound fraction (>= lower bound) for volatility adjusted reversal threshold. |
+| _Regressor model_ | | | |
+| freqai.regressor | `xgboost` | enum {`xgboost`,`lightgbm`} | Machine learning regressor algorithm. |
+| _Extrema smoothing_ | | | |
+| freqai.extrema_smoothing.method | `gaussian` | enum {`gaussian`,`kaiser`,`triang`,`smm`,`sma`,`savgol`,`nadaraya_watson`} | Extrema smoothing method (`smm`=median, `sma`=mean, `savgol`=Savitzky–Golay, `nadaraya_watson`=Gaussian kernel regression). |
+| freqai.extrema_smoothing.window | 5 | int >= 3 | Smoothing window length (candles). |
+| freqai.extrema_smoothing.beta | 8.0 | float > 0 | Shape parameter for `kaiser` kernel. |
+| freqai.extrema_smoothing.polyorder | 3 | int >= 1 | Polynomial order for `savgol` smoothing. |
+| freqai.extrema_smoothing.mode | `mirror` | enum {`mirror`,`constant`,`nearest`,`wrap`,`interp`} | Boundary mode for `savgol` and `nadaraya_watson`. |
+| freqai.extrema_smoothing.bandwidth | 1.0 | float > 0 | Gaussian bandwidth for `nadaraya_watson` smoothing. |
+| _Extrema weighting_ | | | |
+| freqai.extrema_weighting.strategy | `none` | enum {`none`,`amplitude`,`amplitude_threshold_ratio`,`volume_rate`,`speed`,`efficiency_ratio`,`volume_weighted_efficiency_ratio`,`hybrid`} | Extrema weighting source: unweighted (`none`), swing amplitude (`amplitude`), swing amplitude / median volatility-threshold ratio (`amplitude_threshold_ratio`), swing volume per candle (`volume_rate`), swing speed (`speed`), swing efficiency ratio (`efficiency_ratio`), swing volume-weighted efficiency ratio (`volume_weighted_efficiency_ratio`), or `hybrid`. |
+| freqai.extrema_weighting.source_weights              | `{}`                          | dict[str, float]                                                                                                                           | Weights on extrema weighting sources for `hybrid`.                                                                                                                                                                                                                                                                                                                        |
+| freqai.extrema_weighting.aggregation | `weighted_sum` | enum {`weighted_sum`,`geometric_mean`} | Aggregation method applied to weighted extrema weighting sources for `hybrid`. |
+| freqai.extrema_weighting.aggregation_normalization | `none` | enum {`minmax`,`sigmoid`,`softmax`,`l1`,`l2`,`rank`,`none`} | Normalization method applied to the aggregated extrema weighting source for `hybrid`. |
+| freqai.extrema_weighting.standardization | `none` | enum {`none`,`zscore`,`robust`,`mmad`} | Standardization method applied to weights before normalization. `none`=no standardization, `zscore`=(w-μ)/σ, `robust`=(w-median)/IQR, `mmad`=(w-median)/MAD. |
+| freqai.extrema_weighting.robust_quantiles | [0.25, 0.75] | list[float] where 0 <= Q1 < Q3 <= 1 | Quantile range for robust standardization, Q1 and Q3. |
+| freqai.extrema_weighting.mmad_scaling_factor | 1.4826 | float > 0 | Scaling factor for MMAD standardization. |
+| freqai.extrema_weighting.normalization | `minmax` | enum {`minmax`,`sigmoid`,`softmax`,`l1`,`l2`,`rank`,`none`} | Normalization method applied to weights. |
+| freqai.extrema_weighting.minmax_range | [0.0, 1.0] | list[float] | Target range for `minmax` normalization, min and max. |
+| freqai.extrema_weighting.sigmoid_scale | 1.0 | float > 0 | Scale parameter for `sigmoid` normalization, controls steepness. |
+| freqai.extrema_weighting.softmax_temperature | 1.0 | float > 0 | Temperature parameter for `softmax` normalization: lower values sharpen distribution, higher values flatten it. |
+| freqai.extrema_weighting.rank_method | `average` | enum {`average`,`min`,`max`,`dense`,`ordinal`} | Ranking method for `rank` normalization. |
+| freqai.extrema_weighting.gamma | 1.0 | float (0,10] | Contrast exponent applied after normalization: >1 emphasizes extrema, values between 0 and 1 soften. |
+| _Feature parameters_ | | | |
+| freqai.feature_parameters.label_period_candles | min/max midpoint | int >= 1 | Zigzag labeling NATR horizon. |
+| freqai.feature_parameters.min_label_period_candles | 12 | int >= 1 | Minimum labeling NATR horizon used for reversals labeling HPO. |
+| freqai.feature_parameters.max_label_period_candles | 24 | int >= 1 | Maximum labeling NATR horizon used for reversals labeling HPO. |
+| freqai.feature_parameters.label_natr_ratio | min/max midpoint | float > 0 | Zigzag labeling NATR ratio. |
+| freqai.feature_parameters.min_label_natr_ratio | 9.0 | float > 0 | Minimum labeling NATR ratio used for reversals labeling HPO. |
+| freqai.feature_parameters.max_label_natr_ratio | 12.0 | float > 0 | Maximum labeling NATR ratio used for reversals labeling HPO. |
+| freqai.feature_parameters.label_frequency_candles | `auto` | int >= 2 \| `auto` | Reversals labeling frequency. `auto` = max(2, 2 \* number of whitelisted pairs). |
+| freqai.feature_parameters.label_metric | `euclidean` | string (supported: `euclidean`,`minkowski`,`cityblock`,`chebyshev`,`mahalanobis`,`seuclidean`,`jensenshannon`,`sqeuclidean`,...) | Metric used in distance calculations to ideal point. |
+| freqai.feature_parameters.label_weights | [1/7,1/7,1/7,1/7,1/7,1/7,1/7] | list[float] | Per-objective weights used in distance calculations to ideal point. Objectives: (1) number of detected reversals, (2) median swing amplitude, (3) median swing amplitude / median volatility-threshold ratio, (4) median swing volume per candle, (5) median swing speed, (6) median swing efficiency ratio, (7) median swing volume-weighted efficiency ratio. |
+| freqai.feature_parameters.label_p_order | `None` | float \| None | p-order used by `minkowski` / `power_mean` (optional). |
+| freqai.feature_parameters.label_medoid_metric | `euclidean` | string | Metric used with `medoid`. |
+| freqai.feature_parameters.label_kmeans_metric | `euclidean` | string | Metric used for k-means clustering. |
+| freqai.feature_parameters.label_kmeans_selection     | `min`                         | enum {`min`,`medoid`}                                                                                                                      | Strategy to select the trial in the best k-means cluster.                                                                                                                                                                                                                                                                                                                 |
+| freqai.feature_parameters.label_kmedoids_metric      | `euclidean`                   | string                                                                                                                                     | Metric used for k-medoids clustering.                                                                                                                                                                                                                                                                                                                                     |
+| freqai.feature_parameters.label_kmedoids_selection   | `min`                         | enum {`min`,`medoid`}                                                                                                                      | Strategy to select the trial in the best k-medoids cluster.                                                                                                                                                                                                                                                                                                                 |
+| freqai.feature_parameters.label_knn_metric | `minkowski` | string | Distance metric for KNN. |
+| freqai.feature_parameters.label_knn_p_order | `None` | float \| None | Tunable for KNN neighbor distances aggregation methods: p-order (`knn_power_mean`, default: 1.0) or quantile (`knn_quantile`, default: 0.5). (optional) |
+| freqai.feature_parameters.label_knn_n_neighbors | 5 | int >= 1 | Number of neighbors for KNN. |
+| _Predictions extrema_ | | | |
+| freqai.predictions_extrema.selection_method | `rank` | enum {`rank`,`values`,`partition`} | Extrema selection method. `rank` uses ranked extrema values, `values` uses ranked reversal values, `partition` uses sign-based partitioning. |
+| freqai.predictions_extrema.thresholds_smoothing | `mean` | enum {`mean`,`isodata`,`li`,`minimum`,`otsu`,`triangle`,`yen`,`median`,`soft_extremum`} | Thresholding method for prediction thresholds smoothing. |
+| freqai.predictions_extrema.thresholds_alpha | 12.0 | float > 0 | Alpha for `soft_extremum` thresholds smoothing. |
+| freqai.predictions_extrema.threshold_outlier | 0.999 | float (0,1) | Quantile threshold for predictions outlier filtering. |
+| freqai.predictions_extrema.extrema_fraction | 1.0 | float (0,1] | Fraction of extrema used for thresholds. `1.0` uses all, lower values keep only most significant. Applies to `rank` and `values`; ignored for `partition`. |
+| _Optuna / HPO_ | | | |
+| freqai.optuna_hyperopt.enabled | true | bool | Enables HPO. |
+| freqai.optuna_hyperopt.sampler | `tpe` | enum {`tpe`,`auto`} | HPO sampler algorithm. `tpe` uses [TPESampler](https://optuna.readthedocs.io/en/stable/reference/samplers/generated/optuna.samplers.TPESampler.html) with multivariate and group, `auto` uses [AutoSampler](https://hub.optuna.org/samplers/auto_sampler). |
+| freqai.optuna_hyperopt.storage | `file` | enum {`file`,`sqlite`} | HPO storage backend. |
+| freqai.optuna_hyperopt.continuous | true | bool | Continuous HPO. |
+| freqai.optuna_hyperopt.warm_start | true | bool | Warm start HPO with previous best value(s). |
+| freqai.optuna_hyperopt.n_startup_trials | 15 | int >= 0 | HPO startup trials. |
+| freqai.optuna_hyperopt.n_trials | 50 | int >= 1 | Maximum HPO trials. |
+| freqai.optuna_hyperopt.n_jobs | CPU threads / 4 | int >= 1 | Parallel HPO workers. |
+| freqai.optuna_hyperopt.timeout | 7200 | int >= 0 | HPO wall-clock timeout in seconds. |
+| freqai.optuna_hyperopt.label_candles_step | 1 | int >= 1 | Step for Zigzag NATR horizon search space. |
+| freqai.optuna_hyperopt.train_candles_step | 10 | int >= 1 | Step for training sets size search space. |
+| freqai.optuna_hyperopt.space_reduction | false | bool | Enable/disable HPO search space reduction based on previous best parameters. |
+| freqai.optuna_hyperopt.expansion_ratio | 0.4 | float [0,1] | HPO search space expansion ratio. |
+| freqai.optuna_hyperopt.min_resource | 3 | int >= 1 | Minimum resource per Hyperband pruner rung. |
+| freqai.optuna_hyperopt.seed | 1 | int >= 0 | HPO RNG seed. |
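A minimal config fragment exercising a few of the renamed tunables above (illustrative only — the values and the exact nesting under the user config are assumptions, not prescriptions):

```json
{
  "freqai": {
    "extrema_weighting": {
      "strategy": "hybrid",
      "source_weights": {
        "amplitude": 0.5,
        "volume_rate": 0.3,
        "volume_weighted_efficiency_ratio": 0.2
      },
      "aggregation": "weighted_sum",
      "aggregation_normalization": "none",
      "normalization": "minmax",
      "gamma": 1.0
    }
  }
}
```

Note that `volume` is no longer a valid `strategy` or `source_weights` key after this change; existing configs must be migrated to `volume_rate`.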
## ReforceXY
"none",
"amplitude",
"amplitude_threshold_ratio",
- "volume",
+ "volume_rate",
"speed",
"efficiency_ratio",
+ "volume_weighted_efficiency_ratio",
"hybrid",
]
WEIGHT_STRATEGIES: Final[tuple[WeightStrategy, ...]] = (
"none",
"amplitude",
"amplitude_threshold_ratio",
- "volume",
+ "volume_rate",
"speed",
"efficiency_ratio",
+ "volume_weighted_efficiency_ratio",
"hybrid",
)
HybridWeightSource = Literal[
"amplitude",
"amplitude_threshold_ratio",
- "volume",
+ "volume_rate",
"speed",
"efficiency_ratio",
+ "volume_weighted_efficiency_ratio",
]
HYBRID_WEIGHT_SOURCES: Final[tuple[HybridWeightSource, ...]] = (
"amplitude",
"amplitude_threshold_ratio",
- "volume",
+ "volume_rate",
"speed",
"efficiency_ratio",
+ "volume_weighted_efficiency_ratio",
)
HybridAggregation = Literal["weighted_sum", "geometric_mean"]
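The two `HybridAggregation` modes can be sketched standalone as follows (a hedged sketch with assumed helper names — the real implementation lives in `calculate_hybrid_extrema_weights`; this only illustrates the arithmetic of the two modes, assuming per-source arrays of comparable scale):

```python
import numpy as np


def aggregate(
    sources: dict[str, np.ndarray],
    source_weights: dict[str, float],
    mode: str,
) -> np.ndarray:
    """Combine per-source extrema weight arrays into one array."""
    names = list(sources)
    w = np.asarray([source_weights.get(n, 1.0) for n in names], dtype=float)
    m = np.vstack([sources[n] for n in names])  # shape: (n_sources, n_extrema)
    if mode == "weighted_sum":
        # Weighted arithmetic mean across sources.
        return (w[:, None] * m).sum(axis=0) / w.sum()
    if mode == "geometric_mean":
        # Weighted geometric mean: exp(sum(w_i * log x_i) / sum(w_i)).
        # Assumes strictly positive inputs (log of zero is -inf).
        return np.exp((w[:, None] * np.log(m)).sum(axis=0) / w.sum())
    raise ValueError(f"unknown aggregation mode: {mode}")
```

The geometric mean penalizes extrema that score near zero on any single source, whereas the weighted sum lets a strong source compensate for a weak one.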
indices: list[int],
amplitudes: list[float],
amplitude_threshold_ratios: list[float],
- volumes: list[float],
+ volume_rates: list[float],
speeds: list[float],
efficiency_ratios: list[float],
+ volume_weighted_efficiency_ratios: list[float],
source_weights: dict[str, float],
aggregation: HybridAggregation = DEFAULTS_EXTREMA_WEIGHTING["aggregation"],
aggregation_normalization: NormalizationType = DEFAULTS_EXTREMA_WEIGHTING[
"amplitude_threshold_ratio": np.asarray(
amplitude_threshold_ratios, dtype=float
),
- "volume": np.asarray(volumes, dtype=float),
+ "volume_rate": np.asarray(volume_rates, dtype=float),
"speed": np.asarray(speeds, dtype=float),
"efficiency_ratio": np.asarray(efficiency_ratios, dtype=float),
+ "volume_weighted_efficiency_ratio": np.asarray(
+ volume_weighted_efficiency_ratios, dtype=float
+ ),
}
enabled_sources: list[HybridWeightSource] = []
indices: list[int],
amplitudes: list[float],
amplitude_threshold_ratios: list[float],
- volumes: list[float],
+ volume_rates: list[float],
speeds: list[float],
efficiency_ratios: list[float],
+ volume_weighted_efficiency_ratios: list[float],
source_weights: dict[str, float],
strategy: WeightStrategy = DEFAULTS_EXTREMA_WEIGHTING["strategy"],
aggregation: HybridAggregation = DEFAULTS_EXTREMA_WEIGHTING["aggregation"],
WEIGHT_STRATEGIES[3],
WEIGHT_STRATEGIES[4],
WEIGHT_STRATEGIES[5],
+ WEIGHT_STRATEGIES[6],
}
- ): # "amplitude" / "amplitude_threshold_ratio" / "volume" / "speed" / "efficiency_ratio"
+ ): # "amplitude" / "amplitude_threshold_ratio" / "volume_rate" / "speed" / "efficiency_ratio" / "volume_weighted_efficiency_ratio"
if strategy == WEIGHT_STRATEGIES[1]: # "amplitude"
weights = np.asarray(amplitudes, dtype=float)
elif strategy == WEIGHT_STRATEGIES[2]: # "amplitude_threshold_ratio"
weights = np.asarray(amplitude_threshold_ratios, dtype=float)
- elif strategy == WEIGHT_STRATEGIES[3]: # "volume"
- weights = np.asarray(volumes, dtype=float)
+ elif strategy == WEIGHT_STRATEGIES[3]: # "volume_rate"
+ weights = np.asarray(volume_rates, dtype=float)
elif strategy == WEIGHT_STRATEGIES[4]: # "speed"
weights = np.asarray(speeds, dtype=float)
elif strategy == WEIGHT_STRATEGIES[5]: # "efficiency_ratio"
weights = np.asarray(efficiency_ratios, dtype=float)
+ elif strategy == WEIGHT_STRATEGIES[6]: # "volume_weighted_efficiency_ratio"
+ weights = np.asarray(volume_weighted_efficiency_ratios, dtype=float)
else:
weights = np.asarray([], dtype=float)
gamma=gamma,
)
- if strategy == WEIGHT_STRATEGIES[6]: # "hybrid"
+ if strategy == WEIGHT_STRATEGIES[7]: # "hybrid"
normalized_weights = calculate_hybrid_extrema_weights(
indices=indices,
amplitudes=amplitudes,
amplitude_threshold_ratios=amplitude_threshold_ratios,
- volumes=volumes,
+ volume_rates=volume_rates,
speeds=speeds,
efficiency_ratios=efficiency_ratios,
+ volume_weighted_efficiency_ratios=volume_weighted_efficiency_ratios,
source_weights=source_weights,
aggregation=aggregation,
aggregation_normalization=aggregation_normalization,
indices: list[int],
amplitudes: list[float],
amplitude_threshold_ratios: list[float],
- volumes: list[float],
+ volume_rates: list[float],
speeds: list[float],
efficiency_ratios: list[float],
+ volume_weighted_efficiency_ratios: list[float],
source_weights: dict[str, float],
strategy: WeightStrategy = DEFAULTS_EXTREMA_WEIGHTING["strategy"],
aggregation: HybridAggregation = DEFAULTS_EXTREMA_WEIGHTING["aggregation"],
indices=indices,
amplitudes=amplitudes,
amplitude_threshold_ratios=amplitude_threshold_ratios,
- volumes=volumes,
+ volume_rates=volume_rates,
speeds=speeds,
efficiency_ratios=efficiency_ratios,
+ volume_weighted_efficiency_ratios=volume_weighted_efficiency_ratios,
source_weights=source_weights,
strategy=strategy,
aggregation=aggregation,
pivots_directions: list[TrendDirection] = []
pivots_amplitudes: list[float] = []
pivots_amplitude_threshold_ratios: list[float] = []
- pivots_volumes: list[float] = []
- pivots_durations: list[float] = []
+ pivots_volume_rates: list[float] = []
pivots_speeds: list[float] = []
pivots_efficiency_ratios: list[float] = []
+ pivots_volume_weighted_efficiency_ratios: list[float] = []
last_pivot_pos: int = -1
candidate_pivot_pos: int = -1
return amplitude, amplitude_threshold_ratio
- def calculate_pivot_volume(
+ def calculate_pivot_duration(
+ *,
+ previous_pos: int,
+ current_pos: int,
+ ) -> float:
+ if previous_pos < 0 or current_pos < 0:
+ return np.nan
+ if previous_pos >= n or current_pos >= n:
+ return np.nan
+
+ return float(abs(current_pos - previous_pos))
+
+ def calculate_pivot_volume_rate(
*,
previous_pos: int,
current_pos: int,
if previous_pos >= n or current_pos >= n:
return np.nan
+ duration = calculate_pivot_duration(
+ previous_pos=previous_pos,
+ current_pos=current_pos,
+ )
+
+ if not np.isfinite(duration) or duration == 0:
+ return np.nan
+
start_pos = min(previous_pos, current_pos)
end_pos = max(previous_pos, current_pos) + 1
- return np.nansum(volumes[start_pos:end_pos])
+ total_volume = np.nansum(volumes[start_pos:end_pos])
+ return total_volume / duration
- def calculate_pivot_duration(
+ def calculate_pivot_speed(
*,
previous_pos: int,
current_pos: int,
+ amplitude: float,
) -> float:
if previous_pos < 0 or current_pos < 0:
return np.nan
if previous_pos >= n or current_pos >= n:
return np.nan
+ if not np.isfinite(amplitude):
+ return np.nan
- return abs(current_pos - previous_pos)
+ duration = calculate_pivot_duration(
+ previous_pos=previous_pos,
+ current_pos=current_pos,
+ )
+
+ if not np.isfinite(duration) or duration == 0:
+ return np.nan
+
+ return amplitude / duration
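The renaming above reflects a per-candle normalization: both swing volume and swing amplitude are divided by the swing duration in candles, turning raw totals into rates. A self-contained sketch (function names and the free-standing signatures are assumptions for illustration; the diff's closures use the enclosing scope's `n` and `volumes`):

```python
import numpy as np


def pivot_duration(previous_pos: int, current_pos: int) -> float:
    # Swing duration in candles between two pivot positions.
    return float(abs(current_pos - previous_pos))


def pivot_volume_rate(volumes: np.ndarray, previous_pos: int, current_pos: int) -> float:
    # Total swing volume divided by swing duration (volume per candle).
    duration = pivot_duration(previous_pos, current_pos)
    if not np.isfinite(duration) or duration == 0:
        return np.nan
    start = min(previous_pos, current_pos)
    end = max(previous_pos, current_pos) + 1
    return float(np.nansum(volumes[start:end])) / duration


def pivot_speed(amplitude: float, previous_pos: int, current_pos: int) -> float:
    # Swing amplitude divided by swing duration (amplitude per candle).
    duration = pivot_duration(previous_pos, current_pos)
    if not np.isfinite(amplitude) or duration == 0:
        return np.nan
    return amplitude / duration
```

Normalizing by duration makes swings of different lengths comparable, which matters when these quantities feed the label-selection objectives and the extrema weighting sources.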
def calculate_pivot_efficiency_ratio(
*,
return net_move / path_length
+ def calculate_pivot_volume_weighted_efficiency_ratio(
+ *,
+ previous_pos: int,
+ current_pos: int,
+ ) -> float:
+ if previous_pos < 0 or current_pos < 0:
+ return np.nan
+ if previous_pos >= n or current_pos >= n:
+ return np.nan
+
+ start_pos = min(previous_pos, current_pos)
+ end_pos = max(previous_pos, current_pos) + 1
+ if (end_pos - start_pos) < 2:
+ return np.nan
+
+ closes_slice = closes[start_pos:end_pos]
+ close_diffs = np.diff(closes_slice)
+ net_move = float(abs(closes_slice[-1] - closes_slice[0]))
+
+ volumes_slice = volumes[start_pos + 1 : end_pos]
+ total_volume = np.nansum(volumes_slice)
+ if not np.isfinite(total_volume) or np.isclose(total_volume, 0.0):
+ return np.nan
+
+ volume_weights = volumes_slice / total_volume
+
+ vw_path_length = float(np.nansum(np.abs(close_diffs) * volume_weights))
+
+ if not (np.isfinite(vw_path_length) and np.isfinite(net_move)):
+ return np.nan
+ if np.isclose(vw_path_length, 0.0):
+ return np.nan
+
+ return net_move / vw_path_length
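The new metric can be exercised in isolation with a small sketch (a free-standing adaptation under assumed names — the diff's version is a closure over `closes`, `volumes`, and `n`):

```python
import numpy as np


def volume_weighted_efficiency_ratio(closes, volumes) -> float:
    """Net close-to-close move divided by the volume-weighted
    sum of absolute per-candle close changes over the swing."""
    closes = np.asarray(closes, dtype=float)
    volumes = np.asarray(volumes, dtype=float)
    if closes.size < 2:
        return np.nan
    close_diffs = np.diff(closes)
    net_move = float(abs(closes[-1] - closes[0]))
    # Volumes align with the per-candle diffs (first candle excluded),
    # mirroring the volumes[start_pos + 1 : end_pos] slice in the diff.
    vol = volumes[1:]
    total = np.nansum(vol)
    if not np.isfinite(total) or np.isclose(total, 0.0):
        return np.nan
    vw_path = float(np.nansum(np.abs(close_diffs) * (vol / total)))
    if not (np.isfinite(vw_path) and np.isfinite(net_move)):
        return np.nan
    if np.isclose(vw_path, 0.0):
        return np.nan
    return net_move / vw_path
```

Unlike the classic efficiency ratio, this quantity is not bounded by 1: the denominator is a volume-weighted average of per-candle moves rather than their sum, so a clean multi-candle trend scores well above 1 while choppy, high-volume noise pulls it down.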
+
def add_pivot(pos: int, value: float, direction: TrendDirection):
nonlocal last_pivot_pos
if pivots_indices and indices[pos] == pivots_indices[-1]:
current_value=value,
)
)
- volume = calculate_pivot_volume(
+ volume_rate = calculate_pivot_volume_rate(
previous_pos=last_pivot_pos,
current_pos=pos,
)
- duration = calculate_pivot_duration(
+ speed = calculate_pivot_speed(
previous_pos=last_pivot_pos,
current_pos=pos,
+ amplitude=amplitude,
)
- if np.isfinite(amplitude) and np.isfinite(duration) and duration > 0.0:
- speed = amplitude / duration
- else:
- speed = np.nan
efficiency_ratio = calculate_pivot_efficiency_ratio(
previous_pos=last_pivot_pos,
current_pos=pos,
)
+ volume_weighted_efficiency_ratio = (
+ calculate_pivot_volume_weighted_efficiency_ratio(
+ previous_pos=last_pivot_pos,
+ current_pos=pos,
+ )
+ )
else:
amplitude = np.nan
amplitude_threshold_ratio = np.nan
- volume = np.nan
- duration = np.nan
+ volume_rate = np.nan
speed = np.nan
efficiency_ratio = np.nan
+ volume_weighted_efficiency_ratio = np.nan
pivots_amplitudes.append(amplitude)
pivots_amplitude_threshold_ratios.append(amplitude_threshold_ratio)
- pivots_volumes.append(volume)
- pivots_durations.append(duration)
+ pivots_volume_rates.append(volume_rate)
pivots_speeds.append(speed)
pivots_efficiency_ratios.append(efficiency_ratio)
+ pivots_volume_weighted_efficiency_ratios.append(
+ volume_weighted_efficiency_ratio
+ )
last_pivot_pos = pos
reset_candidate_pivot()
pivots_directions,
pivots_amplitudes,
pivots_amplitude_threshold_ratios,
- pivots_volumes,
- pivots_durations,
+ pivots_volume_rates,
pivots_speeds,
pivots_efficiency_ratios,
+ pivots_volume_weighted_efficiency_ratios,
)