- Add an EarlyStopping callback to LSTM and GRU model training
- Monitor validation loss with a patience of 10 epochs
- Automatically stop training when performance plateaus
- Restore the best weights found during training
- Implement dropout layers with a 0.2 rate after each LSTM/GRU layer
- Aim to improve model generalization and prediction accuracy
- Modify the `create_lstm_model()` and `create_gru_model()` functions (see the sketch below)
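A minimal sketch of how these changes might look in `create_lstm_model()`, assuming a Keras `Sequential` model; the layer sizes and the `n_timesteps`/`n_features` names are illustrative placeholders, not the repo's actual values:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, Dense
from tensorflow.keras.callbacks import EarlyStopping

def create_lstm_model(n_timesteps, n_features):
    model = Sequential([
        LSTM(64, return_sequences=True, input_shape=(n_timesteps, n_features)),
        Dropout(0.2),   # 0.2 dropout after each recurrent layer
        LSTM(32),
        Dropout(0.2),
        Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

early_stop = EarlyStopping(
    monitor="val_loss",        # watch validation loss
    patience=10,               # stop after 10 epochs without improvement
    restore_best_weights=True  # roll back to the best weights seen
)
# model.fit(X_train, y_train, validation_data=(X_val, y_val),
#           epochs=200, callbacks=[early_stop])
```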
- Added L1 and L2 regularization to the LSTM and GRU models to reduce overfitting
- Modified the `create_lstm_model()` and `create_gru_model()` functions to include regularization (sketched below)
- Updated the `analyze_and_predict_stock()` function to train on the augmented data
- Improved the `calculate_ensemble_weights()` function to handle cases where all model scores are zero or negative
- Resolved a ZeroDivisionError in `calculate_ensemble_weights()` by adding a check for a zero sum of weights (see the second sketch below)
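A sketch of the L1/L2 regularization change, assuming Keras layers; the penalty strengths below are illustrative assumptions, not the values used in the repo:

```python
from tensorflow.keras.layers import LSTM
from tensorflow.keras.regularizers import l1_l2

regularized_lstm = LSTM(
    64,
    return_sequences=True,
    kernel_regularizer=l1_l2(l1=1e-5, l2=1e-4),     # penalize input weights
    recurrent_regularizer=l1_l2(l1=1e-5, l2=1e-4),  # penalize recurrent weights
)
```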
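And a sketch of the zero-sum guard in `calculate_ensemble_weights()`; the exact weighting scheme is an assumption, but it shows the described fallback when all scores are zero or negative:

```python
import numpy as np

def calculate_ensemble_weights(scores):
    # Clip negative scores to zero so bad models get no weight.
    scores = np.clip(np.asarray(scores, dtype=float), 0.0, None)
    total = scores.sum()
    if total == 0.0:
        # All scores zero or negative: fall back to equal weights
        # instead of dividing by zero.
        return np.full(len(scores), 1.0 / len(scores))
    return scores / total
```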
- Add walk-forward optimization for more realistic model evaluation (sketched after this list)
- Introduce robust feature selection using recursive feature elimination (see the RFE sketch below)
- Implement time-series cross-validation for hyperparameter tuning
- Adjust ensemble weighting based on out-of-sample performance
- Add L1/L2 regularization to the LSTM and GRU models
- Implement simple data augmentation for time series (see the jitter sketch below)
- Enhance feature engineering with additional relevant indicators
These changes aim to reduce overfitting and improve the model's
generalization to unseen data, which should yield more reliable predictions.
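A sketch of walk-forward evaluation under stated assumptions: the model is refit on an expanding window and scored on the next block, and `make_model` and `score` are hypothetical hooks standing in for the project's model factories and metric:

```python
import numpy as np

def walk_forward_evaluate(X, y, make_model, score, n_splits=5, test_size=30):
    results = []
    for i in range(n_splits):
        # Expanding training window, followed by a fixed-size test block.
        end_train = len(X) - (n_splits - i) * test_size
        end_test = end_train + test_size
        model = make_model()                       # fresh model each fold
        model.fit(X[:end_train], y[:end_train])
        preds = model.predict(X[end_train:end_test])
        results.append(score(y[end_train:end_test], preds))
    return np.mean(results)
```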
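A sketch of RFE-based feature selection; the estimator choice and `n_features_to_select` are assumptions:

```python
from sklearn.feature_selection import RFE
from sklearn.ensemble import RandomForestRegressor

selector = RFE(
    estimator=RandomForestRegressor(n_estimators=100, random_state=42),
    n_features_to_select=10,  # keep the 10 strongest features
)
# X_selected = selector.fit_transform(X_train, y_train)
```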
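One common form of simple time-series augmentation is Gaussian jittering; the noise scale here is an illustrative assumption, not necessarily what the project uses:

```python
import numpy as np

def jitter(X, sigma=0.01, seed=None):
    # Add small Gaussian noise to each sample to create extra training data.
    rng = np.random.default_rng(seed)
    return X + rng.normal(0.0, sigma, size=X.shape)

# X_aug = np.concatenate([X_train, jitter(X_train)], axis=0)
# y_aug = np.concatenate([y_train, y_train], axis=0)
```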
- Added feature selection using SelectKBest with f_regression (sketched below)
- Introduced ensemble weighting based on validation performance
- Implemented a function to calculate and display overfitting scores (see the overfitting-score sketch below)
- Added more regularization parameters for Random Forest and XGBoost in hyperparameter tuning
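A minimal sketch of the SelectKBest step; `k=15` and the random matrix standing in for the real engineered feature set are placeholders:

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_regression

X, y = np.random.rand(200, 30), np.random.rand(200)  # stand-in feature matrix
selector = SelectKBest(score_func=f_regression, k=15)
X_selected = selector.fit_transform(X, y)            # keeps the 15 best features
```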
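The exact overfitting-score formula is not documented here; one plausible sketch is the relative gap between training and validation error:

```python
def overfitting_score(train_error, val_error):
    # Assumed definition: how much worse validation error is than training
    # error, relative to training error. Larger values suggest overfitting.
    if train_error == 0:
        return float("inf") if val_error > 0 else 0.0
    return (val_error - train_error) / train_error
```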
- Modified the `prepare_data()` function to create X and y with a one-step offset for proper time series forecasting (sketched after this list)
- Updated Random Forest and XGBoost prediction in `train_and_evaluate_model` to predict one step ahead
- Modified `ensemble_predict` and `weighted_ensemble_predict` functions to predict one step ahead
- Updated hyperparameter tuning for Random Forest and XGBoost to use `TimeSeriesSplit` for cross-validation (see the tuning sketch after this list)
- Adjusted the quick test mode to use a smaller parameter space and fewer iterations for faster tuning
- Corrected `calculate_ensemble_weights` function to properly handle model tuples
- Ensured consistent one-step-ahead prediction across all models and evaluation steps
- Enhanced the README with more detailed information about recent modifications and tool capabilities
- Optimized the hyperparameter tuning process, significantly reducing tuning times for the Random Forest and XGBoost models
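A sketch of the one-step offset in `prepare_data()`: each feature window ends at time t-1 and the target is the value at time t, so every model predicts one step ahead. The `window` length is an illustrative assumption:

```python
import numpy as np

def prepare_data(series, window=10):
    X, y = [], []
    for t in range(window, len(series)):
        X.append(series[t - window:t])  # lookback window ending at t-1
        y.append(series[t])             # target is one step ahead
    return np.array(X), np.array(y)
```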
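And a sketch of the `TimeSeriesSplit`-based tuning for Random Forest; the parameter grid, `n_iter`, and scoring are placeholders (quick-test mode would shrink the grid and iteration count):

```python
from sklearn.model_selection import RandomizedSearchCV, TimeSeriesSplit
from sklearn.ensemble import RandomForestRegressor

tscv = TimeSeriesSplit(n_splits=5)  # chronological folds, no shuffling
search = RandomizedSearchCV(
    RandomForestRegressor(random_state=42),
    param_distributions={
        "n_estimators": [100, 200, 400],
        "max_depth": [3, 5, 10, None],
        "min_samples_leaf": [1, 2, 5],
    },
    n_iter=20,
    cv=tscv,
    scoring="neg_mean_squared_error",
)
# search.fit(X_train, y_train)
```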
- Implemented early stopping for neural networks to prevent overfitting
- Added more technical indicators for enhanced feature engineering
- Introduced a weighted ensemble method for combining model predictions
- Implemented a simple trading strategy based on the predictions (sketched below, together with the weighted ensemble)
- Enhanced visualization to include trading strategy performance
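A sketch of the weighted ensemble and the strategy built on it, assuming a long/flat rule: go long when the ensemble predicts a higher next price, stay flat otherwise. All function names here are illustrative, not the repo's exact API:

```python
import numpy as np

def weighted_ensemble_predict(predictions, weights):
    # predictions has shape (n_models, n_samples); weights sum to 1.
    return np.average(predictions, axis=0, weights=weights)

def simple_strategy_returns(prices, predicted_next):
    prices = np.asarray(prices, dtype=float)
    predicted_next = np.asarray(predicted_next, dtype=float)
    actual_returns = np.diff(prices) / prices[:-1]               # realized returns
    signals = (predicted_next[:-1] > prices[:-1]).astype(float)  # 1 = long, 0 = flat
    return signals * actual_returns                              # strategy returns
```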