This project is an experiment in predicting the prices of bandwidth reservations from different network operators at a given moment using deep learning techniques. By harnessing LSTM and Transformer architectures, the solution aims to deliver accurate predictions across diverse datasets and time series of varying complexity.
- LSTM Model: Utilizes Long Short-Term Memory networks, a type of recurrent neural network suited to capturing long-term dependencies in time series data.
- Transformer Model: Employs the Transformer architecture, known for its efficiency in sequence transduction tasks. A minimal sketch of both model families follows this list.
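The sketch below shows, in broad strokes, what these two model families look like for one-step price prediction, assuming PyTorch. The class names, layer sizes, and single-value price output are illustrative placeholders, not the project's actual architectures (those live in models/).

```python
import torch
import torch.nn as nn

class LSTMForecaster(nn.Module):
    """Predicts the next bandwidth price from a window of past prices."""
    def __init__(self, n_features=1, hidden_size=64, num_layers=2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, num_layers, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):             # x: (batch, seq_len, n_features)
        out, _ = self.lstm(x)         # out: (batch, seq_len, hidden_size)
        return self.head(out[:, -1])  # predict from the last time step

class TransformerForecaster(nn.Module):
    """Same task, but with self-attention over the input window.
    (Positional encoding is omitted here for brevity.)"""
    def __init__(self, n_features=1, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, 1)

    def forward(self, x):
        h = self.encoder(self.embed(x))
        return self.head(h[:, -1])
```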
- datasets/: Contains the raw data files.
- models/: Contains the neural network architectures for both LSTM and Transformer models.
- config/: Configuration settings and hyperparameters (an illustrative sketch follows this list).
- utils/: Utility functions for data processing and batching.
- trainers/: Training logic for the neural network models.
- visualisations/: Scripts and functions for visualizing training results.
- out/: Contains the trained models and the measurements used for visualization.
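For orientation, a hyperparameter configuration for this kind of setup might look like the following; the keys and values are illustrative assumptions, not the actual contents of config/.

```python
# Hypothetical hyperparameters, illustrative only; see config/ for the
# settings the project actually uses.
CONFIG = {
    "seq_len": 48,          # number of past price points fed to the model
    "batch_size": 32,
    "learning_rate": 1e-3,
    "epochs": 100,
    "model": "lstm",        # or "transformer"
}
```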
All the dependencies required for this project are listed in the pyproject.toml file for those using Poetry, and in the requirements.txt file for pip users.
- If Poetry is not installed, install it first.
- From the project's root directory, install the dependencies:

```bash
poetry install
```
For those not using Poetry, dependencies can also be installed using pip:

```bash
pip install -r requirements.txt
```
- Model Training: Execute the paper.py script to train both the LSTM and Transformer models:

```bash
python paper.py
```
- Result Visualization: After training, visualize the results by executing the plot.py script:

```bash
python plot.py
```
Either script can also be run through Poetry, e.g. `poetry run python paper.py`.
Once the models are trained, you'll find the saved models and performance metrics in the out/ directory. The generated visualizations provide insight into model performance, including loss and accuracy over time.
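As a rough illustration, a trained model could be reloaded for inference as sketched below, assuming paper.py saves PyTorch state dicts; the file name out/lstm.pt and the LSTMForecaster class (from the earlier sketch) are hypothetical stand-ins for whatever the script actually writes to out/.

```python
import torch

# Hypothetical: assumes paper.py saved a state dict to out/lstm.pt and that
# LSTMForecaster (defined in the earlier sketch) matches the saved weights.
model = LSTMForecaster()
model.load_state_dict(torch.load("out/lstm.pt", map_location="cpu"))
model.eval()

with torch.no_grad():
    window = torch.randn(1, 48, 1)  # one dummy batch: 48 past prices, 1 feature
    next_price = model(window)
print(next_price.item())
```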
This project is licensed under the MIT License - see the LICENSE.md file for details.