New Release v0.1.6
Release Highlights:
- Addition of New Models: We've expanded our model suite to include the following architectures (a usage sketch follows this list):
  - FT-Transformer: Leverages transformer encoders for improved performance on tabular data.
  - MLP (Multi-Layer Perceptron): A classical deep learning model for a wide range of tabular data tasks.
  - ResNet: Adapted from the classical ResNet architecture and a proven strong baseline for tabular tasks.
  - TabTransformer: Uses transformer layers to build contextual embeddings of categorical features.
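
The new models are intended to slot into the same scikit-learn-style workflow as the existing Mambular estimators. The snippet below is a minimal sketch, assuming the release exposes an `FTTransformerClassifier` with a fit/predict interface under `mambular.models`; the class name, import path, and default hyperparameters are assumptions, not confirmed API.

```python
# Minimal sketch of training one of the new architectures.
# Assumptions: FTTransformerClassifier exists under mambular.models and
# follows the same sklearn-style fit/predict interface as MambularClassifier.
import pandas as pd
from sklearn.model_selection import train_test_split
from mambular.models import FTTransformerClassifier  # assumed class / import path

# Toy tabular data with numerical and categorical columns.
df = pd.DataFrame({
    "age":    [25, 32, 47, 51, 38, 29, 44, 36],
    "income": [40e3, 60e3, 82e3, 90e3, 58e3, 43e3, 75e3, 52e3],
    "city":   ["A", "B", "A", "C", "B", "C", "A", "B"],
    "label":  [0, 1, 1, 1, 0, 0, 1, 0],
})
X, y = df.drop(columns="label"), df["label"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = FTTransformerClassifier()      # hypothetical default configuration
model.fit(X_train, y_train)
print(model.predict(X_test))
```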
- Bidirectional and Feature Interaction Capabilities: Mambular now includes bidirectional processing and enhanced feature interaction mechanisms, enabling richer data representations and improving model accuracy (a conceptual sketch follows below).
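
The notes don't detail how the bidirectional pass is wired in, but the usual pattern for bidirectional sequence models is to process the feature-token sequence in both directions and merge the two views. The PyTorch sketch below illustrates only that pattern; `block_fwd` and `block_bwd` are placeholders for any sequence layer, and this is not Mambular's actual implementation.

```python
import torch
import torch.nn as nn

class BidirectionalBlock(nn.Module):
    """Conceptual sketch: run a sequence layer over the feature tokens in both
    directions and merge the two views. Not Mambular's actual implementation;
    block_fwd / block_bwd stand in for any sequence layer (e.g. a Mamba block)."""

    def __init__(self, block_fwd: nn.Module, block_bwd: nn.Module):
        super().__init__()
        self.block_fwd = block_fwd
        self.block_bwd = block_bwd

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_feature_tokens, dim)
        out_fwd = self.block_fwd(x)                 # left-to-right pass
        out_bwd = self.block_bwd(x.flip(dims=[1]))  # right-to-left pass
        out_bwd = out_bwd.flip(dims=[1])            # realign token order
        return out_fwd + out_bwd                    # combine both directions

# Toy usage with linear layers standing in for the real sequence blocks.
layer = BidirectionalBlock(nn.Linear(16, 16), nn.Linear(16, 16))
tokens = torch.randn(4, 10, 16)                     # (batch, tokens, dim)
print(layer(tokens).shape)                          # torch.Size([4, 10, 16])
```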
- Architectural Restructuring: The internal architecture has been restructured to facilitate the easy integration of new models. This modular approach simplifies the process of extending Mambular with custom models (see the sketch below).
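
The restructuring is described only at a high level; the sketch below shows the general pattern such a modular design enables, where a custom architecture implements just the network and inherits shared plumbing from a base class. `BaseTabularModel` and its constructor signature are illustrative names, not Mambular's actual API.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of the extension point a modular architecture enables:
# a custom model implements only the network itself, while preprocessing,
# the training loop, and the sklearn-style wrappers are shared.
# BaseTabularModel and its signature are illustrative, not Mambular's API.

class BaseTabularModel(nn.Module):
    def __init__(self, num_numerical: int, cat_cardinalities: list, output_dim: int):
        super().__init__()
        self.num_numerical = num_numerical
        self.cat_cardinalities = cat_cardinalities
        self.output_dim = output_dim

class MyCustomModel(BaseTabularModel):
    """A custom architecture that plugs into the shared framework."""

    def __init__(self, num_numerical, cat_cardinalities, output_dim, hidden_dim=64):
        super().__init__(num_numerical, cat_cardinalities, output_dim)
        # One embedding table per categorical feature.
        self.embeddings = nn.ModuleList(
            [nn.Embedding(card, hidden_dim) for card in cat_cardinalities]
        )
        in_dim = num_numerical + hidden_dim * len(cat_cardinalities)
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, output_dim)
        )

    def forward(self, num_feats: torch.Tensor, cat_feats: torch.Tensor) -> torch.Tensor:
        # num_feats: (batch, num_numerical), cat_feats: (batch, num_categorical)
        cat_emb = [emb(cat_feats[:, i]) for i, emb in enumerate(self.embeddings)]
        x = torch.cat([num_feats, *cat_emb], dim=1)
        return self.net(x)

# Toy forward pass: 3 numerical features, 2 categorical features with 5 and 8 levels.
model = MyCustomModel(num_numerical=3, cat_cardinalities=[5, 8], output_dim=1)
out = model(torch.randn(4, 3), torch.randint(0, 5, (4, 2)))
print(out.shape)  # torch.Size([4, 1])
```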
- New Preprocessing Methods: We have introduced new preprocessing techniques to better prepare your data for modeling (illustrated after this list):
  - Quantile Preprocessing: Transforms numerical features to follow a uniform or normal distribution, improving robustness to outliers.
  - Polynomial Features: Generates polynomial and interaction features to capture more complex relationships within the data.
  - Spline Transformation: Applies piecewise polynomial functions to numerical features, effectively capturing nonlinear relationships.
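
These three techniques mirror standard transformers; the sketch below illustrates each one with its scikit-learn equivalent (QuantileTransformer, PolynomialFeatures, SplineTransformer) purely for intuition, and is not necessarily how Mambular applies them internally.

```python
import numpy as np
from sklearn.preprocessing import QuantileTransformer, PolynomialFeatures, SplineTransformer

X = np.array([[1.0,  200.0],
              [2.0,  220.0],
              [3.0, 5000.0],   # outlier in the second column
              [4.0,  240.0]])

# Quantile preprocessing: map each feature to a normal distribution,
# which bounds the influence of the outlier.
quantile = QuantileTransformer(output_distribution="normal", n_quantiles=4)
print(quantile.fit_transform(X))

# Polynomial features: add squared terms and the pairwise interaction x1*x2.
poly = PolynomialFeatures(degree=2, include_bias=False)
print(poly.fit_transform(X))

# Spline transformation: expand each feature into piecewise-polynomial basis
# functions to capture nonlinear effects.
spline = SplineTransformer(n_knots=3, degree=2)
print(spline.fit_transform(X))
```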