Releases · ToluClassics/mlx-transformers
v0.1.4
What's Changed
- Add Fuyu and Persimmon by @ToluClassics in #15
- Fix the cache used in generation models by @ToluClassics in #16
- Add OpenELM by @ToluClassics in #17
Full Changelog: v0.1.3...v0.1.4
v0.1.3
What's Changed
- Add Phi 2.0 and 3.0 by @ToluClassics in #8
- Add Llama by @ToluClassics in #3
- Add Bert Classification Model by @ToluClassics in #2
- Implemented RobertaForSequenceClassification by @Seun-Ajayi in #1
- Implemented Bert sub-tasks by @Seun-Ajayi in #4
- Add a mixin for loading models directly from Hugging Face by @ToluClassics in #6
- Code Refactor and NLLB Example by @ToluClassics in #7
- Implemented XLMRoberta sub-tasks by @Seun-Ajayi in #5
- Version 0.1.3 by @ToluClassics in #9
New Contributors
- @ToluClassics made their first contribution in #2
- @Seun-Ajayi made their first contribution in #1
Full Changelog: V0.0.1-pre-release...v0.1.3
V0.0.1-pre-release
Pre-release. MLX Transformers is a library that provides model implementations in MLX. It uses a model interface similar to HuggingFace Transformers and provides a way to load and run models on Apple Silicon devices, shipping with a few model implementations.
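The changelog above mentions a mixin for loading models directly from Hugging Face (#6) and a model interface similar to HuggingFace Transformers. As a rough, hypothetical sketch of that mixin pattern (class and attribute names here are illustrative, not the library's actual API):

```python
class FromPretrainedMixin:
    """Hypothetical mixin: any model class that inherits it gains a
    from_pretrained constructor, mirroring the HuggingFace-style interface."""

    @classmethod
    def from_pretrained(cls, model_name: str):
        # In a real library this step would download checkpoint weights
        # from the Hugging Face Hub and convert them to MLX arrays;
        # this sketch only records the checkpoint name.
        model = cls()
        model.checkpoint = model_name
        return model


class BertModel(FromPretrainedMixin):
    """Placeholder model class; real implementations would define layers."""


model = BertModel.from_pretrained("bert-base-uncased")
print(model.checkpoint)  # -> bert-base-uncased
```

Keeping the loading logic in a mixin means every model class (Bert, Roberta, Llama, etc.) shares one checkpoint-loading path instead of duplicating it per model.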
Full Changelog: https://github.com/ToluClassics/mlx-transformers/commits/V0.0.1-pre-release