- Drop support for Julia v0.4.
- Added support for NVVM.
- Updated supported version of MXNet to 0.9.3.
- New optimizers (@Arkoniak).
- Track specific libmxnet version for each release.
- Migrated documentation system to Documenter.jl (@vchuravy)
- Simplified building by using Julia's OpenBLAS (@staticfloat)
- Freezing parameters (@vchuravy)
- Support `DType` for `NDArray` (@vchuravy)
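
  The `DType` support means `NDArray` is no longer limited to `Float32`. A minimal sketch of the idea, assuming the element type is passed as a leading argument to the constructors (exact signatures may differ between releases):

  ```julia
  using MXNet

  # Float32 is the historical default element type for NDArray.
  a = mx.zeros(2, 3)
  println(eltype(a))        # Float32

  # With DType support, other element types can be requested explicitly
  # (assumed form: element type as the first argument).
  b = mx.zeros(Float16, 2, 3)
  println(eltype(b))        # Float16

  # Copying back to a plain Julia Array preserves the element type.
  println(typeof(copy(b)))  # Array{Float16,2}
  ```
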
- Fix compatibility with Julia v0.5.
- Fix seg-faults introduced by upstream API changes.
- Fix compatibility with Julia v0.4.2 (@BigEpsilon)
- Metrics in epoch callbacks (@kasiabozek)
- Variants of Xavier initializers (@vchuravy)
- More arithmetic operators on symbolic nodes
- Basic interface for symbolic node attributes (@vchuravy)
- char-lstm example.
- Network visualization via GraphViz.
- NN-factory for common models.
- Convenient `@nd_as_jl` macro to work with `NDArray` as Julia Arrays (see the sketch below).
- Refactoring: `Symbol` -> `SymbolicNode`.
- More evaluation metrics (@vchuravy, @Andy-P)
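
  A short usage sketch of the `@nd_as_jl` macro, modeled on the package documentation; the `ro`/`rw` keyword arguments (read-only and read-write arrays) follow the documented form, but details may have changed across releases:

  ```julia
  using MXNet

  x = mx.zeros(3)
  y = mx.ones(3)
  z = mx.zeros((3, 2))

  # Inside the block, x, y and z behave as plain Julia Arrays; writes to
  # the rw (read-write) arrays are copied back into the NDArrays on exit.
  @mx.nd_as_jl ro=(x, y) rw=z begin
      z[:, 1] = y
      z[:, 2] = x
  end
  ```
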
- ADAM optimizer (@cbecker)
- Improved data provider API.
- More documentation.
- Fix a bug in array data iterator (@vchuravy)
- Model prediction API.
- Model checkpoint loading and saving.
- IJulia Notebook example of using a pre-trained ImageNet model as a classifier.
- Symbol saving and loading.
- NDArray saving and loading (see the sketch after this list).
- Optimizer gradient clipping.
- Model training callback APIs, default checkpoint and speedometer callbacks.
- Julia Array / NDArray data iterator.
- Sphinx documentation system and documents for dynamically imported libmxnet APIs.
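
  A sketch of the `NDArray` saving/loading entry above, assuming `mx.save` accepts a filename plus a dict of arrays and `mx.load` takes the target type, as in the documented I/O helpers; names and file path are illustrative only:

  ```julia
  using MXNet

  # Save a named collection of NDArrays to a single file...
  params = Dict(:weight => mx.ones(2, 3), :bias => mx.zeros(3))
  mx.save("params.nd", params)

  # ...and load them back as a Dict of Symbol => NDArray.
  restored = mx.load("params.nd", mx.NDArray)
  copy(restored[:weight])   # convert back to a Julia Array
  ```
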
- Fix a bug in build script that causes Julia REPL to exit.
Initial release.
- Basic libmxnet API.
- Basic documentation, overview and MNIST tutorial.
- Working MNIST and CIFAR-10 examples, with multi-GPU training.
- Automatic building of libmxnet with BinDeps.jl.