stochastic_optimizers

This crate provides implementations of common stochastic gradient optimization algorithms. They are designed to be lightweight, flexible and easy to use.

Currently implemented:

  • Adam
  • SGD
  • AdaGrad

The crate does not provide automatic differentiation; the gradient has to be supplied by the user.

Examples

use stochastic_optimizers::{Adam, Optimizer};
//minimise the function (x-4)^2
let start = -3.0;
let mut optimizer = Adam::new(start, 0.1);

for _ in 0..10000 {
   let current_parameter = optimizer.parameters();

   // d/dx (x-4)^2
   let gradient = 2.0 * current_parameter - 8.0;

   optimizer.step(&gradient);
}

assert_eq!(optimizer.into_parameters(), 4.0);

The parameters are owned by the optimizer; a reference can be obtained via parameters(). After optimization the parameters can be recovered via into_parameters().
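
SGD and AdaGrad are driven through the same Optimizer interface. Below is a minimal sketch using SGD, assuming it is exported under that name and that its constructor, like Adam::new, takes the initial parameters and a learning rate:

use stochastic_optimizers::{Optimizer, SGD};
// minimise (x-4)^2 again, this time with plain SGD
// assumed constructor: initial parameters and learning rate, analogous to Adam::new
let mut optimizer = SGD::new(-3.0, 0.1);

for _ in 0..10000 {
   // d/dx (x-4)^2
   let gradient = 2.0 * optimizer.parameters() - 8.0;

   optimizer.step(&gradient);
}

// on this convex quadratic, SGD should land very close to the minimum
assert!((optimizer.into_parameters() - 4.0).abs() < 1e-6);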

What types can be optimized

All types which implement the Parameters trait can be optimized. Implementations for the standard types f32, f64, Vec<T : Parameters> and [T : Parameters ; N] are provided.
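
For example, a Vec<f64> can be optimized out of the box. The following is a minimal sketch minimising the squared distance to a target vector, using only the calls shown above (new, parameters, step, into_parameters):

use stochastic_optimizers::{Adam, Optimizer};
// minimise sum_i (x[i] - targets[i])^2 over a Vec<f64>
let targets = vec![1.0, 2.0, 3.0];
let mut optimizer = Adam::new(vec![0.0; 3], 0.1);

for _ in 0..10000 {
   // element-wise gradient: d/dx_i (x_i - t_i)^2 = 2 * (x_i - t_i)
   let gradient: Vec<f64> = optimizer
       .parameters()
       .iter()
       .zip(&targets)
       .map(|(x, t)| 2.0 * (x - t))
       .collect();

   optimizer.step(&gradient);
}

for (x, t) in optimizer.into_parameters().iter().zip(&targets) {
   assert!((x - t).abs() < 1e-6);
}

The gradient is passed as a Vec<f64> of the same length as the parameters, mirroring the scalar example.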

It is relatively easy to implement it for custom types; see Parameters.

ndarray

By enabling the ndarray feature you can use ndarray's Array as Parameters.
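
A minimal sketch with an Array1<f64> as the parameter type, assuming the feature is enabled in Cargo.toml (features = ["ndarray"]) and that the gradient is passed as an Array1<f64> of the same shape:

use ndarray::Array1;
use stochastic_optimizers::{Adam, Optimizer};

// minimise ||x - target||^2 over an ndarray vector
let target = Array1::from(vec![1.0, -2.0, 0.5]);
let start: Array1<f64> = Array1::zeros(3);
let mut optimizer = Adam::new(start, 0.1);

for _ in 0..10000 {
   // gradient of ||x - target||^2 is 2 * (x - target)
   let gradient = 2.0 * (optimizer.parameters() - &target);

   optimizer.step(&gradient);
}

let result = optimizer.into_parameters();
assert!((&result - &target).iter().all(|d| d.abs() < 1e-6));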

Unit tests

The unit tests require libtorch via the tch crate. See the tch repository on GitHub for installation details.

License

Licensed under either of

  • Apache License, Version 2.0 (LICENSE-APACHE)
  • MIT license (LICENSE-MIT)

at your option.

Contribution

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.
