Releases · google/jaxopt
jaxopt-v0.4.1
Bug fixes and enhancements
- Improvements in jaxopt.LBFGS: fixed a bug when using use_gamma=True, added a stepsize option, and strengthened tests, by Mathieu Blondel.
- Fixed link in resnet notebook, by Fabian Pedregosa.
Contributors
Fabian Pedregosa, Mathieu Blondel.
jaxopt-v0.4
New features
- Added solver jaxopt.LevenbergMarquardt, by Amir Saadat; see the usage sketch after this list.
- Added solver jaxopt.BoxCDQP, by Mathieu Blondel.
- Added projection_hypercube, by Mathieu Blondel.
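A minimal usage sketch of the new jaxopt.LevenbergMarquardt solver, fitting a small exponential model by nonlinear least squares. It assumes the residual_fun keyword and the standard run(init_params) interface; the data and maxiter value are illustrative.

```python
import jax.numpy as jnp
import jaxopt

# Illustrative data: fit y ≈ a * exp(b * t) by nonlinear least squares.
t = jnp.linspace(0.0, 1.0, 20)
y = 2.0 * jnp.exp(0.5 * t)

def residual_fun(params):
    a, b = params
    return a * jnp.exp(b * t) - y  # vector of residuals

lm = jaxopt.LevenbergMarquardt(residual_fun=residual_fun, maxiter=30)
result = lm.run(jnp.array([1.0, 0.0]))
print(result.params)  # should approach (2.0, 0.5)
```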
Bug fixes and enhancements
- Fixed solve_normal_cg when the linear operator is “nonsquare” (i.e., does not map to a space of the same dimension), by Mathieu Blondel.
- Fixed edge case in jaxopt.Bisection, by Mathieu Blondel.
- Replaced deprecated tree_multimap with tree_map, by Fan Yang.
- Added support for leaf cond pytrees in tree_where, by Felipe Llinares.
- Added official support for Python 3.10, by Jeppe Klitgaard.
- In scipy wrappers, converted pytree leaves to jax arrays to determine their shape/dtype, by Roy Frostig.
- Converted the “Resnet” and “Adversarial Training” examples to notebooks, by Fabian Pedregosa.
Contributors
Amir Saadat, Fabian Pedregosa, Fan Yang, Felipe Llinares, Jeppe Klitgaard, Mathieu Blondel, Roy Frostig.
jaxopt-v0.3.1
New features
- Pjit-based example of data parallel training using Flax, by Felipe Llinares.
Bug fixes and enhancements
- Support for GPU and a state-of-the-art adversarial training algorithm (PGD) in the robust_training.py example, by Fabian Pedregosa.
- Update the line search in LBFGS to reuse the jit and unroll options from LBFGS, by Ian Williamson.
- Support dynamic maximum iteration count in iterative solvers, by Roy Frostig.
- Fix tree_where for singleton pytrees, by Louis Béthune.
- Remove QuadraticProg in projections and set init_params=None by default in QP solvers, by Louis Béthune.
- Add missing 'value' attribute in LbfgsState, by Mathieu Blondel.
Contributors
Felipe Llinares, Fabian Pedregosa, Ian Williamson, Louis Béthune, Mathieu Blondel, Roy Frostig.
jaxopt-v0.3
New features
- jaxopt.LBFGS (see the usage sketch after this list)
- jaxopt.BacktrackingLineSearch
- jaxopt.GaussNewton
- jaxopt.NonlinearCG
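As referenced above, a minimal sketch of the shared jaxopt solver API using the new LBFGS class; the Rosenbrock objective and the maxiter value are purely illustrative.

```python
import jax.numpy as jnp
import jaxopt

# Illustrative Rosenbrock objective; its minimizer is the all-ones vector.
def rosenbrock(x):
    return jnp.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

solver = jaxopt.LBFGS(fun=rosenbrock, maxiter=100)
result = solver.run(jnp.zeros(5))
print(result.params)  # approximate minimizer, close to all ones
```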
Bug fixes and enhancements
- Support implicit AD in higher-order differentiation.
Contributors
Amir Saadat, Fabian Pedregosa, Geoffrey Négiar, Hyunsung Lee, Mathieu Blondel, Roy Frostig.
jaxopt-v0.2
New features
- Quadratic programming solvers jaxopt.CvxpyQP, jaxopt.OSQP, jaxopt.BoxOSQP and jaxopt.EqualityConstrainedQP; see the usage sketch after this list.
- Iterative refinement
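A hedged sketch of the new QP interface, shown with jaxopt.EqualityConstrainedQP on a tiny problem. It assumes the run(params_obj=(Q, c), params_eq=(A, b)) calling convention and the primal/dual_eq fields of the returned KKT solution; the matrices are illustrative.

```python
import jax.numpy as jnp
import jaxopt

# min 0.5 x^T Q x + c^T x  subject to  A x = b  (illustrative values).
Q = jnp.array([[2.0, 0.0], [0.0, 2.0]])
c = jnp.array([-1.0, -1.0])
A = jnp.array([[1.0, 1.0]])
b = jnp.array([1.0])

qp = jaxopt.EqualityConstrainedQP()
sol = qp.run(params_obj=(Q, c), params_eq=(A, b))
print(sol.params.primal)   # primal solution, here (0.5, 0.5)
print(sol.params.dual_eq)  # multiplier for the equality constraint
```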
New examples
- Resnet example with Flax and JAXopt.
Bug fixes and enhancements
- Prevent recompilation of loops in solver.run if executing without jit.
- Prevent recomputation of the gradient in OptaxSolver.
- Make solver.update jittable and ensure output states are consistent.
- Allow Callable for the stepsize argument in jaxopt.ProximalGradient, jaxopt.ProjectedGradient and jaxopt.GradientDescent.
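For the callable stepsize option above, a minimal sketch passing a simple decaying schedule to jaxopt.GradientDescent; the callable receives the iteration number, and the objective and schedule shown are illustrative assumptions.

```python
import jax.numpy as jnp
import jaxopt

# Illustrative quadratic objective; minimizer is the all-ones vector.
def fun(w):
    return jnp.sum((w - 1.0) ** 2)

# Decaying schedule: maps the iteration number to that step's stepsize.
schedule = lambda iter_num: 0.1 / jnp.sqrt(iter_num + 1.0)

gd = jaxopt.GradientDescent(fun=fun, stepsize=schedule, maxiter=200)
print(gd.run(jnp.zeros(3)).params)  # approaches the all-ones minimizer
```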
Deprecated features
- jaxopt.QuadraticProgramming is deprecated and will be removed in v0.3. Use jaxopt.CvxpyQP, jaxopt.OSQP, jaxopt.BoxOSQP and jaxopt.EqualityConstrainedQP instead.
Contributors
Fabian Pedregosa, Felipe Llinares, Geoffrey Negiar, Louis Bethune, Mathieu Blondel, Vikas Sindhwani.
jaxopt-v0.1.1
New features
- Added solver jaxopt.ArmijoSGD (see the usage sketch after this list).
- Added example Deep Equilibrium (DEQ) model in Flax with Anderson acceleration.
- Added example Comparison of different SGD algorithms.
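A hedged sketch of jaxopt.ArmijoSGD driven through the generic init_state/update loop; the least-squares loss, the placeholder minibatch (X, y), and the single fixed batch are illustrative assumptions rather than the shipped example.

```python
import jax.numpy as jnp
import jaxopt

# Placeholder least-squares loss over a minibatch `data = (X, y)`.
def loss(params, data):
    X, y = data
    return jnp.mean((X @ params - y) ** 2)

X = jnp.ones((32, 4))
y = jnp.ones(32)
params = jnp.zeros(4)

solver = jaxopt.ArmijoSGD(fun=loss)
state = solver.init_state(params, (X, y))
for _ in range(10):  # one fixed batch here; real code would cycle minibatches
    params, state = solver.update(params, state, (X, y))
print(params)
```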
Bug fixes
- Allow non-jittable proximity operators in jaxopt.ProximalGradient.
- Raise an exception if a quadratic program is infeasible or unbounded.
Contributors
Fabian Pedregosa, Louis Bethune, Mathieu Blondel.
jaxopt-v0.1
Classes
- jaxopt.AndersonAcceleration
- jaxopt.AndersonWrapper
- jaxopt.Bisection
- jaxopt.BlockCoordinateDescent
- jaxopt.FixedPointIteration
- jaxopt.GradientDescent
- jaxopt.MirrorDescent
- jaxopt.OptaxSolver
- jaxopt.PolyakSGD
- jaxopt.ProjectedGradient
- jaxopt.ProximalGradient
- jaxopt.QuadraticProgramming
- jaxopt.ScipyBoundedLeastSquares
- jaxopt.ScipyBoundedMinimize
- jaxopt.ScipyLeastSquares
- jaxopt.ScipyMinimize
- jaxopt.ScipyRootFinding
- Implicit differentiation
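A hedged sketch of the implicit differentiation support, using the custom_root decorator to differentiate a ridge-regression solution with respect to its regularization strength; the closed-form inner solver, the data, and the validation loss are illustrative.

```python
import jax
import jax.numpy as jnp
from jaxopt import implicit_diff

# Optimality condition of ridge regression: gradient of
# 0.5 * ||X w - y||^2 + 0.5 * lam * ||w||^2 with respect to w.
def optimality_fun(w, lam, X, y):
    return X.T @ (X @ w - y) + lam * w

@implicit_diff.custom_root(optimality_fun)
def ridge_solver(init_w, lam, X, y):
    del init_w  # unused: solved in closed form here
    n_features = X.shape[1]
    return jnp.linalg.solve(X.T @ X + lam * jnp.eye(n_features), X.T @ y)

X = jnp.ones((5, 3))
y = jnp.ones(5)

def validation_loss(lam):
    w = ridge_solver(None, lam, X, y)
    return jnp.sum(w ** 2)

# Differentiates through the solver via the implicit function theorem.
print(jax.grad(validation_loss)(1.0))
```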
Examples
- Binary kernel SVM with intercept.
- Image classification example with Flax and JAXopt.
- Image classification example with Haiku and JAXopt.
- VAE example with Haiku and JAXopt.
- Implicit differentiation of lasso.
- Multiclass linear SVM (without intercept).
- Non-negative matrix factorization (NMF) using alternating minimization.
- Dataset distillation.
- Implicit differentiation of ridge regression.
- Robust training.
- Anderson acceleration of gradient descent.
- Anderson acceleration of block coordinate descent.
- Anderson acceleration applied to the Picard–Lindelöf theorem.
Contributors
Fabian Pedregosa, Felipe Llinares, Robert Gower, Louis Bethune, Marco Cuturi, Mathieu Blondel, Peter Hawkins, Quentin Berthet, Roy Frostig, Ta-Chu Kao.