
Add Minuit optimiser and update NRSource #40

Merged: 15 commits into master from add_minuit on Oct 30, 2019

Conversation

CristianAntochi (Collaborator)

In this pull request I have added the Minuit optimiser and updated the NRSource for SR1.

Furthermore, as discussed in issue #36, I have removed the parameter scaling in the optimiser, since it seemed to cause problems with the error computation.

The Minuit optimiser still needs some polishing, but in testing it works very well, and the computation-time penalty relative to the TensorFlow optimiser seems acceptable (roughly a factor of 2).

The main advantage of Minuit is that bounds can be placed on the parameters, which avoids useless steps into negative parameter values and NaNs from the objective function (a sketch follows the list below).

Possible improvements to this implementation are:

  • give the parameters automatic bounds instead of leaving them as keyword arguments
  • set the initial step size as a percentage of the guess parameters
  • figure out how migrad's precision works and choose an appropriate default (rather than necessarily using migrad's internal precision)
  • compute and return Minuit's Hesse errors and compare them with our implementation
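
A minimal sketch of the bounds and step-size points above, using iminuit's current (2.x) API; the toy Gaussian likelihood and the 10%-of-guess step heuristic are illustrative, not flamedisx code:

```python
import numpy as np
from iminuit import Minuit

rng = np.random.default_rng(42)
data = rng.normal(1.0, 0.5, size=1000)

# Toy Gaussian negative log-likelihood standing in for the real objective
def nll(mu, sigma):
    return np.sum(0.5 * ((data - mu) / sigma) ** 2 + np.log(sigma))

m = Minuit(nll, mu=0.8, sigma=0.4)
m.errordef = Minuit.LIKELIHOOD              # 0.5 for a negative log-likelihood
m.limits["sigma"] = (1e-6, None)            # bound sigma away from zero: no NaNs
m.errors["mu"] = 0.1 * abs(m.values["mu"])  # initial step as ~10% of the guess
m.errors["sigma"] = 0.1 * abs(m.values["sigma"])
m.migrad()                                  # minimise
m.hesse()                                   # parabolic (Hesse) errors
print(m.values, m.errors)
```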

@CristianAntochi (Collaborator, Author)

I know why the tests don't pass. Tomorrow I will change the objective function to make it work!

@jonas-eschle

Hey, that looks nice! About Minuit:

  • Minuit takes a minimal number of steps, so if the minimum is found trivially, Minuit will in general be slower (but this should not be a problem)
  • limits: they should not be used if possible, or only to find good starting values, since they apply a transformation; if a value is close to the bounds, the fit result is in general not reliable (see the sketch below). Since Minuit is a local optimizer, a successful fit should start with good initial values.
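
For reference, a minimal sketch of the transformation behind that caveat; the arcsin mapping is the one the MINUIT manual describes for double-bounded parameters, while the helper names and numpy usage are illustrative:

```python
import numpy as np

# MINUIT maps an external parameter x constrained to [a, b] onto an
# unbounded internal parameter via an arcsin transformation.
def to_internal(x, a, b):
    return np.arcsin(2.0 * (x - a) / (b - a) - 1.0)

def to_external(x_int, a, b):
    return a + (b - a) / 2.0 * (np.sin(x_int) + 1.0)

# d(external)/d(internal) = (b - a) / 2 * cos(x_int), which vanishes as x
# approaches either bound, so parabolic errors computed internally translate
# poorly to external values sitting near a or b.
```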

On another note, I've opened #41 to discuss the broader scope of minimizers and their implementation.

@CristianAntochi (Collaborator, Author)

> Minuit takes a minimal number of steps [...] Since Minuit is a local optimizer, a successful fit should start with good initial values.

Thanks for the input! I will take that into consideration! 😄

@pelssers (Member) left a comment

Thanks for adding iminuit!
I've added a few comments on inference.py, pointing out places where things can be simplified and cleaned up.
We still have to see how to make the bestfit arguments the same across optimizers (tolerance specification and options such as use_hessian) and make sure it returns the same output for each optimizer (a hypothetical sketch follows below), but it's great that we can now compare different optimizers and error estimators.
Do you know if the tutorial notebook also runs with iminuit?
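
A hypothetical sketch of such a unified dispatch; the option names tol and use_hessian echo this comment, while the function names and backend stubs are placeholders, not actual flamedisx code:

```python
from typing import Callable, Dict

def _bestfit_tensorflow(objective, guess, **options) -> Dict[str, float]:
    raise NotImplementedError  # TensorFlow backend would live here

def _bestfit_minuit(objective, guess, **options) -> Dict[str, float]:
    raise NotImplementedError  # iminuit backend would live here

def bestfit(objective: Callable[..., float],
            guess: Dict[str, float],
            optimizer: str = "tensorflow",
            tol: float = 1e-3,
            use_hessian: bool = True) -> Dict[str, float]:
    """Dispatch to one backend; every backend takes the same options and
    returns the same {parameter: best-fit value} dict."""
    backends = {"tensorflow": _bestfit_tensorflow, "minuit": _bestfit_minuit}
    return backends[optimizer](objective, guess, tol=tol,
                               use_hessian=use_hessian)
```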

(Six review threads on flamedisx/inference.py, three marked outdated; all resolved.)
@pelssers requested review from pelssers and removed the request, October 28, 2019 15:04
@JelleAalbers (Member) left a comment

Thanks for adding Minuit, Cristian! I hope the optimizer will now be more stable with the scaling removed -- let's see how it works.

@CristianAntochi merged commit 203096a into master, Oct 30, 2019
@CristianAntochi deleted the add_minuit branch, October 30, 2019 15:28