Working auto differentiation, enables the HMC on test cases #53
base: master
Conversation
… autodiff fails reverts to finite differencing with obvious performance degradation
Hello @wdpozzo! Thanks for updating this PR. We checked the lines you've touched for PEP 8 issues, and found:
Comment last updated at 2020-02-27 21:03:06 UTC
Quite a lot of changes to digest here, but there's some stuff missing that you'll need to add to the repo before I can test more.
```
    directional derivatives of the likelihood
    at x.

    It uses jax.grad for automatic differentiation, but reverts to
```
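For context, the fallback the docstring describes amounts to a central finite-difference estimate of the derivative along a direction. A minimal sketch, assuming a helper of this shape (the name and step size are illustrative, not cpnest's API):

```python
import numpy as np

def directional_derivative(logL, x, v, eps=1e-6):
    """Central-difference estimate of the derivative of logL at x along v."""
    v = np.asarray(v, dtype=float)
    v = v / np.linalg.norm(v)  # differentiate along the unit direction
    return (logL(x + eps * v) - logL(x - eps * v)) / (2.0 * eps)
```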
Should jax be added to the requirements in setup.py then?
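One way to do that without making jax mandatory would be an optional extra, so a plain install keeps the finite-differencing fallback. A sketch, with the package metadata abbreviated and the requirement names assumed:

```python
from setuptools import setup

setup(
    name="cpnest",
    # ... existing metadata and install_requires kept as-is ...
    extras_require={
        # hypothetical extra: `pip install cpnest[hmc]` pulls in jax for
        # autodiff, while a plain install falls back to finite differencing
        "hmc": ["jax", "jaxlib"],
    },
)
```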
I pushed the changes you requested.
cpnest/proposal.py (outdated)
```python
try:
    from jax import grad, jit
    FINITEDIFFERENCING = False
except:
```
Don't use a bare except. Use except ImportError if that is what you want to catch.
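A sketch of the suggested guard; the assignment in the fallback branch is assumed from the truncated diff:

```python
try:
    from jax import grad, jit
    FINITEDIFFERENCING = False
except ImportError:
    # only a missing jax should trigger the fallback; a bare except would
    # also silently swallow genuine bugs raised while importing jax
    FINITEDIFFERENCING = True  # assumed from the truncated diff
```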
```python
        self.analytic_log_Z = 0.0 - sum([np.log(self.bounds[i][1]-self.bounds[i][0]) for i in range(self.dim)])

    def log_likelihood(self,p):
        return np.sum([-0.5*p[n]**2-0.5*np.log(2.0*np.pi) for n in p.names])  ##np.sum([self.distr.logpdf(p[n]
```
Why did you change the way of getting the parameter? Isn't this going to be slower?
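For comparison, a vectorized version that avoids the per-name Python loop; it assumes the live point exposes its values as an array, which may not match cpnest's parameter API:

```python
import numpy as np

def log_likelihood(self, p):
    # standard-normal log likelihood over all parameters at once
    x = np.asarray(p.values)  # assumed: values exposed as a 1-D array
    return np.sum(-0.5 * x**2 - 0.5 * np.log(2.0 * np.pi))
```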
Hi @wdpozzo, I am seeing this on the branch but I can't work out why.
Whenever autodiff fails, it reverts to finite differencing, with an obvious performance degradation. Also fixes a bug in the differential evolution proposal.
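The scheme the description implies can be sketched as follows; the function names are illustrative, not cpnest's internals:

```python
import numpy as np

def finite_difference_grad(f, x, eps=1e-6):
    """Central-difference gradient of f at x; costs 2*dim calls to f."""
    x = np.asarray(x, dtype=float)
    g = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2.0 * eps)
    return g

def make_grad(f):
    """Prefer jax autodiff; fall back to finite differencing without jax."""
    try:
        from jax import grad
        return grad(f)  # requires f to be written with jax.numpy operations
    except ImportError:
        return lambda x: finite_difference_grad(f, x)
```

The finite-difference path costs two likelihood evaluations per dimension per gradient, which is the performance degradation the description refers to.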