
Added ability to pass the gradient function to Adadelta #24

Open

Wants to merge 1 commit into master

Conversation

alansaul (Collaborator)

Adds the ability to pass a gradient function to Adadelta; this is necessary for doing stochastic gradients with GPy models.
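To illustrate why a pluggable gradient function matters, here is a minimal, self-contained sketch of an Adadelta update that takes the gradient as a caller-supplied callable. This is not paramz's actual API (the function and variable names here are hypothetical); it only shows the mechanism: the optimizer never computes gradients itself, so the caller can swap in a minibatch/stochastic estimator.

```python
import numpy as np

def adadelta_step(x, grad_fn, Eg2, Edx2, rho=0.9, eps=1e-6):
    """One Adadelta update driven by a caller-supplied gradient function.

    `grad_fn` is the pluggable piece this PR is about: for stochastic
    training it can return a minibatch gradient instead of the full one.
    """
    g = grad_fn(x)
    Eg2 = rho * Eg2 + (1 - rho) * g ** 2            # running avg of squared gradients
    dx = -np.sqrt(Edx2 + eps) / np.sqrt(Eg2 + eps) * g
    Edx2 = rho * Edx2 + (1 - rho) * dx ** 2         # running avg of squared updates
    return x + dx, Eg2, Edx2

# Minimize f(x) = (x - 3)^2, with its gradient passed in as a function.
grad = lambda x: 2 * (x - 3)
x, Eg2, Edx2 = np.zeros(1), np.zeros(1), np.zeros(1)
for _ in range(5000):
    x, Eg2, Edx2 = adadelta_step(x, grad, Eg2, Edx2)
```

Because the gradient is just a callable, replacing `grad` with a function that subsamples the data is all that is needed to turn this into stochastic gradient training.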


codecov-io commented Nov 15, 2017

Codecov Report

Merging #24 into master will decrease coverage by 0.09%.
The diff coverage is 50%.


@@            Coverage Diff            @@
##           master      #24     +/-   ##
=========================================
- Coverage   96.68%   96.59%   -0.1%     
=========================================
  Files          26       26             
  Lines        2084     2087      +3     
  Branches      332      333      +1     
=========================================
+ Hits         2015     2016      +1     
- Misses         55       56      +1     
- Partials       14       15      +1
Impacted Files                        | Coverage Δ
paramz/optimization/optimization.py   | 98.4% <50%> (-1.06%) ⬇️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 1500072...3f60b6f.

@mzwiessele (Member)

Why is the coverage going down here? Do we have the ability to put a test in place which exercises the stochastic gradient function?

@mzwiessele (Member)

We need to put the test case in here! Then we can merge :)
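A test along the requested lines could check that an Adadelta-style loop driven by a stochastic (minibatch) gradient function still reduces the objective. The sketch below is a standalone illustration, not a test against paramz's real `Opt_Adadelta` interface; all names and the synthetic least-squares problem are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true

def stochastic_grad(w, batch=32):
    # Minibatch gradient of the mean squared error -- the kind of
    # gradient function the PR lets the caller supply.
    idx = rng.integers(0, len(X), size=batch)
    Xb, yb = X[idx], y[idx]
    return 2 * Xb.T @ (Xb @ w - yb) / batch

def loss(w):
    return float(np.mean((X @ w - y) ** 2))

# Adadelta loop driven entirely by the pluggable gradient function.
w = np.zeros(3)
Eg2, Edx2 = np.zeros(3), np.zeros(3)
rho, eps = 0.9, 1e-6
for _ in range(8000):
    g = stochastic_grad(w)
    Eg2 = rho * Eg2 + (1 - rho) * g ** 2
    dx = -np.sqrt(Edx2 + eps) / np.sqrt(Eg2 + eps) * g
    Edx2 = rho * Edx2 + (1 - rho) * dx ** 2
    w = w + dx
```

The natural assertion for such a test is simply that the final loss is below the starting loss, which also exercises the code path where the optimizer receives the gradient function rather than computing gradients itself.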

Labels: None yet
Projects: None yet
3 participants