added dropout #145
base: master
Conversation
@lamblin I've made a small change to the logistic_sgd file, and I've added the necessary docstrings. Could you please check if this is fine before I add the documentation?
class HiddenLayer(object):
I think you can just import it from mlp.py, to avoid duplication.
Yeah, I had wanted to do that in the previous commit; it somehow slipped my mind. Will do.
@lamblin, I have updated the PR and addressed the comments. Is the code fine? Shall I go ahead and write documentation for this?
srng = theano.tensor.shared_randomstreams.RandomStreams(rng.randint(1000))
mask = srng.binomial(n=1, p=1-p, size=layer.shape)
output = layer*T.cast(mask, theano.config.floatX)
return output * (1 - p)
That should be output / (1 - p).
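To illustrate the correction above: with dropout probability p, each unit is kept with probability 1 - p, so dividing the surviving activations by 1 - p ("inverted dropout") keeps the expected activation equal to the no-dropout value, whereas multiplying by 1 - p (as in the original diff) would shrink it. A minimal pure-Python sketch of this scaling, with illustrative names not taken from the PR:

```python
import random

def dropout(values, p, rng=random):
    # Keep each value with probability 1 - p, zero it otherwise,
    # then rescale survivors by 1 / (1 - p) so that
    # E[output] == input (the inverted-dropout convention).
    keep = 1.0 - p
    return [v / keep if rng.random() < keep else 0.0 for v in values]

random.seed(0)
x = [1.0] * 10000
y = dropout(x, p=0.5)
mean = sum(y) / len(y)  # close to 1.0, matching the no-dropout expectation
```

With the `* (1 - p)` version, the same experiment would yield a mean near 0.25 instead of 1.0, which is why the scaling direction matters.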
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Oh, yeah. I'm sorry, I got confused.
It seems OK now, thanks!
@lamblin I have added the doc file; please have a look at your convenience and check whether the details added are correct and sufficient. Sorry for the delay, I got a little busy with my mid-term exams and GSoC work.
@lamblin ping
Added example code for dropout on top of the existing MLP and Logistic Regression tutorial code. The code I have attached is the full working version. It reuses some modules of the existing tutorial code; I wasn't sure which imports those modules need. I will make the necessary imports and update this PR.