
added dropout #145

Open

wants to merge 4 commits into master
Conversation

Sentient07

Added example code for dropout on top of the existing MLP and Logistic Regression tutorial code. The code I have attached is the full working version. It reuses some modules of the existing tutorial code, but I wasn't sure which imports to make for those modules. I will add the necessary imports and update this PR.
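For reference, a minimal sketch of the kind of dropout helper this PR adds on top of the tutorial's MLP code, assuming Theano's shared RandomStreams; the function name drop and its arguments are illustrative, not necessarily the names used in the PR (rng is a numpy.random.RandomState):

import numpy
import theano
from theano.tensor.shared_randomstreams import RandomStreams

def drop(input, p, rng):
    """Randomly zero each unit of `input` with probability `p` (training time only)."""
    srng = RandomStreams(rng.randint(999999))
    # Keep each unit with probability 1 - p.
    mask = srng.binomial(n=1, p=1 - p, size=input.shape,
                         dtype=theano.config.floatX)
    # Inverted dropout: divide by the keep probability so the expected
    # activation matches the no-dropout case; the plain layer output can
    # then be used unchanged at test time.
    return input * mask / (1 - p)

# Illustrative usage on a hidden layer's symbolic output:
# dropped = drop(hidden_layer.output, p=0.5, rng=numpy.random.RandomState(1234))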

@Sentient07
Author

@lamblin I've made a small change to the logistic_sgd file and added the necessary docstrings. Could you please check whether this is fine before I add the documentation?




class HiddenLayer(object):
Member

I think you can just import it from mlp.py, to avoid duplication.
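A sketch of the suggested reuse, assuming the dropout script sits in the same directory as the tutorial's mlp.py and logistic_sgd.py:

from mlp import HiddenLayer
from logistic_sgd import LogisticRegression, load_data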

Author

Yeah, I had wanted to do that in the previous commit, but it somehow slipped my mind. Will do.

@Sentient07
Author

@lamblin, I have updated the PR and addressed the comments. Is the code fine? Shall I go ahead and write the documentation for this?

srng = theano.tensor.shared_randomstreams.RandomStreams(rng.randint(1000))
mask = srng.binomial(n=1, p=1-p, size=layer.shape)
output = layer*T.cast(mask, theano.config.floatX)
return output * (1 - p)
Member

That should be output / (1 - p).

Author

Oh, yeah. I'm sorry, I had gotten confused.
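For context (not part of the PR itself): with drop probability p, each unit is kept with probability 1 - p, so the masked activation has expected value (1 - p) times the original; dividing by (1 - p) restores that expectation, which is why no extra rescaling is needed at test time. A small, purely illustrative NumPy check:

import numpy
rng = numpy.random.RandomState(0)
p = 0.5
x = numpy.ones(100000)
mask = rng.binomial(n=1, p=1 - p, size=x.shape)
print((x * mask / (1 - p)).mean())  # close to 1.0, i.e. the undropped activation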

@lamblin
Member

lamblin commented Mar 14, 2016

It seems OK now, thanks!

@Sentient07
Author

@lamblin I have added the doc file; please have a look at your convenience and check whether the details I added are correct and sufficient. Sorry for the delay; I got a little busy with my mid-term exams and GSoC work.

@Sentient07
Author

@lamblin ping
