Conversation
@lamblin I've made a small change to the logistic_sgd file, and I've added the necessary docstrings. Could you please check if this is fine before I add the documentation?
code/dropout.py
Outdated
class HiddenLayer(object):
I think you can just import it from mlp.py, to avoid duplication.
Yeah, I had wanted to do it in the previous commit; it somehow slipped my mind. Will do.
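For reference, the deduplication could be as simple as importing the existing class instead of redefining it; a minimal sketch, assuming dropout.py sits alongside mlp.py in the tutorial's code/ directory:

```python
# Reuse the tutorial's existing HiddenLayer rather than duplicating it here
# (assumes dropout.py lives next to mlp.py in code/).
from mlp import HiddenLayer
```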
@lamblin, I have updated the PR and addressed the comments. Is the code fine? Shall I go ahead and write documentation for this?
code/dropout.py
Outdated
srng = theano.tensor.shared_randomstreams.RandomStreams(rng.randint(1000))
mask = srng.binomial(n=1, p=1-p, size=layer.shape)
output = layer*T.cast(mask, theano.config.floatX)
return output * (1 - p)
That should be output / (1 - p).
Oh, yeah. I'm sorry, I got confused.
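For context, inverted dropout divides the surviving activations by the keep probability 1 - p at training time, so the expected activation matches the un-dropped network and no rescaling is needed at test time. Here is a sketch of the corrected helper using the variable names from the snippet above; the function name and signature are assumptions, not necessarily what the PR uses:

```python
import theano
import theano.tensor as T
from theano.tensor.shared_randomstreams import RandomStreams


def drop(layer, p, rng):
    """Zero each unit of `layer` with probability p, then rescale the
    survivors by 1 / (1 - p) (inverted dropout), so the plain output
    can be used unchanged at test time."""
    srng = RandomStreams(rng.randint(1000))
    # keep each unit with probability 1 - p
    mask = srng.binomial(n=1, p=1 - p, size=layer.shape)
    output = layer * T.cast(mask, theano.config.floatX)
    return output / (1 - p)
```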
It seems OK now, thanks!
@lamblin I have added the doc file. Please have a look at your convenience and check whether the details added are correct and sufficient. Sorry for the delay; I got a little busy with my mid-term exams and GSoC work.
@lamblin ping |
Added example code for dropout on top of the existing MLP and Logistic Regression tutorial code. The code I have attached is the full working version. It reuses some modules from the existing tutorial code; I wasn't sure which imports to make for those modules, so I will add the necessary imports and update this PR.
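As a rough illustration of how such an example could be wired on top of the tutorial classes, here is a sketch that reuses HiddenLayer from mlp.py and applies the drop helper sketched earlier only during training; the class name, the is_train flag, and the constructor arguments are illustrative assumptions, not necessarily what the attached code does:

```python
import theano.tensor as T

from mlp import HiddenLayer  # existing tutorial module


class DropoutHiddenLayer(HiddenLayer):
    """HiddenLayer whose output is passed through dropout at training time."""

    def __init__(self, rng, is_train, input, n_in, n_out, p=0.5, **kwargs):
        super(DropoutHiddenLayer, self).__init__(
            rng=rng, input=input, n_in=n_in, n_out=n_out, **kwargs)
        # is_train is a symbolic int flag: 1 during training, 0 at test time.
        # `drop` is the inverted-dropout helper sketched in the earlier comment,
        # so the plain (already correctly scaled) output is used at test time.
        self.output = T.switch(T.neq(is_train, 0),
                               drop(self.output, p, rng),
                               self.output)
```

The LogisticRegression output layer from logistic_sgd.py can then be reused unchanged on top of the dropout hidden layers.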