
Fixed float16 error wrt Theano=0.8.2 #871

Merged 1 commit into Lasagne:master on Aug 1, 2017
Conversation

@Sentient07 (Contributor) commented Jul 31, 2017

Fixes #757.

@f0k (Member) commented Aug 1, 2017

Thank you, that's perfect, merging! For the future, please use one of the magic words (https://help.github.com/articles/closing-issues-using-keywords/), not "Addresses", as mentioned in the pull request guidelines. (I can easily change this myself, but I have to remember.) Cheers!

@f0k f0k merged commit 465241b into Lasagne:master Aug 1, 2017
@f0k (Member) commented Aug 1, 2017

A nuisance is that to actually avoid upcasting, one has to use a float16(0.5), since "Theano int8 - Python float" or "Theano float16 - Python float" is upcast to float32. We might want to cast the constant and the probability to the input dtype instead, assuming that nobody will ever want to intentionally use the p dtype of a DropoutLayer to change the dtype of the result.
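The proposed fix can be sketched as follows. This is not Lasagne's actual code; it is a minimal NumPy illustration (the function name and structure are hypothetical) of casting the constant and the probability to the input dtype before the arithmetic, so a float16 input stays float16 instead of being upcast by a Python float constant:

```python
import numpy as np

def dropout_scale(x, p):
    """Hypothetical sketch of the inverse-dropout rescaling x / (1 - p),
    casting both the constant 1 and the probability p to x's dtype
    so the result keeps the input dtype (e.g. float16)."""
    one = np.asarray(1, dtype=x.dtype)
    p = np.asarray(p, dtype=x.dtype)
    return x / (one - p)

x = np.ones(3, dtype=np.float16)
y = dropout_scale(x, 0.5)  # stays float16; a bare Python 0.5 could upcast
```

The design choice here matches the comment above: the result dtype always follows the input, so `p`'s dtype can no longer change the dtype of the output.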

@f0k mentioned this pull request Aug 1, 2017
@Sentient07 (Contributor, Author) commented Aug 1, 2017 via email

@Sentient07 Sentient07 deleted the check-upcast branch August 1, 2017 17:54
2 participants