Loss functions #42
Hi all,
I have been reading the paper and realised that the loss functions are not explicitly stated. Are you using the original loss functions described in its architecture references?
Best!

Comments
Hi Peter, thanks for the comment. You are right, we should have stated the loss used. Sorry about that. We used cross-entropy.
Thanks for the quick answer! I just loaded your trained UNet model to try TL and saw that the last activation layer is LogSoftmax. Would NLLLoss be more appropriate? Cheers!
You can either remove the LogSoftmax and use cross-entropy, or keep it and use NLLLoss. I guess it won't make much difference. I think I added the LogSoftmax for this inference code only and used cross-entropy for training.
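For reference, a minimal sketch (assuming PyTorch, with illustrative tensor shapes, not the repository's actual training code) showing that the two options give the same loss value:

```python
# Sketch: CrossEntropyLoss on raw logits matches LogSoftmax followed by NLLLoss.
# Shapes are hypothetical, chosen only to mimic a small segmentation output.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical output: batch of 2, 3 classes, 4x4 spatial map, plus integer targets.
logits = torch.randn(2, 3, 4, 4)
target = torch.randint(0, 3, (2, 4, 4))

# Option 1: no final activation in the network, cross-entropy on the logits.
ce = nn.CrossEntropyLoss()(logits, target)

# Option 2: keep a LogSoftmax layer over the class dimension and use NLLLoss.
log_probs = nn.LogSoftmax(dim=1)(logits)
nll = nn.NLLLoss()(log_probs, target)

print(ce.item(), nll.item())  # identical up to floating-point error
```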
Sure! Just curious about stability and repeatability.