How does the loss function for grade task work (CNN-only)? #18
Comments
The output neuron count is 1 for the Cox loss, but 3 for the GBMLGG grading task. You can see here where the last layer gets defined, which varies based on the task.
Oh, my bad 😥 I read the network.py from start to end and didn't notice this 🤦🏻♀️ Thanks a lot for the prompt response.
No worries! Happy to help :)
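For anyone landing here later, the answer above can be sketched as a task-dependent final layer. This is a hypothetical illustration, not the repo's actual code: the helper name `build_head` and the feature width are made up; the real definition lives in the networks file mentioned above.

```python
import torch
import torch.nn as nn

def build_head(task: str, in_features: int = 32) -> nn.Linear:
    # Hypothetical sketch of the idea: 1 output neuron for the survival
    # task (Cox loss), 3 for the GBMLGG grade task (classes 0, 1, 2).
    out_features = 1 if task == "surv" else 3
    return nn.Linear(in_features, out_features)

x = torch.randn(2, 32)                 # batch of 2 feature vectors
print(build_head("surv")(x).shape)     # torch.Size([2, 1]) -> one hazard per sample
print(build_head("grad")(x).shape)     # torch.Size([2, 3]) -> one logit per grade
```

So the same backbone can feed either head; only `out_features` of the last `nn.Linear` changes with the task.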
Hi, Richard. Your code has inspired me greatly :)
I'm not sure how you trained the vgg_19_bn to classify the grade: as far as I can see, in your customised CNN the output neuron is one (with a value ranging from -3 to 3); correct me if I'm wrong. However, in the CSV file, the grade has three labels: [0, 1, 2].
So, as far as I understand, the CNN model's output shape will be [batch_size, C], which is a single value per sample in this code.
For instance, in the case where the batch size is 2:
However, this code snippet confuses me. If the label dim is 3, how does the loss function work?

```python
loss_nll = F.nll_loss(pred, grade)
```

In this case, pred and grade have different shapes. Is there anything I'm missing or don't understand?
Thanks in advance,
Omnia
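For reference on the shape question raised above: `F.nll_loss` is designed to take arguments of different shapes. `pred` should be `[batch, num_classes]` log-probabilities, while the target (`grade` here) should be `[batch]` integer class indices, not one-hot vectors. A minimal, self-contained check of that contract (the tensor values are made up for illustration):

```python
import torch
import torch.nn.functional as F

batch_size, num_classes = 2, 3
logits = torch.randn(batch_size, num_classes)   # [2, 3] raw scores
pred = F.log_softmax(logits, dim=1)             # nll_loss expects log-probabilities
grade = torch.tensor([0, 2])                    # [2] class indices, NOT one-hot

loss_nll = F.nll_loss(pred, grade)              # reduces to a scalar by default
print(loss_nll.shape)                           # torch.Size([])
```

So with 3 output neurons, `pred` is `[2, 3]` and `grade` is `[2]`, and the loss picks out the log-probability of each sample's true class.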