This repository has been archived by the owner on Mar 21, 2024. It is now read-only.

Add dropout to DeepMIL and fix feature extractor setup #653

Merged
merged 9 commits into main from dacoelh/deepmil_dropout
Feb 7, 2022

Conversation

dccastro
Member

@dccastro dccastro commented Feb 3, 2022

No description provided.

@harshita-s harshita-s self-requested a review February 4, 2022 14:40
harshita-s
harshita-s previously approved these changes Feb 4, 2022
from torch import as_tensor, nn, no_grad, prod, rand

feature_extractor = nn.Sequential(*layers)
with no_grad():
    feature_shape = feature_extractor(rand(1, *input_dim)).shape
num_features = int(prod(as_tensor(feature_shape)).item())
# Freeze the weights: no fine-tuning
for param in feature_extractor.parameters():
    param.requires_grad = False
Contributor

This should not be set to False when fine-tuning, right?
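
The reviewer's point is that unconditionally freezing the encoder makes fine-tuning impossible. A minimal sketch of the fix is to guard the freeze behind a flag; the `tune_encoder` parameter and `setup_feature_extractor` helper below are hypothetical names for illustration, not part of the original diff:

```python
import torch
from torch import nn


def setup_feature_extractor(layers, input_dim, tune_encoder=False):
    """Build the encoder and infer its output size from a dummy forward pass.

    The weights are frozen only when fine-tuning is not requested.
    """
    feature_extractor = nn.Sequential(*layers)
    with torch.no_grad():
        feature_shape = feature_extractor(torch.rand(1, *input_dim)).shape
    num_features = int(torch.prod(torch.as_tensor(feature_shape)).item())
    if not tune_encoder:
        # Freeze the weights only when the encoder is NOT being fine-tuned.
        for param in feature_extractor.parameters():
            param.requires_grad = False
    return feature_extractor, num_features


# Example: a tiny encoder over 8-dimensional inputs, frozen by default
extractor, num_features = setup_feature_extractor([nn.Linear(8, 4), nn.ReLU()], (8,))
```

With `tune_encoder=True`, the loop is skipped and all encoder parameters keep `requires_grad=True`, so the optimizer can update them during fine-tuning.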

@harshita-s harshita-s self-requested a review February 7, 2022 13:05
@dccastro dccastro merged commit eda7635 into main Feb 7, 2022
@dccastro dccastro deleted the dacoelh/deepmil_dropout branch February 7, 2022 13:09

3 participants