This repository has been archived by the owner on Mar 17, 2021. It is now read-only.

Layers should allow more options than just 'batch norm' and 'group norm' (instance norm?) #280

Open
Zach-ER opened this issue Nov 8, 2018 · 4 comments

Zach-ER (Collaborator) commented Nov 8, 2018

Layers have a with_bn option that is overridden for group normalization by passing a positive group size. Instead of this flag, we should have a bn_type string that determines which type of normalization to apply.
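A minimal sketch of the proposed dispatch, in plain NumPy. The names (make_normalizer, norm_type) are illustrative only, not NiftyNet's actual API; the point is that one string selects the normalization type rather than a with_bn flag that a positive group_size silently overrides:

```python
import numpy as np

def make_normalizer(norm_type, group_size=None, eps=1e-5):
    """Return a callable that normalises an (N, H, W, C) array.

    Hypothetical replacement for the with_bn flag: the norm_type
    string picks the behaviour explicitly.
    """
    def _norm(x, axes):
        mean = x.mean(axis=axes, keepdims=True)
        var = x.var(axis=axes, keepdims=True)
        return (x - mean) / np.sqrt(var + eps)

    if norm_type == 'batch':
        # statistics over batch and spatial axes, per channel
        return lambda x: _norm(x, axes=(0, 1, 2))
    if norm_type == 'instance':
        # statistics over spatial axes, per sample and per channel
        return lambda x: _norm(x, axes=(1, 2))
    if norm_type == 'group':
        if not group_size or group_size <= 0:
            raise ValueError('group normalization needs a positive group_size')
        def _group(x):
            n, h, w, c = x.shape
            # split channels into groups, normalise each (sample, group) block
            g = x.reshape(n, h, w, c // group_size, group_size)
            return _norm(g, axes=(1, 2, 4)).reshape(n, h, w, c)
        return _group
    raise ValueError('unknown norm_type: %r' % norm_type)
```

An unknown string fails loudly instead of silently falling back to batch norm, which is the main readability win over the flag-plus-override scheme.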

Zach-ER self-assigned this Nov 8, 2018
tvercaut (Member) commented Nov 8, 2018

I find the variable name bn_type confusing if you end up not using BN... Maybe replace it with featnorm_type or something along these lines?

tvercaut (Member) commented

Should we take the opportunity of this PR to address #285 at the same time?

In the spirit of TF, we could also maybe go for feature_normalization as a flag name (see discussions in #282).

wyli (Member) commented Nov 12, 2018

Fixing #285 will break some of the model zoo items because of the variable name scopes, so we need a separate PR for #285, probably one that updates the model zoo items as well.

LucasFidon (Collaborator) commented Nov 28, 2018

To my understanding, Instance Normalization is a special case of Group Normalization in which group_size is equal to 1, so Instance Normalization can already be used in the current setting.

Along these lines, maybe the class InstanceNormLayer in niftynet.layer.bn could be removed, or at least marked as deprecated.

I agree the flags could be made clearer, though.
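The claimed equivalence is easy to check numerically. The sketch below is plain NumPy, not NiftyNet code, and omits the learned scale and shift parameters; it normalises an (N, H, W, C) tensor both ways and compares:

```python
import numpy as np

def instance_norm(x, eps=1e-5):
    # normalise each (sample, channel) slice over the spatial axes
    mean = x.mean(axis=(1, 2), keepdims=True)
    var = x.var(axis=(1, 2), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def group_norm(x, group_size, eps=1e-5):
    # split the channels into groups of `group_size` and normalise each
    # (sample, group) block over space and its within-group channels
    n, h, w, c = x.shape
    g = x.reshape(n, h, w, c // group_size, group_size)
    mean = g.mean(axis=(1, 2, 4), keepdims=True)
    var = g.var(axis=(1, 2, 4), keepdims=True)
    return ((g - mean) / np.sqrt(var + eps)).reshape(n, h, w, c)

x = np.random.randn(2, 5, 5, 4)
print(np.allclose(group_norm(x, group_size=1), instance_norm(x)))  # True
```

With group_size=1 each group contains a single channel, so the group statistics collapse to per-sample, per-channel spatial statistics, which is exactly Instance Normalization.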
