Skip to content
This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Fix BatchNorm converter for CoreML when fix_gamma=True #13557

Merged
merged 1 commit into from
Jan 16, 2019
Merged

Fix BatchNorm converter for CoreML when fix_gamma=True #13557

merged 1 commit into from
Jan 16, 2019

Conversation

CyberZHG
Copy link
Contributor

@CyberZHG CyberZHG commented Dec 6, 2018

Description

Force gamma values to be ones while converting batch normalization with fix_gamma=True in case of accidental changes of gamma (e.g. #9624).

Checklist

Essentials

  • Changes are complete
  • All changes have test coverage:
  • Code is well-documented:
  • To the my best knowledge, examples are either not affected by this change, or have been fixed to be compatible with this change

@CyberZHG CyberZHG requested a review from szha as a code owner December 6, 2018 07:18
@nswamy nswamy added the pr-awaiting-review PR is waiting for code review label Dec 7, 2018
@@ -445,7 +445,7 @@ def test_tiny_conv_random_input_multi_filter(self):
self._test_mxnet_model(net, input_shape=input_shape, mode='random')

def test_conv_random(self):
np.random.seed(1988)
np.random.seed(1986)
Copy link
Member

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

wondering why you changed the seed value here from 1988 to 1986, is there any significance to this number?

Copy link
Contributor Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Because I wasn't able to pass the tests (test_conv_random, test_transpose) before changing any code.

Copy link
Member

@nswamy nswamy left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Thanks for the contribution!

@nswamy nswamy added Converter pr-awaiting-response PR is reviewed and waiting for contributor to respond and removed pr-awaiting-review PR is waiting for code review labels Dec 7, 2018
@Roshrini
Copy link
Member

@CyberZHG Did you run tests multiple times to see if with new seed value tests are passing?
@nswamy Can you take a look at this PR again?

@Roshrini
Copy link
Member

Roshrini commented Jan 3, 2019

@CyberZHG Could you please run the test for 5k times to make sure that its not flaky? Thanks!

@CyberZHG
Copy link
Contributor Author

CyberZHG commented Jan 4, 2019

@Roshrini

The results are always exactly the same after setting the seed values.

I triggered the CI (uncomment the line in .travis.yml file) for the converter and it can pass the tests with both the original seeds and the new seeds. Therefore it is not necessary to change the seed values to pass the CI tasks.

Besides, I've done some calculations manually for convolution and batch normalization, and the results are the same as mxnet's outputs. However, there are always some slight differences between coreml's outputs and mxnet's outputs in my machine. I think there may be some environmental differences and the new seeds are more compatible with different versions.

In summary, currently all tests could pass the CI tasks and I can change back the seed values if it is insisted.

@stu1130
Copy link
Contributor

stu1130 commented Jan 16, 2019

@nswamy could you take a look at it?
@mxnet-label-bot update [pr-awaiting-review, Converter]

@marcoabreu marcoabreu added pr-awaiting-review PR is waiting for code review Converter and removed Converter pr-awaiting-response PR is reviewed and waiting for contributor to respond labels Jan 16, 2019
the final trained model.
"""
np.random.seed(1988)
input_shape = (1, 2, 2, 3)
Copy link
Contributor

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Why not use (1, 1, 2, 3) as in previous test_batch_norm() tests? I think it can be helpful to choose the same input_shape to test the default behavior when fix_gamma is not specified. Ideally, the result should be the same as you supply fix_gamma=True.

Copy link
Contributor

@apeforest apeforest left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

some comment about unit test. otherwise LGTM. thanks for your contribution!

@CyberZHG
Copy link
Contributor Author

@apeforest I have modified the random seeds and the input shape with a new commit. The testing results can be seen at travis-ci.

Copy link
Contributor

@apeforest apeforest left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

LGTM

@Roshrini Roshrini added the CoreML Issues related to CoreML converter label Jan 16, 2019
@Roshrini Roshrini merged commit 0f7d33d into apache:master Jan 16, 2019
stephenrawls pushed a commit to stephenrawls/incubator-mxnet that referenced this pull request Feb 16, 2019
haohuanw pushed a commit to haohuanw/incubator-mxnet that referenced this pull request Jun 23, 2019
Sign up for free to subscribe to this conversation on GitHub. Already have an account? Sign in.
Labels
Converter CoreML Issues related to CoreML converter pr-awaiting-review PR is waiting for code review
Projects
None yet
Development

Successfully merging this pull request may close these issues.

None yet

6 participants