This repository has been archived by the owner on Jan 15, 2024. It is now read-only.

[FIX] Update BERTLayerNorm Implementation #485

Merged
merged 3 commits into dmlc:master from layer-norm on Dec 28, 2018

Conversation

eric-haibin-lin
Member

Description

apache/mxnet#13732
There was a typo in the nn.LayerNorm documentation. It turns out that we can safely reuse nn.LayerNorm, which is faster than the implementation in GluonNLP.
As a sanity check, I ran compare_tf_gluon_model.py again with bert_24_1024_16/bert_12_768_12 on book_corpus_wiki_en_uncased/book_corpus_wiki_en_cased, and the test passes.
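
For context, here is a minimal sketch (not the actual diff in this PR) of why nn.LayerNorm can be reused: it already normalizes over the last axis with learnable gamma/beta, which is exactly what a hand-rolled BERT layer norm computes. The function name `naive_layer_norm` below is illustrative only, not from the codebase.

```python
# Illustrative sketch: compare a hand-rolled layer norm against
# mxnet.gluon.nn.LayerNorm to show they compute the same thing.
import mxnet as mx
from mxnet.gluon import nn

def naive_layer_norm(x, gamma, beta, eps=1e-5):
    # Normalize over the last axis, as BERT does for its hidden dimension.
    mean = x.mean(axis=-1, keepdims=True)
    var = ((x - mean) ** 2).mean(axis=-1, keepdims=True)
    return gamma * (x - mean) / mx.nd.sqrt(var + eps) + beta

hidden = 768
x = mx.nd.random.normal(shape=(2, 8, hidden))

# Built-in block: normalizes over the last axis by default.
ln = nn.LayerNorm(epsilon=1e-5)
ln.initialize()
out_builtin = ln(x)

# Hand-rolled version reusing the block's own gamma/beta parameters.
out_naive = naive_layer_norm(x, ln.gamma.data(), ln.beta.data())

# The two outputs should agree up to floating-point tolerance.
print(mx.nd.abs(out_builtin - out_naive).max())
```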

@fierceX @haven-jeon

Checklist

Essentials

  • PR's title starts with a category (e.g. [BUGFIX], [MODEL], [TUTORIAL], [FEATURE], [DOC], etc.)
  • Changes are complete (i.e. I finished coding on this PR)
  • All changes have test coverage
  • Code is well-documented

@mli
Member

mli commented Dec 27, 2018

Job PR-485/3 is complete.
Docs are uploaded to http://gluon-nlp-staging.s3-accelerate.dualstack.amazonaws.com/PR-485/3/index.html

@codecov

codecov bot commented Dec 27, 2018

Codecov Report

Merging #485 into master will increase coverage by 0.43%.
The diff coverage is 100%.

@@            Coverage Diff             @@
##           master     #485      +/-   ##
==========================================
+ Coverage   71.02%   71.46%   +0.43%     
==========================================
  Files         119      118       -1     
  Lines       10045     9937     -108     
==========================================
- Hits         7134     7101      -33     
+ Misses       2911     2836      -75
Flag        Coverage Δ
#PR470      ?
#PR482      ?
#PR485      71.46% <100%> (?)
#master     ?
#notserial  46.91% <66.66%> (+0.2%) ⬆️
#py2        71.21% <100%> (+0.43%) ⬆️
#py3        71.25% <100%> (-0.07%) ⬇️
#serial     56.97% <100%> (+0.5%) ⬆️

2 similar comments

Member

@szha szha left a comment

LGTM, assuming that the specialized layer norm block is kept for extension purposes.

@eric-haibin-lin eric-haibin-lin merged commit 8154d80 into dmlc:master Dec 28, 2018
paperplanet pushed a commit to paperplanet/gluon-nlp that referenced this pull request Jun 9, 2019
* fix layer norm

* fix indentation

* fix lint
@eric-haibin-lin eric-haibin-lin deleted the layer-norm branch June 12, 2019 21:17