This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Fp16 support for layernorm #14073

Closed
eric-haibin-lin opened this issue Feb 6, 2019 · 5 comments

Comments

@eric-haibin-lin
Member

Currently, given fp16 inputs, nd.LayerNorm/sym.LayerNorm perform the reduction in fp16, which loses precision. The reduction should be done in fp32 instead. @sxjscience
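To make the precision issue concrete, here is a minimal NumPy sketch (illustrative only, not MXNet code) contrasting an fp16 accumulator with an fp32 accumulator over the same fp16 inputs:

```python
import numpy as np

# 4096 copies of 0.01 stored in half precision.
x = np.full(4096, 0.01, dtype=np.float16)

# Naive sequential reduction with an fp16 accumulator: once the
# running sum grows past ~32, the fp16 spacing (0.03125) exceeds the
# addend, each addition rounds back down, and the sum stops growing.
total_fp16 = np.float16(0.0)
for v in x:
    total_fp16 = np.float16(total_fp16 + v)

# Same fp16 inputs, but accumulated in fp32 (the behaviour requested here).
total_fp32 = np.float32(0.0)
for v in x:
    total_fp32 += np.float32(v)

print(total_fp16)  # far below the true sum of ~40.97
print(total_fp32)  # close to the true sum
```

The same effect hits LayerNorm's mean/variance reductions: the inputs can stay fp16, but the accumulator needs to be fp32.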

@mxnet-label-bot
Contributor

Hey, this is the MXNet Label Bot.
Thank you for submitting the issue! I will try and suggest some labels so that the appropriate MXNet community members can help resolve it.
Here are my recommended labels: Feature

@sxjscience
Member

sxjscience commented Apr 2, 2019

Currently, I propose to solve the issue in the following two steps:

@haojin2
Contributor

haojin2 commented Apr 12, 2019

Once #14616 is merged, we can simply switch LayerNorm's reduction to the safe version to achieve the first step @sxjscience proposed, and then explore the implementation of the second step later. @eric-haibin-lin What do you think?
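For reference, the intended numeric behaviour of a "safe" fp16 LayerNorm, with statistics accumulated in fp32 and the output cast back to fp16, can be sketched in NumPy (an illustration of the idea, not the actual MXNet kernel; the function name and signature here are hypothetical):

```python
import numpy as np

def layernorm_safe_fp16(x, gamma, beta, eps=1e-5):
    """LayerNorm over the last axis of an fp16 array, with the
    mean/variance reductions carried out in fp32 (hypothetical sketch)."""
    x32 = x.astype(np.float32)               # upcast before reducing
    mean = x32.mean(axis=-1, keepdims=True)  # fp32 reduction
    var = x32.var(axis=-1, keepdims=True)    # fp32 reduction
    y = (x32 - mean) / np.sqrt(var + eps)
    out = y * gamma.astype(np.float32) + beta.astype(np.float32)
    return out.astype(np.float16)            # cast back for the output

# Usage: normalize a (2, 8) fp16 batch over the last axis.
x = np.random.default_rng(0).normal(size=(2, 8)).astype(np.float16)
gamma = np.ones(8, dtype=np.float16)
beta = np.zeros(8, dtype=np.float16)
out = layernorm_safe_fp16(x, gamma, beta)
```

With unit gamma and zero beta, each output row should have mean near 0 and standard deviation near 1, up to fp16 rounding.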

@sxjscience
Member

@haojin2 Yes, we should first try directly changing the reduction to the safe version.

sxjscience mentioned this issue Apr 15, 2019
@leezu
Contributor

leezu commented Jun 5, 2019

This can probably be closed now that #15002 has been merged.
