Commit f151c46

Set loss-scale to 1 to prevent using dynamic scaling
sean.narenthiran committed Feb 20, 2020
1 parent 98433d6 commit f151c46
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion train.py
@@ -80,7 +80,8 @@
 parser.add_argument('--seed', default=123456, type=int, help='Seed to generators')
 parser.add_argument('--opt-level', type=str)
 parser.add_argument('--keep-batchnorm-fp32', type=str, default=None)
-parser.add_argument('--loss-scale', type=str, default=None)
+parser.add_argument('--loss-scale', default=1,
+                    help='Loss scaling used by Apex. Default is 1 due to warp-ctc not supporting scaling of gradients')
 
 torch.manual_seed(123456)
 torch.cuda.manual_seed_all(123456)
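For context: Apex's loss_scale multiplies the loss before backward() so fp16 gradients do not underflow; with the old default of None, Apex falls back to "dynamic" scaling at the O1/O2 opt levels, which this commit prevents by fixing the scale at 1 (i.e. no scaling). Below is a minimal, self-contained sketch of how these flags are typically fed to amp.initialize; the Linear model and SGD optimizer are placeholders, not this repo's actual training code, and it requires a CUDA device with NVIDIA Apex installed.

import torch
from apex import amp

# Placeholder model and optimizer; the real train.py builds DeepSpeech and SGD here.
model = torch.nn.Linear(10, 10).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

# loss_scale accepts a number or the string "dynamic". A constant scale of 1
# leaves gradients untouched, which this commit enforces because warp-ctc's
# gradients must not be rescaled.
model, optimizer = amp.initialize(model, optimizer,
                                  opt_level='O1',            # from --opt-level
                                  keep_batchnorm_fp32=None,  # from --keep-batchnorm-fp32
                                  loss_scale=1)              # from --loss-scale

# With loss_scale=1, scale_loss multiplies the loss by 1, effectively a no-op.
loss = model(torch.randn(4, 10).cuda()).sum()
with amp.scale_loss(loss, optimizer) as scaled_loss:
    scaled_loss.backward()
optimizer.step()

Keeping loss_scale=1 rather than removing amp entirely preserves the mixed-precision code path while guaranteeing that the gradients warp-ctc produces reach the optimizer unmodified.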
