Commit 512337f
Merge branch 'optimize_refactor_finetune_fix' into 'optimize_refactor'
Use the new API to get the loss scale when finetuning.

See merge request ADLR/megatron-lm!201
Mohammad Shoeybi committed Jan 5, 2021
2 parents 7381754 + a13cbe1 commit 512337f
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion — tasks/finetune_utils.py

```diff
@@ -186,7 +186,8 @@ def _train(model, optimizer, lr_scheduler, forward_step,
         # Logging.
         report_memory_flag = training_log(losses_dict, losses_dict_sum,
                                           optimizer.param_groups[0]['lr'],
-                                          iteration, optimizer.loss_scale,
+                                          iteration,
+                                          optimizer.get_loss_scale().item(),
                                           report_memory_flag, skipped_iter)

         # Autoresume
```
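The change above swaps a direct attribute read (`optimizer.loss_scale`) for an accessor call followed by `.item()`, which suggests the new API returns the scale as a tensor-like object rather than a plain number. Below is a minimal, hypothetical sketch of that pattern; the class names and the default scale value are illustrative stand-ins, not Megatron-LM's actual implementation:

```python
class _ScaleTensor:
    """Stand-in for a torch scalar tensor holding the loss scale (hypothetical)."""
    def __init__(self, value):
        self._value = value

    def item(self):
        # Mirrors torch.Tensor.item(): extract the underlying Python number.
        return self._value


class MixedPrecisionOptimizer:
    """Hypothetical optimizer wrapper exposing the accessor-style API."""
    def __init__(self, loss_scale=4096.0):
        self._loss_scale = _ScaleTensor(loss_scale)

    def get_loss_scale(self):
        # New-style API: return the scale object instead of exposing a
        # bare .loss_scale attribute on the optimizer.
        return self._loss_scale


optimizer = MixedPrecisionOptimizer()
# As in the patched logging call in tasks/finetune_utils.py:
loss_scale_value = optimizer.get_loss_scale().item()
print(loss_scale_value)  # 4096.0
```

Logging a plain float (via `.item()`) rather than a tensor avoids keeping a reference to device memory in the logging path.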
