LR scheduler fix no longer breaks inference (#1060)
* Add lr_scheduler check for inference.

Signed-off-by: Dashiell Stander <[email protected]>

* Update NeoXArgs docs automatically

---------

Signed-off-by: Dashiell Stander <[email protected]>
Co-authored-by: github-actions <[email protected]>
dashstander and github-actions committed Oct 17, 2023
1 parent f6ac04d commit e001a04
Showing 2 changed files with 4 additions and 2 deletions.
2 changes: 1 addition & 1 deletion configs/neox_arguments.md
@@ -111,7 +111,7 @@ Logging Arguments
 
 - **git_hash**: str
 
-    Default = ec71f71
+    Default = a97bd1f
 
     current git hash of repository
4 changes: 3 additions & 1 deletion megatron/training.py
@@ -686,7 +686,9 @@ def setup_model_and_optimizer(neox_args, use_cache=False, iteration=None):
         neox_args.iteration = 0
 
     # need this for correct lr scheduling resume from ckpt
-    lr_scheduler.optimizer = model.optimizer
+    # but it will not exist if this is being called for inference
+    if lr_scheduler is not None:
+        lr_scheduler.optimizer = model.optimizer
 
     return model, optimizer, lr_scheduler
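
For context, here is a minimal runnable sketch of the guard this commit introduces. Only the names setup_model_and_optimizer, neox_args, use_cache, iteration, model.optimizer, and lr_scheduler.optimizer come from the diff above; the stub classes and the setup logic are hypothetical stand-ins, not the actual GPT-NeoX implementation.

    from typing import Any, Optional, Tuple


    class _Model:
        """Stand-in for the wrapped model, which owns the live optimizer."""

        def __init__(self, optimizer: Optional[Any]) -> None:
            self.optimizer = optimizer


    class _Scheduler:
        """Stand-in for the LR scheduler, which holds an optimizer reference."""

        def __init__(self, optimizer: Any) -> None:
            self.optimizer = optimizer


    def setup_model_and_optimizer(
        neox_args: Any, use_cache: bool = False, iteration: Optional[int] = None
    ) -> Tuple[_Model, Optional[Any], Optional[_Scheduler]]:
        # Hypothetical setup: a training call builds an optimizer and a
        # scheduler, while an inference call (use_cache=True) builds neither.
        optimizer = object() if not use_cache else None
        model = _Model(optimizer)
        lr_scheduler = _Scheduler(optimizer) if optimizer is not None else None

        # need this for correct lr scheduling resume from ckpt
        # but it will not exist if this is being called for inference
        if lr_scheduler is not None:
            lr_scheduler.optimizer = model.optimizer

        return model, optimizer, lr_scheduler

Without the None check, an inference call would raise AttributeError ("'NoneType' object has no attribute 'optimizer'") on the unconditional lr_scheduler.optimizer assignment, which is the breakage the commit title refers to.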

