Describe the bug
It appears that imbalances in the distillation loss weights have a significant impact on training speed: when I set all three weights to 1, training runs twice as fast as when I set lm_loss to 1 and the other two weights to 0.1.
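For context, here is roughly what I understand the weighted objective to look like. The function and argument names below are my own placeholders, not the actual distill-gpt-neox code, and the choice of a KL soft-target term plus a hidden-state term is an assumption for illustration. The point is that the weights are plain scalar multipliers over losses that are computed anyway, so I would not expect them to change step time.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits,
                      student_hidden, teacher_hidden, labels,
                      w_lm=1.0, w_soft=0.1, w_hidden=0.1, temperature=1.0):
    # Hard-label language-modeling loss (cross-entropy against ground truth).
    lm_loss = F.cross_entropy(
        student_logits.view(-1, student_logits.size(-1)), labels.view(-1)
    )
    # Soft-target loss: KL divergence between temperature-softened
    # teacher and student distributions.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Hidden-state matching loss (assumes the shapes already agree).
    hidden_loss = F.mse_loss(student_hidden, teacher_hidden)
    # The weights only scale already-computed terms, so changing them
    # should not change the amount of work done per step.
    return w_lm * lm_loss + w_soft * soft_loss + w_hidden * hidden_loss
```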
To Reproduce
Run a med-to-small distillation with weights 1, 1, 1
Run a med-to-small distillation with weights 1, 0.5, 0.5
Run a med-to-small distillation with weights 1, 0.1, 0.1, then compare training speed across the three runs (a rough timing harness for these settings is sketched below)
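As a sanity check on whether the weights alone could explain the slowdown, the loss computation can be timed in isolation. This is a rough harness with made-up tensor shapes rather than the real med/small model dimensions, and it reuses the distillation_loss sketch above:

```python
import time
import torch

def time_setting(w_lm, w_soft, w_hidden, steps=50,
                 tokens=2048, vocab=50304, hidden=768):
    # Placeholder tensors standing in for one batch of student/teacher outputs.
    student_logits = torch.randn(tokens, vocab, requires_grad=True)
    teacher_logits = torch.randn(tokens, vocab)
    student_hidden = torch.randn(tokens, hidden, requires_grad=True)
    teacher_hidden = torch.randn(tokens, hidden)
    labels = torch.randint(0, vocab, (tokens,))
    start = time.time()
    for _ in range(steps):
        # Uses the distillation_loss sketch defined earlier in this issue.
        loss = distillation_loss(student_logits, teacher_logits,
                                 student_hidden, teacher_hidden, labels,
                                 w_lm=w_lm, w_soft=w_soft, w_hidden=w_hidden)
        loss.backward()
        student_logits.grad = None
        student_hidden.grad = None
    return (time.time() - start) / steps

# The three weight settings from the reproduction steps above.
for weights in [(1.0, 1.0, 1.0), (1.0, 0.5, 0.5), (1.0, 0.1, 0.1)]:
    print(weights, f"{time_setting(*weights):.4f} s/step")
```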
Expected behavior
I expected the loss weights to have little or no effect on training speed, so I did not expect such a large variation between runs.
Proposed solution
No clue
Environment:
GPUs: EleutherAI V100 cluster
Configs: The defaults in the distill-gpt-neox branch