
Changing distillation weights changes runtime #372

Closed
StellaAthena opened this issue Jul 10, 2021 · 1 comment
StellaAthena (Member) commented:

Describe the bug
It appears that imbalances in the distillation weights have a significant impact on runtime. When I set all three weights equal to 1, training runs twice as fast as when I set lm_loss to 1 and the other two weights to 0.1.
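
For context, the weights just scale the individual loss terms before they are summed. Below is a minimal sketch of that combination; the names kd_loss and hidden_loss are my guesses for the other two terms, not identifiers from the distill-gpt-neox branch. A scalar multiply like this should be essentially free, which is what makes the runtime difference surprising.

```python
# Minimal sketch of a weighted distillation objective, assuming three loss
# terms; kd_loss and hidden_loss are hypothetical names, not taken from
# the distill-gpt-neox branch.
def combined_loss(lm_loss, kd_loss, hidden_loss,
                  lm_weight=1.0, kd_weight=1.0, hidden_weight=1.0):
    # Each weight is a plain scalar multiply, so changing 1.0 to 0.1
    # should not, by itself, change how long a training step takes.
    return (lm_weight * lm_loss
            + kd_weight * kd_loss
            + hidden_weight * hidden_loss)
```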

To Reproduce

  1. Run a med-to-small distillation with weights 1, 1, 1
  2. Run a med-to-small distillation with weights 1, 0.5, 0.5
  3. Run a med-to-small distillation with weights 1, 0.1, 0.1, and compare step times across the three runs (a timing sketch follows the list)
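
One way to make the comparison concrete is to time a fixed number of steps per weight setting. In this sketch, train_step is a stand-in for whatever callable runs one distillation step; it is not a function from the repo.

```python
import time

WEIGHT_SETTINGS = [(1.0, 1.0, 1.0), (1.0, 0.5, 0.5), (1.0, 0.1, 0.1)]

def mean_step_time(train_step, weights, n_steps=100):
    train_step(weights)  # warm-up step so startup cost is excluded
    start = time.perf_counter()
    for _ in range(n_steps):
        train_step(weights)
    return (time.perf_counter() - start) / n_steps

# Usage (train_step is hypothetical):
# for w in WEIGHT_SETTINGS:
#     print(w, mean_step_time(train_step, w))
```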

Expected behavior
I did not expect the choice of loss weights to cause such large variation in runtime.

Proposed solution
No clue

Screenshots

[screenshot]

Environment (please complete the following information):

  • GPUs: EleutherAI V100 cluster
  • Configs: The defaults in the distill-gpt-neox branch


StellaAthena added the bug label on Jul 10, 2021
StellaAthena (Member, Author) commented:

Abandoning distilling efforts
