Negative perplexity values #1595
Hi! Sorry, I've been meaning to more clearly document the sample log formats and the semantic meaning of per-sample metrics, e.g. for perplexity. The relevant per-sample aggregation is here: lm-evaluation-harness/lm_eval/api/metrics.py, lines 36 to 38, at dc90fec.
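As a sketch of what those lines do (the exact source may differ; `mean` is assumed here as a simple arithmetic mean): the per-sample values logged by the harness are log-likelihoods, which are negative for any probability below 1, and "perplexity" only becomes positive at aggregation time, when the negated mean is exponentiated.

```python
import math

def mean(items):
    """Arithmetic mean of a list of numbers."""
    return sum(items) / len(items)

def perplexity(items):
    # Per-sample items are log-likelihoods (negative numbers).
    # Perplexity is exp of the negated mean, so the aggregate is positive.
    return math.exp(-mean(items))

# Illustrative per-sample log-likelihoods, like those written by --log_samples
loglikelihoods = [-2.3, -1.7, -3.1]
print(perplexity(loglikelihoods))
```

So negative per-sample values are expected: they are log-likelihoods, not per-sample perplexities.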
Thanks for clarifying! Is there any quick way to get that too?
Ah, unfortunately not currently, though you can tell based on the
Ah, I see. Unfortunately, I need access to the
Hey folks,

The perplexity values on a per-sample/doc basis are all negative. Can someone explain why that is? This is using the --log-samples option.

Command:

Sample output: