Fix confidence loss to scale up correctly #33

Open · wants to merge 1 commit into base: main

Commits on Feb 13, 2024

  1. Fix confidence loss to scale up correctly

    Previously, the weighting of the confidence loss would jump from `step_frac` just before `step_frac` crossed `warmup_frac` to `1.0` immediately afterwards.

    For example, if `warmup_frac = 0.1`, then at `step_frac = 0.1` we get `coef = step_frac = 0.1`, but at the very next step `step_frac > warmup_frac`, so `coef` jumps to `1.0`.

    The paper describes this weighting as increasing smoothly. This change makes it increase linearly from 0 to 1 over the first `warmup_frac` of training, as sketched below.
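    A minimal before/after sketch in Python, assuming `coef` is derived from `step_frac` (the fraction of training completed) and `warmup_frac` as described above; the function names are illustrative, not taken from the repository:

    ```python
    def coef_old(step_frac: float, warmup_frac: float) -> float:
        # Old behavior: tracks step_frac during warmup, then jumps to 1.0,
        # which is discontinuous at step_frac == warmup_frac.
        return step_frac if step_frac <= warmup_frac else 1.0

    def coef_new(step_frac: float, warmup_frac: float) -> float:
        # Fixed behavior: rescale by warmup_frac so the coefficient rises
        # linearly from 0 to 1 over the warmup, then stays at 1.0.
        return min(step_frac / warmup_frac, 1.0)

    # With warmup_frac = 0.1: the old coefficient is 0.1 at step_frac = 0.1
    # and 1.0 one step later; the new one reaches 1.0 smoothly at 0.1.
    assert coef_old(0.1, 0.1) == 0.1
    assert coef_old(0.11, 0.1) == 1.0
    assert coef_new(0.05, 0.1) == 0.5
    assert coef_new(0.1, 0.1) == 1.0
    ```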
    RobertKirk committed Feb 13, 2024 · commit bfed162