ZJWang9928/Lookahead-Optimizer-by-Pytorch

Lookahead Optimizer by PyTorch

A PyTorch implementation of the paper "Lookahead Optimizer: k steps forward, 1 step back".

Pseudocode for Lookahead Optimizer Algorithm:

(figure: pseudocode of the Lookahead algorithm, from the paper)
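The pseudocode above can be sketched in plain Python. The sketch below uses ordinary SGD as the inner optimizer on a one-dimensional quadratic; the names (`slow`, `fast`, `k`, `alpha`) follow the paper's notation. This is an illustration of the algorithm, not this repository's implementation.

```python
# Minimal, framework-free sketch of the Lookahead update rule,
# with plain SGD as the inner optimizer A, minimizing f(x) = (x - 3)^2.

def grad(x):
    # gradient of f(x) = (x - 3)^2
    return 2.0 * (x - 3.0)

def lookahead_sgd(x0, lr=0.1, k=5, alpha=0.5, outer_steps=40):
    slow = x0                       # slow weights (phi)
    for _ in range(outer_steps):
        fast = slow                 # fast weights start from the slow weights
        for _ in range(k):          # k inner optimizer steps
            fast -= lr * grad(fast)
        # slow-weight update: phi <- phi + alpha * (theta_k - phi)
        slow += alpha * (fast - slow)
    return slow

x = lookahead_sgd(x0=0.0)
print(round(x, 4))  # converges toward the minimum at x = 3
```

After each run of k inner steps, the slow weights move a fraction alpha of the way toward the fast weights, and the next inner run restarts from the updated slow weights.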

Usage:

import torch
from lookahead import Lookahead

base_opt = torch.optim.Adam(model.parameters(), lr=1e-3, betas=(0.9, 0.999))  # any base optimizer
lookahead = Lookahead(base_opt, k=5, alpha=0.5)  # wrap it: sync every k steps, slow-weight step size alpha
lookahead.zero_grad()
loss_function(model(input), target).backward()  # your own loss function
lookahead.step()  # fast-weight step; slow weights are updated every k calls

