GPT-NeoX

An implementation of model parallel GPT-3-like models on GPUs, based on the DeepSpeed library. Designed to train models with hundreds of billions of parameters or more.

Requirements

$ pip install -r requirements.txt

Test DeepSpeed locally

$ deepspeed train_enwik8.py \
	--deepspeed \
	--deepspeed_config ./configs/base_deepspeed.json
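
The contents of ./configs/base_deepspeed.json are not shown here. As a rough sketch, a DeepSpeed config for a run like this might use standard DeepSpeed options such as the batch size, optimizer, fp16, and ZeRO settings. All of the values below are illustrative assumptions, not the repository's actual configuration (DeepSpeed configs are plain JSON, so the hedging lives in this paragraph rather than in inline comments):

{
  "train_batch_size": 256,
  "gradient_accumulation_steps": 1,
  "optimizer": {
    "type": "Adam",
    "params": {
      "lr": 0.0006,
      "betas": [0.9, 0.95]
    }
  },
  "fp16": {
    "enabled": true
  },
  "zero_optimization": {
    "stage": 1
  }
}

The --deepspeed flag enables the DeepSpeed engine in the training script, and --deepspeed_config points it at a JSON file like the one above; consult the DeepSpeed configuration documentation for the full set of supported keys.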