forked from EleutherAI/gpt-neox

An implementation of model-parallel GPT-3-like models on GPUs, based on the DeepSpeed library. Designed to train models with hundreds of billions of parameters or more.



Inux/gpt-neox