Popular repositories
- finetune-gpt2xl (Python), forked from Xirider/finetune-gpt2xl
Guide: Finetune GPT2-XL (1.5 billion parameters) and GPT-Neo (2.7B) on a single GPU with Hugging Face Transformers using DeepSpeed
- gpt-neox (Python), forked from EleutherAI/gpt-neox
An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.
- Transformers4Rec (Python), forked from NVIDIA-Merlin/Transformers4Rec
Transformers4Rec is a flexible and efficient library for sequential and session-based recommendation, available for both PyTorch and TensorFlow.
- recommenders (Python), forked from recommenders-team/recommenders
Best Practices on Recommendation Systems
- horovod (Python), forked from horovod/horovod
Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet.
- argo-workflows (Go), forked from argoproj/argo-workflows
Workflow engine for Kubernetes
Repositories
- superinsight-db, forked from VimMaster80/superinsight-db
Relational Database for Unstructured Data
- gpt-neox, forked from EleutherAI/gpt-neox
An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.
- horovod, forked from horovod/horovod
Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet.
- recommenders, forked from recommenders-team/recommenders
Best Practices on Recommendation Systems
- Transformers4Rec, forked from NVIDIA-Merlin/Transformers4Rec
Transformers4Rec is a flexible and efficient library for sequential and session-based recommendation, available for both PyTorch and TensorFlow.
- finetune-gpt2xl, forked from Xirider/finetune-gpt2xl
Guide: Finetune GPT2-XL (1.5 billion parameters) and GPT-Neo (2.7B) on a single GPU with Hugging Face Transformers using DeepSpeed
People
This organization has no public members. You must be a member to see who is part of this organization.