Stars
Python repositories, sorted by most stars:
huggingface/transformers - 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX. (A pipeline sketch follows this list.)
google/jax - Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more. (A grad/vmap/jit sketch follows this list.)
Lightning-AI/pytorch-lightning - Pretrain, finetune, and deploy AI models on multiple GPUs and TPUs with zero code changes.
tinygrad/tinygrad - You like pytorch? You like micrograd? You love tinygrad! ❤️
vllm-project/vllm - A high-throughput and memory-efficient inference and serving engine for LLMs. (A generation sketch follows this list.)
openai/gpt-2 - Code for the paper "Language Models are Unsupervised Multitask Learners".
karpathy/minGPT - A minimal PyTorch re-implementation of OpenAI GPT (Generative Pretrained Transformer) training.
NVIDIA/Megatron-LM - Ongoing research training transformer models at scale.
lucidrains/denoising-diffusion-pytorch - Implementation of Denoising Diffusion Probabilistic Models in PyTorch.
EleutherAI/gpt-neox - An implementation of model-parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries.
gd3kr/BlenderGPT - Use commands in English to control Blender with OpenAI's GPT-4.
jaymody/picoGPT - An unnecessarily tiny implementation of GPT-2 in NumPy.
young-geng/EasyLM - Large language models (LLMs) made easy: EasyLM is a one-stop solution for pre-training, finetuning, evaluating, and serving LLMs in JAX/Flax.
lucidrains/lion-pytorch - 🦁 Lion, a new optimizer discovered by Google Brain using genetic algorithms, purportedly better than Adam(W), in PyTorch.
patrick-kidger/equinox - Elegant, easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/
patrick-kidger/diffrax - Numerical differential equation solvers in JAX. Autodifferentiable and GPU-capable. https://docs.kidger.site/diffrax/ (An ODE-solve sketch follows this list.)
TransformerLensOrg/TransformerLens - A library for mechanistic interpretability of GPT-style language models.
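
As a quick illustration of the 🤗 Transformers entry, here is a minimal text-generation sketch using the library's pipeline API; the checkpoint name "gpt2" and the prompt are just examples.

from transformers import pipeline

# Build a text-generation pipeline around an example checkpoint ("gpt2").
generator = pipeline("text-generation", model="gpt2")

# Generate a short continuation; max_new_tokens caps the number of added tokens.
result = generator("Hello, I'm a language model,", max_new_tokens=20)
print(result[0]["generated_text"])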
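
The JAX entry's "composable transformations" are concrete functions: jax.grad, jax.vmap, and jax.jit stack cleanly. A minimal sketch, with a made-up quadratic loss standing in for a real model:

import jax
import jax.numpy as jnp

# Toy loss: squared norm of a linear map (purely illustrative).
def loss(w, x):
    return jnp.sum((x @ w) ** 2)

# Compose the transformations: per-example gradients, batched over x, then compiled.
grad_fn = jax.jit(jax.vmap(jax.grad(loss), in_axes=(None, 0)))

w = jnp.ones(3)
xs = jnp.arange(12.0).reshape(4, 3)  # batch of 4 inputs
print(grad_fn(w, xs).shape)          # (4, 3): one gradient per example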
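
For the vLLM entry, offline batch inference looks roughly like the following; "facebook/opt-125m" is just a small example model.

from vllm import LLM, SamplingParams

# Load an example model and define sampling settings.
llm = LLM(model="facebook/opt-125m")
params = SamplingParams(temperature=0.8, max_tokens=64)

# generate() batches prompts and returns one RequestOutput per prompt.
for out in llm.generate(["The capital of France is"], params):
    print(out.outputs[0].text)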
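
And for Diffrax, solving dy/dt = -y from t=0 to t=1 takes a few lines; the solver choice (Tsit5) and step size are illustrative.

import jax.numpy as jnp
import diffrax

# Vector field for dy/dt = -y.
def vector_field(t, y, args):
    return -y

term = diffrax.ODETerm(vector_field)
sol = diffrax.diffeqsolve(term, diffrax.Tsit5(),
                          t0=0.0, t1=1.0, dt0=0.1, y0=jnp.array(1.0))
print(sol.ys)  # approximately exp(-1) ≈ 0.368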