Stars
Simple, configuration-driven backup software for servers and workstations
A simple, performant, and scalable JAX LLM!
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Flax is a neural network library for JAX that is designed for flexibility.
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
Simple and efficient PyTorch-native transformer text generation in <1000 lines of Python.
High-quality datasets, tools, and concepts for LLM fine-tuning.
Accelerate local LLM inference and finetuning (LLaMA, Mistral, ChatGLM, Qwen, Mixtral, Gemma, Phi, MiniCPM, Qwen-VL, MiniCPM-V, etc.) on Intel XPU (e.g., local PC with iGPU and NPU, discrete GPU su…
⚡️A Blazing-Fast Python Library for Ranking Evaluation, Comparison, and Fusion 🐍
Multipack distributed sampler for fast padding-free training of LLMs
Open Source Continuous File Synchronization
Finetune Llama 3.2, Mistral, Phi, Qwen 2.5 & Gemma LLMs 2-5x faster with 80% less memory
Rust+OpenCL+AVX2 implementation of LLaMA inference code
A minimal PyTorch re-implementation of OpenAI GPT (Generative Pretrained Transformer) training.
SteamOS session compositing window manager
PRQL is a modern language for transforming data — a simple, powerful, pipelined SQL replacement
The fastest way to develop full-stack web apps with React & Node.js.
📚 Papers & tech blogs by companies sharing their work on data science & machine learning in production.
A concise but complete full-attention transformer with a set of promising experimental features from various papers
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
Boltstream Live Video Streaming Website + Backend
An educational resource to help anyone learn deep reinforcement learning.