Stars
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
A natural language interface for computers
🌟 The Multi-Agent Framework: First AI Software Company, Towards Natural Language Programming
TensorFlow code and pre-trained models for BERT
OpenAssistant is a chat-based assistant that understands tasks, can interact with third-party systems, and can dynamically retrieve information to do so.
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Ray is a unified framework for scaling AI and Python applications. Ray consists of a core distributed runtime and a set of AI Libraries for accelerating ML workloads.
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
Code and documentation to train Stanford's Alpaca models, and generate the data.
A high-throughput and memory-efficient inference and serving engine for LLMs
Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks.
The official Python library for the OpenAI API
Graph Neural Network Library for PyTorch
The ChatGPT Retrieval Plugin lets you easily find personal or work documents by asking questions in natural language.
Universal LLM Deployment Engine with ML Compilation
DSPy: The framework for programming, rather than prompting, foundation models
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
Fast and memory-efficient exact attention
Python package built to ease deep learning on graphs, on top of existing DL frameworks.
SWE-agent takes a GitHub issue and tries to automatically fix it, using GPT-4, or your LM of choice. It can also be employed for offensive cybersecurity or competitive coding challenges.
Gorilla: Training and Evaluating LLMs for Function Calls (Tool Calls)
A framework for training and evaluating AI models on a variety of openly available dialogue datasets.
Large Language Model Text Generation Inference
Deep universal probabilistic programming with Python and PyTorch
Hackable and optimized Transformers building blocks, supporting a composable construction.
An Autonomous LLM Agent for Complex Task Solving
Stanford NLP Python library for tokenization, sentence segmentation, NER, and parsing of many human languages
The easiest way to use deep metric learning in your application. Modular, flexible, and extensible. Written in PyTorch.