Educational framework exploring ergonomic, lightweight multi-agent orchestration. Managed by OpenAI Solution team.
🌟 The Multi-Agent Framework: First AI Software Company, Towards Natural Language Programming
🐫 CAMEL: Finding the Scaling Law of Agents. A multi-agent framework. https://www.camel-ai.org
Create Customized Software using Natural Language Idea (through LLM-powered Multi-Agent Collaboration)
This Repo is the official implementation of AgentCoder and AgentCoder+.
Author's PyTorch implementation of "Attention-Based Deep Spiking Neural Networks for Temporal Credit Assignment Problems", TNNLS 2023
A cloud-native vector database, storage for next generation AI applications
CodeRAG-Bench: Can Retrieval Augment Code Generation?
Pyserini is a Python toolkit for reproducible information retrieval research with sparse and dense representations.
[TMLR] A curated list of language modeling research for code and related datasets.
Official implementation for the paper: "Code Generation with AlphaCodium: From Prompt Engineering to Flow Engineering"
CodeGeeX: An Open Multilingual Code Generation Model (KDD 2023)
An incremental parsing system for programming tools
CodeGeeX4-ALL-9B, a versatile model for all AI software development scenarios, including code completion, code interpreter, web search, function calling, repository-level Q&A and much more.
Qwen2.5 is the large language model series developed by Qwen team, Alibaba Cloud.
LLMs interview notes and answers: this repository mainly collects interview questions and reference answers for large language model (LLM) algorithm engineers.
Cracking algorithm problems is all about patterns; labuladong is all you need! English version supported! Crack LeetCode, not only how, but also why.
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
A tutorial based on MetaGPT to quickly help you understand the concepts of agents and multi-agents and get started with development. (An introductory multi-agent development tutorial based on MetaGPT.)
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
Unified Efficient Fine-Tuning of 100+ LLMs (ACL 2024)
Chat-甄嬛 is a chat language model that mimics the speaking style of Zhen Huan, built by LoRA fine-tuning ChatGLM2 on all of Zhen Huan's lines and dialogue from the script of "Empresses in the Palace" (甄嬛传).