- Huazhong University of Science and Technology (华中科技大学, HUST)
- Wuhan, Hubei, China
Stars
Label your own datasets, then train, evaluate, test, and deploy your own AI algorithms.
Translate PDF, EPUB, webpage, metadata, annotations, and notes into the target language. Supports 20+ translation services.
A simple frontend for LLMs built in React Native.
YOLOv10 trained on DocLayNet dataset.
Unified Efficient Fine-Tuning of 100+ LLMs (ACL 2024)
A sensitive-word lexicon for identifying and filtering inappropriate or unsuitable language in text content, protecting users from harmful information and keeping the communication environment healthy.
A throughput-oriented high-performance serving framework for LLMs
The paper list of the 86-page paper "The Rise and Potential of Large Language Model Based Agents: A Survey" by Zhiheng Xi et al.
Benchmarking large language models' complex reasoning ability with chain-of-thought prompting
An easy-to-use LLM quantization package with user-friendly APIs, based on the GPTQ algorithm.
AutoAWQ implements the AWQ algorithm for 4-bit quantization with a 2x speedup during inference. Documentation:
A programming framework for agentic AI 🤖
Implementation of the algorithm detailed in ElKishky, Ahmed, et al. "Scalable topical phrase mining from text corpora."
A Pac-Man game based on HTML5, a classic game-development sample (Pacman based on HTML5).
A simulated world consisting of several autonomous, unpredictable AI agents based on GPT.
Langchain-Chatchat (formerly Langchain-ChatGLM): RAG and Agent applications built on Langchain with language models such as ChatGLM, Qwen, and Llama…
A collection of NLP competition strategy implementations, per-task baselines, competition experience write-ups (current, past, and practice contests), NLP conference deadlines, popular self-media accounts, GPU recommendations, and more; continuously updated.
Fine-tuning ChatGLM-6B with PEFT | 基于 PEFT 的高效 ChatGLM 微调
A PyTorch implementation of "Domain-Adaptive Few-Shot Learning".
DTWNet: a Dynamic Time Warping Network
Smiley Sans (得意黑): a Chinese sans-serif typeface that seeks a balance between humanist feel and geometric character.
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
A high-performance, zero-overhead, extensible Python compiler using LLVM
GPU-accelerated force graph layout and rendering
Making large AI models cheaper, faster and more accessible
Real Transformer TeraFLOPS measured on various GPUs.