Stars
A curated collection of open-source Chinese large language models, focusing on smaller-scale models that can be privately deployed at low training cost, covering base models, vertical-domain fine-tunes and applications, datasets, and tutorials.
闻达 (Wenda): an LLM invocation platform. It targets efficient content generation for specific environments, while accounting for the limited compute resources of individuals and small-to-medium businesses, as well as knowledge security and privacy concerns.
[NeurIPS 2023] Tree of Thoughts: Deliberate Problem Solving with Large Language Models
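The core idea of Tree of Thoughts is to search over partial "thought" sequences instead of decoding greedily. A minimal breadth-first sketch, where `propose` and `score` are hypothetical stand-ins for the paper's LLM calls (replaced here by deterministic toy functions so the sketch runs):

```python
# Toy Tree-of-Thoughts breadth-first search.
# In the paper, both functions below are LLM prompts; here they are
# hypothetical deterministic stand-ins for illustration only.

def propose(state):
    # LLM would propose candidate next "thoughts"; we branch on two tokens.
    return [state + [c] for c in ("a", "b")]

def score(state):
    # LLM would rate how promising a partial solution is; we count "a"s.
    return sum(1 for s in state if s == "a")

def tree_of_thoughts_bfs(depth=3, beam=2):
    frontier = [[]]
    for _ in range(depth):
        # Expand every frontier state, then keep only the `beam` best.
        candidates = [s for state in frontier for s in propose(state)]
        candidates.sort(key=score, reverse=True)
        frontier = candidates[:beam]
    return max(frontier, key=score)

print(tree_of_thoughts_bfs())  # → ['a', 'a', 'a']
```

The beam width and depth correspond to the paper's breadth limit and thought-step count; swapping the toy functions for LLM calls recovers the method's shape.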
The official GitHub page for the survey paper "A Survey of Large Language Models".
Official implementation for "Automatic Chain of Thought Prompting in Large Language Models" (stay tuned & more will be updated)
awesome grounding: A curated list of research papers in visual grounding
A collection of industry practice articles on search, recommendation, advertising, and user growth (sources: Zhihu, DataFunTalk, and technical WeChat public accounts).
pycorrector is a toolkit for text error correction. It applies models such as Kenlm, T5, MacBERT, ChatGLM3, and LLaMA to error-correction scenarios, and works out of the box.
A practical interactive interface for LLMs such as GPT and GLM, specially optimized for paper reading, polishing, and writing. Modular design with customizable shortcut buttons and function plugins; supports project analysis and self-translation for Python, C++, and other codebases; PDF/LaTeX paper translation and summarization; parallel queries to multiple LLMs; and local models such as chatglm3. Integrates Tongyi Qianwen, deepseekcoder, iFlytek Spark, ERNIE Bot, llama2, rwkv, claude2, m…
Fengshenbang-LM is an open-source large-model ecosystem led by the Cognitive Computing and Natural Language Research Center at IDEA Research, serving as infrastructure for Chinese AIGC and cognitive intelligence.
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
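PEFT's flagship technique, LoRA, freezes a pretrained weight matrix W and learns only a low-rank update, so the effective weight is W + (alpha / r) · B·A. A dependency-free sketch of that forward pass (toy matrices, not PEFT's actual API):

```python
# Conceptual LoRA forward pass: y = W x + (alpha / r) * B (A x).
# W stays frozen; only the small A (r x k) and B (d x r) are trained.
# Pure-Python linear algebra, for illustration only.

def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def lora_forward(x, W, A, B, alpha=2.0, r=1):
    base = matvec(W, x)              # frozen pretrained path
    delta = matvec(B, matvec(A, x))  # low-rank adapter path
    scale = alpha / r
    return [b + scale * d for b, d in zip(base, delta)]

W = [[1.0, 0.0], [0.0, 1.0]]  # frozen 2x2 weight
A = [[1.0, 1.0]]              # r x k = 1 x 2
B = [[0.5], [0.0]]            # d x r = 2 x 1
print(lora_forward([1.0, 2.0], W, A, B))  # → [4.0, 2.0]
```

The parameter saving is the point: training touches only r·(d + k) adapter weights instead of d·k full weights.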
ChatGLM2-6B: An Open Bilingual Chat LLM
Aspect-Based Sentiment Analysis: fine-grained sentiment analysis at the aspect level
Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm model series)
Sentiment analysis, text classification, text augmentation, text adversarial defense, and more.
Code for the NAACL 2019 paper "BERT Post-Training for Review Reading Comprehension and Aspect-based Sentiment Analysis"
Chinese NER (Named Entity Recognition) using BERT with Softmax, CRF, and Span decoding heads
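The CRF head in BERT-CRF NER decodes with the Viterbi algorithm: given per-token emission scores and tag-to-tag transition scores, it finds the single highest-scoring tag sequence. A self-contained sketch with toy scores (not the repo's code):

```python
# Viterbi decoding as used by a CRF tagging head, on toy scores.
# emissions[t][j]: score of tag j at token t; transitions[i][j]: score
# of moving from tag i to tag j. For illustration only.

def viterbi(emissions, transitions):
    n_tags = len(emissions[0])
    score = list(emissions[0])
    back = []
    for em in emissions[1:]:
        new_score, ptr = [], []
        for j in range(n_tags):
            best_i = max(range(n_tags),
                         key=lambda i: score[i] + transitions[i][j])
            new_score.append(score[best_i] + transitions[best_i][j] + em[j])
            ptr.append(best_i)
        score, back = new_score, back + [ptr]
    # Backtrack from the best final tag.
    best = max(range(n_tags), key=lambda j: score[j])
    path = [best]
    for ptr in reversed(back):
        best = ptr[best]
        path.append(best)
    return list(reversed(path))

# tags: 0 = O, 1 = B-PER, 2 = I-PER
emissions = [[0.1, 2.0, 0.0],    # token 1 looks like B-PER
             [0.2, 0.0, 1.5],    # token 2 looks like I-PER
             [1.0, 0.1, 0.0]]    # token 3 looks like O
transitions = [[0.0, 0.0, -9.0], # O -> I-PER is forbidden
               [0.0, 0.0, 1.0],  # B-PER -> I-PER is encouraged
               [0.0, 0.0, 0.5]]
print(viterbi(emissions, transitions))  # → [1, 2, 0]
```

This is why a CRF beats per-token Softmax on NER: illegal sequences like O → I-PER can be priced out via the transition matrix.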
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
A PyTorch Implementation of Neural IMage Assessment
CLIP (Contrastive Language-Image Pretraining), Predict the most relevant text snippet given an image
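At inference time, CLIP-style retrieval reduces to embedding the image and each candidate text into a shared space and picking the text with the highest cosine similarity. A sketch with toy embeddings standing in for real encoder outputs (not OpenAI's CLIP API):

```python
import math

# CLIP-style zero-shot retrieval: rank text snippets by cosine
# similarity to an image embedding. Vectors are toy stand-ins for
# actual image/text encoder outputs.

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def most_relevant(image_emb, text_embs):
    # Index of the best-matching text snippet.
    return max(range(len(text_embs)),
               key=lambda i: cosine(image_emb, text_embs[i]))

image = [0.9, 0.1]
texts = [[1.0, 0.0],   # e.g. "a photo of a cat"
         [0.0, 1.0]]   # e.g. "a photo of a dog"
print(most_relevant(image, texts))  # → 0
```

Real CLIP normalizes embeddings and applies a learned temperature before the softmax, but the argmax over similarities is the same.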
Optimized BERT transformer inference on NVIDIA GPUs. https://arxiv.org/abs/2210.03052
🛠 A lightweight C++ toolkit of AI models, with support for ONNXRuntime, MNN, TNN, NCNN, and TensorRT.
[EE451] Using CUDA to Accelerate Multi-head Attention in Vision Transformers
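The operation being accelerated is scaled dot-product attention, softmax(Q Kᵀ / √d) V, computed once per head. As a reference point, a plain-Python sketch for a single head on toy 2-d embeddings (not the repo's CUDA kernels):

```python
import math

# Single-head scaled dot-product attention on toy values:
# out = softmax(Q K^T / sqrt(d)) V. Multi-head attention runs this
# in parallel on d/h-dimensional slices, which is what the CUDA
# version parallelizes.

def softmax(xs):
    m = max(xs)                       # subtract max for stability
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(Q, K, V):
    d = len(Q[0])
    out = []
    for q in Q:
        scores = [sum(a * b for a, b in zip(q, k)) / math.sqrt(d)
                  for k in K]
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

Q = [[1.0, 0.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
print(attention(Q, K, V))  # ≈ [[1.66, 2.66]]
```

Each output row is a convex combination of the value rows, weighted by how well the query matches each key.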
Samples for CUDA developers demonstrating features in the CUDA Toolkit