38 results for starred repositories
A curated collection of open-source Chinese large language models, focusing on smaller models that can be privately deployed and trained at low cost, including base models, domain-specific fine-tunes and applications, datasets, and tutorials.
A Chinese medical ChatGPT based on LLaMa, trained on a large-scale pretraining corpus and a multi-turn dialogue dataset.
A repository for pretraining a small-parameter Chinese LLaMa2 from scratch plus SFT; a single 24 GB GPU is enough to produce a chat-llama2 with basic Chinese question-answering ability.
A medical bot built with Llama2 and Sentence Transformers, powered by Langchain and Chainlit. The bot runs on a modest CPU machine with a minimum of 16 GB of RAM.
TensorFlow code and pre-trained models for BERT
ChatGPT-WechatBot is a chatbot built on the official OpenAI API and dialogue model, deployed on WeChat through a WeChat framework to enable bot chat.