- Alibaba
- Hangzhou, China
- chenhedong.cn
Stars
Chinese version of GPT2 training code, using BERT tokenizer.
A PyTorch implementation of BiLSTM/BERT/RoBERTa (+CRF) models for Named Entity Recognition.
Chinese-LLaMA 1&2 and Chinese-Falcon base models; ChatFlow Chinese dialogue model; Chinese OpenLLaMA model; NLP pre-training / instruction fine-tuning datasets
Fine-tuning ChatGLM-6B with PEFT (efficient ChatGLM fine-tuning based on PEFT); a minimal LoRA sketch follows this list.
📋 A list of open LLMs available for commercial use.
Awesome-LLM: a curated list of Large Language Models
Offsite-Tuning: Transfer Learning without Full Model
An optimized deep prompt tuning strategy comparable to fine-tuning across scales and tasks (a prefix-tuning sketch follows this list)
An easy-to-use federated learning platform
Code for the EMNLP 2019 paper "Text Summarization with Pretrained Encoders"
FactSumm: Factual Consistency Scorer for Abstractive Summarization
An Open-Source Package for Knowledge Embedding (KE)
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX; a minimal pipeline sketch follows this list.
Code & data accompanying the ICLR 2020 paper "Reinforcement Learning Based Graph-to-Sequence Model for Natural Question Generation"
A summary of must-read papers for Neural Question Generation (NQG)
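For the 🤗 Transformers entry above, a minimal sketch of the `pipeline` API; the checkpoint name is an assumed example and any sentiment-analysis model on the Hub would do.

```python
# Minimal 🤗 Transformers pipeline sketch; the checkpoint below is an assumed
# example and can be swapped for any sentiment-analysis model on the Hub.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",  # assumed checkpoint
)
print(classifier("Pretrained encoders make downstream NLP tasks much easier."))
```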
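For the ChatGLM-6B + PEFT entry, a hedged LoRA fine-tuning sketch, not the repo's own script; the Hub id `THUDM/chatglm-6b` and the `query_key_value` module name are assumptions about that checkpoint's layout.

```python
# Hedged sketch of parameter-efficient fine-tuning with PEFT + LoRA.
# "THUDM/chatglm-6b" and target_modules=["query_key_value"] are assumptions
# about the ChatGLM-6B checkpoint and its attention-layer naming.
from transformers import AutoModel, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

model_name = "THUDM/chatglm-6b"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModel.from_pretrained(model_name, trust_remote_code=True)

# Wrap the frozen base model so only the small LoRA adapter matrices are trained.
lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,
    lora_alpha=32,
    lora_dropout=0.1,
    target_modules=["query_key_value"],
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of all parameters
```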
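For the deep prompt tuning entry (P-Tuning v2 style), a sketch using PEFT's prefix tuning, which likewise injects trainable prompts at every layer; the BERT checkpoint and virtual-token count are assumptions, not values from that repo.

```python
# Sketch of deep prompt tuning via PEFT's prefix tuning (trainable prompts at
# every transformer layer); checkpoint and num_virtual_tokens are assumptions.
from transformers import AutoModelForSequenceClassification
from peft import PrefixTuningConfig, TaskType, get_peft_model

base = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)
config = PrefixTuningConfig(task_type=TaskType.SEQ_CLS, num_virtual_tokens=20)
model = get_peft_model(base, config)
model.print_trainable_parameters()  # only the prefix parameters are trainable
```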