🦜🔗 Build context-aware reasoning applications
Source code for paper: HiNet: Novel Multi-Scenario & Multi-Task Learning with Hierarchical Information Extraction
The simplest, fastest repository for training/finetuning medium-sized GPTs.
Easy-to-use, modular, and extensible package of deep-learning-based CTR models.
Pre-trained Chinese ELECTRA models (Chinese ELECTRA pre-trained models)
State-of-the-Art Text Embeddings
A recommender system under development in TensorFlow 2. Algorithms: UserCF, ItemCF, LFM, SLIM, GMF, MLP, NeuMF, FM, DeepFM, MKR, RippleNet, KGCN, and more.
TensorFlow code and pre-trained models for BERT
A PyTorch-based knowledge distillation toolkit for natural language processing
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
Implementation of BERT that can load the official pre-trained models for feature extraction and prediction
All kinds of text classification models, and more, built with deep learning