Stars
<⚡️> SuperAGI - A dev-first open source autonomous AI agent framework. Enabling developers to build, manage & run useful autonomous agents quickly and reliably.
Repository for the CommonLit Ease of Readability Corpus
Demonstration of accelerating GPT-J with DeepSpeed Inference and deploying on AWS SageMaker
API for the GPT-J language model 🦜, including a FastAPI backend and a Streamlit frontend
Code for the ACL-IJCNLP 2021 paper "Zero-shot Fact Verification by Claim Generation"
finetunej / transformers
Forked from huggingface/transformers. 🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0.
DeepCT and HDCT use BERT to generate novel, context-aware bag-of-words term weights for documents and queries.
Extractive summarizer using BertSum as the summarization model
MixText: Linguistically-Informed Interpolation of Hidden Space for Semi-Supervised Text Classification
Meta learning with BERT as a learner
Python package for AutoML on Tabular Data with Feature Engineering, Hyper-Parameter Tuning, Explanations and Automatic Documentation
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Python implementation of TextRank algorithms ("textgraphs") for phrase extraction
UmBERTo: an Italian Language Model trained with Whole Word Masking.
Language model trained on a wiki corpus (500M tokens) with fastai v1; accuracy > 42.3%, vocabulary size 60K
ALBERT model pretraining and fine-tuning using TF 2.0
Cross-domain NER using cross-domain language modeling, code for ACL 2019 paper
Official implementations for various pre-training models of ERNIE-family, covering topics of Language Understanding & Generation, Multimodal Understanding & Generation, and beyond.
Code for the paper "Fine-tune BERT for Extractive Summarization"
Implementation of "Improving Neural Question Generation Using Answer Separation" by Yanghoon Kim et al., AAAI 2019
BERT, which stands for Bidirectional Encoder Representations from Transformers, is the state of the art in transfer learning for NLP.