Stars
A simple wrapper class for extracting features (embeddings) and comparing them using BERT in TensorFlow
🔡 Token level embeddings from BERT model on mxnet and gluonnlp
Code from the paper "Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity"
An example program illustrating BERT's masked language model.
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models.
State of the Art Natural Language Processing
Deep Keyphrase Extraction using BERT
BlueBERT, pre-trained on PubMed abstracts and clinical notes (MIMIC-III).
Evolution of word vectors from long, sparse, and one-hot to short, dense, and context-sensitive
💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
Builds a wordpiece (subword) vocabulary compatible with Google Research's BERT
A tool for extracting plain text from Wikipedia dumps
Code for using and evaluating SpanBERT.
MASS: Masked Sequence to Sequence Pre-training for Language Generation
Auxiliary GAN for WE post-specialisation
Exploring Neural Text Simplification
Lexical Simplification with Pretrained Encoders
ClinicalBERT: Modeling Clinical Notes and Predicting Hospital Readmission (CHIL 2020 Workshop)
repository for Publicly Available Clinical BERT Embeddings
🏄 Scalable embedding, reasoning, ranking for images and sentences with CLIP
Bioinformatics'2020: BioBERT: a pre-trained biomedical language representation model for biomedical text mining
A neural named entity recognition and multi-type normalization tool for biomedical text mining
System for Medical Concept Extraction and Linking