Stars
Big Data's open seminars: An Interactive Introduction to Reinforcement Learning
STAPLER (Shared TCR And Peptide Language bidirectional Encoder Representations from transformers) is a language model that uses a joint TCRab-peptide input to predict TCRab-peptide specificity.
MHC class II-peptide interaction prediction: binding and presentation.
Large language modeling applied to T-cell receptor (TCR) sequences.
The simplest, fastest repository for training/finetuning medium-sized GPTs.
Code for the paper "Language Models are Unsupervised Multitask Learners"
A minimal PyTorch re-implementation of OpenAI GPT (Generative Pretrained Transformer) training.
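The GPT repositories above all build on the transformer's attention mechanism. As a minimal, framework-free sketch (illustrative only, not code from any of these repos), scaled dot-product attention over plain Python lists looks like this:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(queries, keys, values):
    """Scaled dot-product attention.

    queries, keys: lists of d-dimensional vectors; values: one vector per key.
    Each query produces a softmax-weighted average of the values,
    weighted by the (scaled) dot product between the query and each key.
    """
    d = len(keys[0])
    outputs = []
    for q in queries:
        # Similarity of this query to every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in keys]
        weights = softmax(scores)
        # Weighted average of the value vectors.
        out = [sum(w * v[j] for w, v in zip(weights, values))
               for j in range(len(values[0]))]
        outputs.append(out)
    return outputs
```

In a full GPT block, queries, keys, and values are learned linear projections of the token embeddings, and a causal mask keeps each position from attending to later ones; this sketch omits both.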