Pipeline components for real-time phylodynamic analysis
Maximum-likelihood inference of time-stamped phylogenies and ancestral reconstruction
Viral genome alignment, mutation calling, clade assignment, quality checks and phylogenetic placement
OATML-Markslab / EVE
Forked from OATML/EVE. Official repository for the paper "Large-scale clinical interpretation of genetic variants using evolutionary data and deep learning". Joint collaboration between the Marks lab and the OATML group.
Official repository for the paper "Learning from pre-pandemic data to forecast viral antibody escape"
ProtTrans provides state-of-the-art pretrained language models for proteins. ProtTrans was trained on thousands of GPUs from Summit and hundreds of Google TPUs using Transformer models.
VESPA is a simple, yet powerful Single Amino Acid Variant (SAV) effect predictor based on embeddings of the Protein Language Model ProtT5.
A PyTorch implementation of the vector-quantized variational autoencoder (https://arxiv.org/abs/1711.00937)
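The core VQ-VAE operation the repo above implements is quantizing each encoder output to its nearest codebook vector. A minimal dependency-free sketch of that lookup, with a hypothetical 2-d codebook (the real implementation works on batched tensors and adds a straight-through gradient estimator):

```python
import math

def quantize(z, codebook):
    """Return the index and entry of the codebook vector nearest to z
    (Euclidean distance) -- the discretization step of a VQ-VAE."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    idx = min(range(len(codebook)), key=lambda i: dist(z, codebook[i]))
    return idx, codebook[idx]

# Hypothetical 4-entry codebook of 2-d embeddings (illustrative values)
codebook = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
idx, entry = quantize((0.9, 0.1), codebook)
# (0.9, 0.1) snaps to index 1, entry (1.0, 0.0)
```

During training, gradients are copied past this non-differentiable argmin with the straight-through trick, which the paper describes.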
Pyro models of SARS-CoV-2 variants
Implementation of the Denoising Diffusion Probabilistic Model in PyTorch
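The forward (noising) process that DDPMs invert can be sampled in closed form, x_t = sqrt(ᾱ_t)·x_0 + sqrt(1−ᾱ_t)·ε with ᾱ_t the cumulative product of (1−β). A stdlib-only sketch, with an assumed linear β schedule (the schedule values here are illustrative, not taken from the repo):

```python
import math
import random

def forward_diffuse(x0, t, betas, rng=random):
    """Sample x_t ~ q(x_t | x_0) in closed form:
    x_t = sqrt(alpha_bar_t) * x0 + sqrt(1 - alpha_bar_t) * eps,
    where alpha_bar_t = prod_{s<=t} (1 - beta_s)."""
    alpha_bar = 1.0
    for beta in betas[: t + 1]:
        alpha_bar *= 1.0 - beta
    return [
        math.sqrt(alpha_bar) * x + math.sqrt(1.0 - alpha_bar) * rng.gauss(0.0, 1.0)
        for x in x0
    ]

# Assumed linear schedule over 1000 steps
betas = [1e-4 + (0.02 - 1e-4) * i / 999 for i in range(1000)]
noisy = forward_diffuse([1.0, -1.0, 0.5], t=999, betas=betas)
# at t = 999, alpha_bar is near zero, so x_t is close to pure Gaussian noise
```

The reverse process trains a network to predict ε from x_t and t; this closed-form sampler is what makes that training loop cheap, since any timestep can be reached in one step.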
Official code for Score-Based Generative Modeling through Stochastic Differential Equations (ICLR 2021, Oral)
Differentiable ODE solvers with full GPU support and O(1)-memory backpropagation.
Time-resolved metagenomic sequencing of Lenski's long-term evolution experiment with Escherichia coli
Implementation of the ICML 2022 paper "Scaling Structured Inference with Randomization"
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
Structured state space sequence models
Flexible components pairing 🤗 Transformers with ⚡ PyTorch Lightning
Open source code for AlphaFold.
Nextstrain build for novel coronavirus SARS-CoV-2
Datasets for https://github.com/nextstrain/nextclade
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
MMseqs2: ultra-fast and sensitive search and clustering suite