New York University
New York, New York
Stars
Cool Python features for machine learning that I used to be too afraid to use. Will be updated as I have more time / learn more.
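The repository's contents aren't shown here, but as one illustrative sketch (not taken from the repo), generators are a frequently cited "cool Python feature" for machine learning: they yield mini-batches lazily instead of materializing the whole dataset in memory.

```python
def batches(data, batch_size):
    """Yield successive mini-batches from a sequence.

    Illustrative example only; the function name and signature
    are assumptions, not taken from the starred repository.
    """
    for i in range(0, len(data), batch_size):
        # Slicing past the end is safe: the last batch is simply shorter.
        yield data[i:i + batch_size]

for batch in batches(list(range(10)), batch_size=4):
    print(batch)
# → [0, 1, 2, 3]
#   [4, 5, 6, 7]
#   [8, 9]
```

Because the function is a generator, iteration consumes one batch at a time, which is why this pattern scales to datasets too large to hold in memory.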
PyTorch implementation of "Get To The Point: Summarization with Pointer-Generator Networks"
A PyTorch implementation of fine-tuning GPT-2 (Generative Pre-trained Transformer 2) for dialogue generation.
Analyzing OOD detection in text.
A framework for training and evaluating AI models on a variety of openly available dialogue datasets.
NYU PSYCH-GA 3405.002 / DS-GS 3001.006: Computational cognitive modeling
Train transformer language models with reinforcement learning.
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Code for "Putting Evaluation in Context: Contextual Embeddings Improve Machine Translation Evaluation"
Code for the paper "Learning an Unreferenced Metric for Online Dialogue Evaluation", ACL 2020
EMNLP 2020: "Dialogue Response Ranking Training with Large-Scale Human Feedback Data"
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Pretrain, finetune, and deploy AI models on multiple GPUs and TPUs with zero code changes.
A Collection of Variational Autoencoders (VAE) in PyTorch.
An IPython notebook explaining the concepts of Variational Autoencoders and building one using Keras to generate new faces.
Temporal SimCLR - A Variation of SimCLR for Video Datasets
PyTorch Implementation of Fully Convolutional Networks. (Training code to reproduce the original result is available.)
Datasets, Transforms and Models specific to Computer Vision
A Python module for programmatically getting the status of NVIDIA GPUs via nvidia-smi.
PyTorch-based Faster R-CNN for custom datasets.
Python utilities for reading the STL-10 dataset: https://cs.stanford.edu/~acoates/stl10/
SimCLRv2 - Big Self-Supervised Models are Strong Semi-Supervised Learners
A smaller subset of 10 easily classified classes from Imagenet, and a little more French
A large scale study of Knowledge Distillation.