![google logo](https://raw.githubusercontent.com/github/explore/80688e429a7d4ef2fca1e82350fe8e3517d3494d/topics/google/google.png)
- Carnegie Mellon University
- Pittsburgh, PA
- zorazrw.github.io
Starred repositories
- Data science interview questions and answers
- Official code for VisProg (CVPR 2023 Best Paper!)
- Build, customize, and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our discord community: https://discord.gg/TgHX…
- Rigorous evaluation of LLM-synthesized code (NeurIPS 2023)
- Official JAX implementation of MAGVIT: Masked Generative Video Transformer
- Home of StarCoder: fine-tuning & inference!
- Dense Passage Retriever: a set of tools and models for open-domain Q&A tasks.
- Data and code for "DocPrompting: Generating Code by Retrieving the Docs" @ICLR 2023
- A list of ethics-related resources for researchers and practitioners of Natural Language Processing and Computational Linguistics
- [ICLR 2023] Code for the paper "Binding Language Models in Symbolic Languages"
- [EMNLP'23] Execution-Based Evaluation for Open Domain Code Generation
- [EACL'23] MCoNaLa: A Benchmark for Code Generation from Multiple Natural Languages
- CodeGen is a family of open-source models for program synthesis. Trained on TPU-v4. Competitive with OpenAI Codex.
- CodeGeeX: An Open Multilingual Code Generation Model (KDD 2023)
- ACL 2022: BRIO: Bringing Order to Abstractive Summarization
- 🤗 Evaluate: A library for easily evaluating machine learning models and datasets.
- Convert Machine Learning Code Between Frameworks
- Collection of advice for prospective and current PhD students
- Code for the paper "Evaluating Large Language Models Trained on Code"
- A framework for training and evaluating AI models on a variety of openly available dialogue datasets.
- The unified platform for data-related resources.
- A simple tool to update bib entries with their official information (e.g., DBLP or the ACL Anthology).
- TUTA and ForTaP for Structure-Aware and Numerical-Reasoning-Aware Table Pre-Training