- Carnegie Mellon University
- Pittsburgh
- https://aashiqmuhamed.github.io/
- @AashiqMuhamed
Starred repositories
- Python 3.8+ toolbox for submitting jobs to Slurm
- Evaluation framework for your Retrieval Augmented Generation (RAG) pipelines
- Code to reproduce the experiments in the paper "Training on the Test Task Confounds Evaluation and Emergence"
- Influence Functions with (Eigenvalue-corrected) Kronecker-Factored Approximate Curvature
- VLHub is a PyTorch framework for training and evaluating vision-language models
- The RedPajama-Data repository contains code for preparing large datasets for training large language models
- Official implementation of Goldfish Loss: Mitigating Memorization in Generative LLMs
- Official code for the paper "Does CLIP's Generalization Performance Mainly Stem from High Train-Test Similarity?" (ICLR 2024)
- Generalization Beyond Data Imbalance: A Controlled Study on CLIP for Transferable Insights
- Official implementation of the paper "Finetuned Multimodal Language Models are High-Quality Image-Text Data Filters"
- Custom data types and layouts for training and inference
- Curation/training code, metadata, distribution, and pre-trained models for MetaCLIP (ICLR 2024 Spotlight); MoDE: CLIP Data Experts via Clustering (CVPR 2024)
- Improving Alignment and Robustness with Circuit Breakers
- GRASS: Compute Efficient Low-Memory LLM Training with Structured Sparse Gradients
- Codebase for decoding compressed trust
- A framework for serving and evaluating LLM routers: save LLM costs without compromising quality
- Data and code for the preprint "In-Context Learning with Long-Context Models: An In-Depth Exploration"
- In-BoXBART: Get Instructions into Biomedical Multi-task Learning
- LLM-Pruner: On the Structural Pruning of Large Language Models (NeurIPS 2023); supports LLaMA, Llama-2, BLOOM, Vicuna, Baichuan, etc.