Stars
Implements quantized distillation. Code for our paper "Model compression via distillation and quantization"
LQ-Nets: Learned Quantization for Highly Accurate and Compact Deep Neural Networks
[WACV 2022] Official code for "DAQ: Channel-Wise Distribution-Aware Quantization for Deep Image Super-Resolution Networks"
Neural Network Quantization With Fractional Bit-widths
A simple Python library to make chained attributes possible.
[ECCV 2022] Patch Similarity Aware Data-Free Quantization for Vision Transformers
Nonuniform-to-Uniform Quantization: Towards Accurate Quantization via Generalized Straight-Through Estimation. In CVPR 2022.
Learning Sparse Neural Networks through L0 regularization
Awesome Knowledge-Distillation. A categorized collection of knowledge distillation papers (2014–2021).
A curated list of neural network pruning resources.
Code for the NeurIPS 2022 paper "Optimal Brain Compression: A Framework for Accurate Post-Training Quantization and Pruning".
A treasure chest for visual classification and recognition powered by PaddlePaddle
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
PyTorch implementation of "Differentiable Soft Quantization: Bridging Full-Precision and Low-Bit Neural Networks"
Zhou Zhihua's *Machine Learning*, known as the "Watermelon Book," is a fairly comprehensive text that covers the different families of machine learning algorithms (e.g., supervised, unsupervised, and semi-supervised learning, reinforcement learning, ensemble methods, dimensionality reduction, feature selection, etc.). This repo records my understanding and extended notes from studying it, in the hope of helping newcomers reading the Watermelon Book!
A list of papers, docs, and code about model quantization. This repo aims to provide a reference for model quantization research and is continuously being improved. PRs adding new works are welcome (p…