A framework to train a ResUNet architecture, quantize, compile and execute it on an FPGA.
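Since the repo pairs training with FPGA deployment, below is a hedged sketch of the quantize step, assuming the Vitis AI TensorFlow 2 flow (the VitisQuantizer API); the model path and calibration dataset are placeholders, not taken from this repo, and compilation to an FPGA .xmodel would follow via the vai_c_tensorflow2 CLI.

```python
# A hedged sketch assuming the Vitis AI TF2 quantizer; file paths and the
# calibration dataset are placeholders, not taken from this repo.
import tensorflow as tf
from tensorflow_model_optimization.quantization.keras import vitis_quantize

float_model = tf.keras.models.load_model("resunet_float.h5")
calib_ds = tf.data.Dataset.load("calib_data").batch(8)  # a few hundred samples

quantizer = vitis_quantize.VitisQuantizer(float_model)
quantized_model = quantizer.quantize_model(calib_dataset=calib_ds)
quantized_model.save("resunet_quantized.h5")
# The quantized model is then compiled to an FPGA .xmodel with the
# vai_c_tensorflow2 command-line tool.
```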
Research experiments archive for post-training quantization with TensorRT. Submitted to and accepted at IEEE EDGE 2024.
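For orientation, here is a minimal sketch of INT8 post-training quantization with the TensorRT Python API, assuming the model has been exported to ONNX; "model.onnx" and the random calibration array are placeholders where real preprocessed batches belong.

```python
# A hedged sketch: INT8 PTQ with the TensorRT Python API from an ONNX export.
# "model.onnx" and the random calibration data are placeholders.
import numpy as np
import pycuda.autoinit  # noqa: F401 -- creates a CUDA context
import pycuda.driver as cuda
import tensorrt as trt

class NpyCalibrator(trt.IInt8EntropyCalibrator2):
    """Feeds calibration batches to TensorRT from a NumPy array."""

    def __init__(self, data, batch_size=8, cache_file="calib.cache"):
        trt.IInt8EntropyCalibrator2.__init__(self)
        self.data, self.batch_size, self.cache_file = data, batch_size, cache_file
        self.index = 0
        self.device_mem = cuda.mem_alloc(data[:batch_size].nbytes)

    def get_batch_size(self):
        return self.batch_size

    def get_batch(self, names):
        if self.index + self.batch_size > len(self.data):
            return None  # calibration finished
        batch = np.ascontiguousarray(
            self.data[self.index:self.index + self.batch_size])
        cuda.memcpy_htod(self.device_mem, batch)
        self.index += self.batch_size
        return [int(self.device_mem)]

    def read_calibration_cache(self):
        return None  # always recalibrate in this sketch

    def write_calibration_cache(self, cache):
        with open(self.cache_file, "wb") as f:
            f.write(cache)

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)
with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        raise RuntimeError(parser.get_error(0))

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.INT8)
calib_data = np.random.rand(64, 3, 224, 224).astype(np.float32)  # stand-in
config.int8_calibrator = NpyCalibrator(calib_data)

with open("model_int8.engine", "wb") as f:
    f.write(builder.build_serialized_network(network, config))
```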
Sets up a Snapdragon Neural Processing Engine (SNPE) environment to convert protobuf (.pb) files to DLC (deep learning container) files and to quantize neural networks with SNPE. Mobile app development: https://github.com/anshumax/mobilenn
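A hedged sketch of the two steps driven from Python: the tool names come from the SNPE SDK, but the exact flags vary by SDK version and should be verified against your installation; all paths are placeholders.

```python
# A hedged sketch of the SNPE conversion + quantization CLI steps, invoked
# from Python. Flag names follow the SNPE SDK docs but should be verified
# against your SDK version; all file paths are placeholders.
import subprocess

# Convert a frozen TensorFlow graph (.pb) to a deep learning container (.dlc).
subprocess.run([
    "snpe-tensorflow-to-dlc",
    "--input_network", "model.pb",
    "--input_dim", "input", "1,224,224,3",
    "--out_node", "output",
    "--output_path", "model.dlc",
], check=True)

# Quantize the DLC using a list of raw calibration input files.
subprocess.run([
    "snpe-dlc-quantize",
    "--input_dlc", "model.dlc",
    "--input_list", "calib_inputs.txt",
    "--output_dlc", "model_quantized.dlc",
], check=True)
```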
Post-training quantization performed on a model trained with the CLIC dataset.
Quantization for object detection in TensorFlow 2.x.
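The standard TF2 route is full-integer PTQ through the TensorFlow Lite converter, sketched below; the SavedModel path, input resolution, and random representative data are placeholders (detection heads with NMS ops may need to stay in float).

```python
# A minimal sketch of full-integer post-training quantization with the
# TensorFlow Lite converter; "saved_model/" and the random representative
# data are placeholders for a real detector export and calibration images.
import numpy as np
import tensorflow as tf

def representative_dataset():
    # Yield ~100 preprocessed inputs drawn from the training distribution.
    for _ in range(100):
        yield [np.random.rand(1, 320, 320, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_saved_model("saved_model/")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

with open("detector_int8.tflite", "wb") as f:
    f.write(converter.convert())
```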
EfficientNetV2 (EfficientNetV2-B2) with int8 and fp32 quantization (QAT and PTQ) on the CK+ dataset, including fine-tuning, augmentation, and handling of class imbalance.
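A minimal QAT sketch with the TensorFlow Model Optimization Toolkit follows; a small stand-in CNN is used because tfmot's quantize_model does not wrap nested Keras applications models like EfficientNetV2 wholesale (those need per-layer annotation), and the input shape and class count are assumptions.

```python
# A minimal QAT sketch with tfmot; the small stand-in CNN, 48x48 grayscale
# input, and 7 expression classes are assumptions, not the repo's setup.
import tensorflow as tf
import tensorflow_model_optimization as tfmot

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(48, 48, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(7, activation="softmax"),
])

# Wrap the model with fake-quantization nodes, then fine-tune as usual.
qat_model = tfmot.quantization.keras.quantize_model(model)
qat_model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
# Fine-tune with fake-quant nodes in place; class_weight can offset the
# dataset imbalance, e.g.:
# qat_model.fit(train_ds, epochs=5, class_weight=class_weights)
```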
A post-training quantization (PTQ) method for improving LLMs. Unofficial implementation of https://arxiv.org/abs/2309.02784
Comprehensive study on the quantization of various CNN models, employing techniques such as post-training quantization (PTQ) and quantization-aware training (QAT).
This repository discusses a research work published in MDPI Sensors and provides details about the project.
Low-bit (2/4/8/16) post-training quantization for ResNet20.
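The basic primitive behind such low-bit PTQ is symmetric uniform quantization at a configurable bit width, sketched below; the random tensor is a stand-in for a ResNet20 conv weight.

```python
# A minimal sketch of symmetric uniform fake-quantization at a configurable
# bit width; the random tensor is a stand-in for a ResNet20 conv weight.
import torch

def quantize_symmetric(x: torch.Tensor, bits: int) -> torch.Tensor:
    """Fake-quantize x to `bits` bits: quantize, then dequantize."""
    qmax = 2 ** (bits - 1) - 1            # e.g. 7 for 4-bit, 127 for 8-bit
    scale = x.abs().max() / qmax
    return torch.clamp(torch.round(x / scale), -qmax - 1, qmax) * scale

x = torch.randn(64, 3, 3, 3)              # a conv weight tensor
for bits in (2, 4, 8, 16):
    err = (x - quantize_symmetric(x, bits)).abs().mean().item()
    print(f"{bits}-bit mean abs error: {err:.5f}")
```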
Model quantization with PyTorch, TensorFlow & Larq.
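As one concrete example of the PyTorch side, here is a minimal eager-mode static post-training quantization sketch; the toy model and random calibration loop are placeholders.

```python
# A minimal sketch of eager-mode static PTQ in PyTorch; the toy model and
# random calibration inputs are placeholders.
import torch
import torch.ao.quantization as tq

class SmallNet(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.quant = tq.QuantStub()      # marks the float -> int8 boundary
        self.conv = torch.nn.Conv2d(3, 16, 3)
        self.relu = torch.nn.ReLU()
        self.dequant = tq.DeQuantStub()  # marks the int8 -> float boundary

    def forward(self, x):
        return self.dequant(self.relu(self.conv(self.quant(x))))

model = SmallNet().eval()
model.qconfig = tq.get_default_qconfig("fbgemm")   # x86 backend
prepared = tq.prepare(model)                       # inserts observers
for _ in range(10):                                # calibration pass
    prepared(torch.randn(1, 3, 32, 32))
quantized = tq.convert(prepared)                   # swaps in int8 modules
```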
Generating a TensorRT model from ONNX.
Quantization examples for PTQ & QAT.
Implementation of EPTQ, an Enhanced Post-Training Quantization algorithm for DNN compression.
This sample shows how to convert a TensorFlow model to an OpenVINO IR model and how to quantize the OpenVINO model.
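A minimal sketch of that flow with the OpenVINO 2023+ Python API and NNCF post-training quantization; the SavedModel path and the random calibration data are placeholders.

```python
# A minimal sketch: TensorFlow SavedModel -> OpenVINO IR, then PTQ via NNCF.
# "saved_model/" and the random calibration data are placeholders.
import numpy as np
import openvino as ov
import nncf

ov_model = ov.convert_model("saved_model/")   # TensorFlow SavedModel -> IR

# Stand-in calibration data; real use feeds preprocessed samples.
calib_items = [np.random.rand(1, 224, 224, 3).astype(np.float32)
               for _ in range(100)]
calib = nncf.Dataset(calib_items)             # identity transform by default

quantized = nncf.quantize(ov_model, calib)
ov.save_model(quantized, "model_int8.xml")
```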
Post-training quantization of an NVIDIA NeMo ASR model.
[CAAI AIR'24] Minimize Quantization Output Error with Bias Compensation
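The following is not the paper's exact algorithm; it only illustrates the general bias-compensation idea: fold the expected output error caused by weight quantization into the layer bias, which removes the mean component of that error over the calibration set.

```python
# Not the paper's method; a generic bias-correction sketch that folds the
# expected quantization output error into the bias term.
import torch

def bias_compensation(w, w_q, bias, calib_x):
    """w, w_q: [out, in] float / fake-quantized weights;
    calib_x: [N, in] calibration activations."""
    # Expected output error E[(W - Wq) x] over the calibration set.
    err = (calib_x @ (w - w_q).T).mean(dim=0)
    return bias + err                           # compensated bias

w = torch.randn(16, 64)
scale = w.abs().amax(dim=1, keepdim=True) / 7.0   # 4-bit symmetric scales
w_q = torch.clamp(torch.round(w / scale), -8, 7) * scale
bias = torch.zeros(16)
x = torch.randn(512, 64) + 1.0                    # non-zero-mean activations

b_new = bias_compensation(w, w_q, bias, x)
before = (x @ w.T + bias - (x @ w_q.T + bias)).pow(2).mean()
after = (x @ w.T + bias - (x @ w_q.T + b_new)).pow(2).mean()
print(f"output MSE before: {before:.4f}, after: {after:.4f}")
```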
Improved the performance of 8-bit PTQ4DM, especially on FID.
PyTorch implementation of our ECCV 2022 paper, Fine-grained Data Distribution Alignment for Post-Training Quantization.