🤗 Optimum Intel: Accelerate inference with Intel optimization tools
Updated Jul 11, 2024 - Jupyter Notebook
DINOv1 implementation in PyTorch
A Compressed Stable Diffusion for Efficient Text-to-Image Generation [ECCV'24]
Awesome Knowledge Distillation
Official implementation of ⚡ Flash Diffusion ⚡: Accelerating Any Conditional Diffusion Model for Few Steps Image Generation
A place to evaluate public models
InstructPix2Pix with distilled diffusion models
The Biorefinery Simulation and Techno-Economic Analysis Modules; Life Cycle Assessment; Chemical Process Simulation Under Uncertainty
PaddleSlim is an open-source library for deep model compression and architecture search.
[CVPRW 2021] Rethinking Ensemble-Distillation for Semantic Segmentation Based Unsupervised Domain Adaptation
Using Knowledge Graph to Query Resume
Distibot (DISTIller roBOT) is a Python program for Raspberry Pi (Raspbian) that controls the whole distillation process
Raspberry Pi and Arduino/ESP32-powered smart still controller system. Designed around the Still Spirits T-500 column and boiler, but can easily be added to any other gas or electric still with a dephlegmator.
[CVPR 2024] Asymmetric Masked Distillation for Pre-Training Small Foundation Models
The official repo for [AAAI 2024] "SimDistill: Simulated Multi-modal Distillation for BEV 3D Object Detection"
irresponsible innovation. Try now at https://chat.dev/
Insightface Keras implementation
PyTorch implementation of ACCV'18 paper "Revisiting Distillation and Incremental Classifier Learning."
[ICLR 2022] Code for Graph-less Neural Networks: Teaching Old MLPs New Tricks via Distillation (GLNN)
Official code for our ECCV'22 paper "A Fast Knowledge Distillation Framework for Visual Recognition"
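Many of the knowledge-distillation repositories above build on the classic soft-target objective (Hinton et al.): the student is trained to match the teacher's temperature-softened output distribution. A minimal NumPy sketch of that loss, assuming plain class logits as inputs (function names here are illustrative and not taken from any listed repository):

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces a softer distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between the softened teacher and student distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return float(np.mean(kl) * T * T)
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels, weighted by a mixing coefficient; the loss is zero when student and teacher logits agree and grows as their softened distributions diverge.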