From 59824ac87b1f6aa4334ba44769254efb9d7856b3 Mon Sep 17 00:00:00 2001
From: Guy Jacob
Date: Mon, 11 Feb 2019 14:25:23 +0200
Subject: [PATCH] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 77ed092dd..863ec065f 100755
--- a/README.md
+++ b/README.md
@@ -152,7 +152,7 @@ Beware.
   - Group Lasso an group variance regularization
 * **Quantization**
   - Automatic mechanism to transform existing models to quantized versions, with customizable bit-width configuration for different layers. No need to re-write the model for different quantization methods.
-  - Post-training quantization of trained full-precision models
+  - Post-training quantization of trained full-precision models, dynamic and static (statistics-based)
   - Support for [quantization-aware training](https://nervanasystems.github.io/distiller/quantization/index.html#quantization-aware-training) in the loop
 * **Knowledge distillation**
   - Training with [knowledge distillation](https://nervanasystems.github.io/distiller/knowledge_distillation/index.html), in conjunction with the other available pruning / regularization / quantization methods.
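For readers of the updated bullet: "dynamic" means quantization ranges are computed on the fly at inference time, while "static" uses activation statistics collected beforehand. Below is a minimal sketch of the static flow with distiller.quantization.PostTrainLinearQuantizer; the torchvision model, the stats-file path, and the exact keyword arguments are assumptions based on the Distiller documentation of this period (later releases may also require a dummy input to prepare_model()), not part of this patch.

```python
# Sketch only: static (statistics-based) post-training quantization, assuming
# Distiller's PostTrainLinearQuantizer API circa this commit. Keyword names and
# the stats-file path are illustrative and may differ between releases.
import torchvision
from distiller.quantization import PostTrainLinearQuantizer, LinearQuantMode

# Any trained full-precision PyTorch model; resnet18 is used only as a stand-in.
model = torchvision.models.resnet18(pretrained=True).eval()

quantizer = PostTrainLinearQuantizer(
    model,
    bits_activations=8,                              # activation bit-width
    bits_parameters=8,                               # weight bit-width
    mode=LinearQuantMode.ASYMMETRIC_UNSIGNED,        # asymmetric quantization range
    model_activation_stats='acts_quant_stats.yaml',  # hypothetical path to pre-collected
                                                     # activation stats ("static" mode);
                                                     # omit it to compute ranges
                                                     # dynamically at runtime
)
quantizer.prepare_model()  # swaps supported modules for quantized wrappers in-place
```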