Aussie AI
12-Bit Quantization (INT12)
-
Book Excerpt from "Generative AI in C++"
-
by David Spuler, Ph.D.
Research papers on 12-bit quantization:
- Markus Nagel, Mart van Baalen, Tijmen Blankevoort, Max Welling, 2019, Data-free quantization through weight equalization and bias correction, PDF: https://openaccess.thecvf.com/content_ICCV_2019/papers/Nagel_Data-Free_Quantization_Through_Weight_Equalization_and_Bias_Correction_ICCV_2019_paper.pdf (Evaluates INT5, INT6, INT8, INT10, INT12, and INT16.)
- Xishan Zhang, Shaoli Liu, Rui Zhang, Chang Liu, Di Huang, Shiyi Zhou, Jiaming Guo, Qi Guo, Zidong Du, Tian Zhi, Yunji Chen, 2020, Fixed-Point Back-Propagation Training, CVPR 2020, PDF: https://openaccess.thecvf.com/content_CVPR_2020/papers/Zhang_Fixed-Point_Back-Propagation_Training_CVPR_2020_paper.pdf
See more papers on 12-bit quantization (INT12) at: https://www.aussieai.com/research/quantization#int12
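Since INT12 is not a native C++ integer type, a 12-bit quantization scheme typically stores quantized values in a wider type (such as int16_t) or packs them into bytes. The sketch below shows a minimal symmetric per-tensor INT12 quantizer, clamping to the signed 12-bit range [-2048, 2047]; the function names and the per-tensor max-absolute scaling choice are illustrative assumptions, not code from the book.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <cstdint>
#include <vector>

// Signed 12-bit range: [-2^11, 2^11 - 1].
constexpr int kInt12Max = 2047;
constexpr int kInt12Min = -2048;

// Illustrative per-tensor scale: map the largest-magnitude weight to kInt12Max.
float int12_scale(const std::vector<float>& w) {
    float maxabs = 0.0f;
    for (float x : w) maxabs = std::max(maxabs, std::fabs(x));
    return maxabs > 0.0f ? maxabs / kInt12Max : 1.0f;
}

// Quantize: round-to-nearest, then clamp into the 12-bit range.
// Stored in int16_t since C++ has no 12-bit integer type.
int16_t int12_quantize(float x, float scale) {
    int q = static_cast<int>(std::lroundf(x / scale));
    return static_cast<int16_t>(std::clamp(q, kInt12Min, kInt12Max));
}

// Dequantize: multiply back by the scale.
float int12_dequantize(int16_t q, float scale) {
    return q * scale;
}
```

A production kernel would usually pack two 12-bit values into three bytes to realize the memory saving; storing them unpacked in int16_t, as above, keeps the arithmetic simple at the cost of 16 bits per weight.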