Aussie AI
Book Excerpt from "Generative AI in C++"
by David Spuler, Ph.D.
Negative Skipping
Negative skipping is not the skipping of negative weights. Rather, negative skipping is an attempt to predict which vector dot product computations will produce a negative result, and to skip computing them. In models that use the RELU activation function, any negative result would be reduced to zero by RELU anyway. Hence, negative skipping with RELU is a type of zero skipping.
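To make the idea concrete, here is a minimal C++ sketch of one possible approach, assuming that a strided subsample of the elements gives a cheap estimate of the dot product's sign. The function name, sampling stride, and skip threshold are all illustrative assumptions for this sketch, not details taken from the papers cited below.

    #include <cstddef>

    // Minimal sketch of negative skipping for a RELU-activated dot product.
    // The stride and threshold below are illustrative assumptions, not
    // values from the cited papers.
    float relu_dot_negative_skipping(const float* v1, const float* v2, std::size_t n)
    {
        const std::size_t stride = 4;   // cheap estimate uses 1 in 4 elements (assumption)
        const float threshold = -1.0f;  // "confidently negative" cutoff (assumption)

        // Cheap estimate: partial dot product over a strided subsample,
        // scaled up to approximate the full sum.
        float estimate = 0.0f;
        for (std::size_t i = 0; i < n; i += stride)
            estimate += v1[i] * v2[i];
        estimate *= static_cast<float>(stride);

        // Predicted negative: skip the full computation, since RELU
        // would reduce a negative result to zero anyway.
        if (estimate < threshold)
            return 0.0f;

        // Otherwise, compute the full dot product and apply RELU.
        float sum = 0.0f;
        for (std::size_t i = 0; i < n; ++i)
            sum += v1[i] * v2[i];
        return sum > 0.0f ? sum : 0.0f;  // RELU
    }

Note the trade-off in such a scheme: if the cheap estimate mispredicts the sign, skipping changes the output, so the threshold controls how aggressively accuracy is traded for speed.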
Research papers on negative skipping:
- Duvindu Piyasena, Rukshan Wickramasinghe, Debdeep Paul, Siew Kei Lam, and Meiqing Wu. 2019. Reducing dynamic power in streaming CNN hardware accelerators by exploiting computational redundancies, Proceedings of the 29th International Conference on Field-Programmable Logic and Applications (FPL 2019), 354–359, https://ieeexplore.ieee.org/document/8891989 PDF: https://siewkeilam.github.io/ei-research-group/Paper/2019H-Duvindu-FPL.pdf (This is “negative skipping”, similar to zero skipping, where cheap estimates avoid computations that would be negative, and would thereby be reduced to zero by RELU activation.)
- T. Ujiie, M. Hiromoto, and T. Sato. 2016. Approximated Prediction Strategy for Reducing Power Consumption of Convolutional Neural Network Processor, IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 870–876, https://ieeexplore.ieee.org/document/7789603 PDF: https://openaccess.thecvf.com/content_cvpr_2016_workshops/w14/papers/Ujiie_Approximated_Prediction_Strategy_CVPR_2016_paper.pdf (Does “negative skipping” by quickly approximating the value of a convolution, skipping it entirely if the result is expected to be negative.)
For more research on negative skipping, see also https://www.aussieai.com/research/zero-skipping#negative.