Aussie AI

Difference-Squared Networks

  • Book Excerpt from "Generative AI in C++"
  • by David Spuler, Ph.D.


Squaring the difference between two numbers is well known from Euclidean distance calculations and statistical variance. The same idea has been applied to neural networks as “diff-squared networks”: instead of multiplying a weight by an input, the layer computes the squared difference between them, so the result behaves like a (negated) Euclidean distance rather than a dot product. Some methods cited in other papers as “multiplication-free” model research compute a difference (a subtraction) but then square it, which is technically still a multiplication, but who's counting? There are bit tricks for computing square-root and inverse-square-root, so perhaps someone will find a trick for squaring with bitwise operators. In any case, the method avoids multiplying inputs by weights directly, so it is a distinct approach.
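
As a rough illustration of the general idea (a sketch only, not code from any of the papers below, and with illustrative function names), the inner loop of a diff-squared layer replaces each weight-times-input product with a negated squared difference:

    #include <cstddef>

    // Standard linear combination: sum of weight * input products.
    float dot_product(const float* w, const float* x, std::size_t n)
    {
        float sum = 0.0f;
        for (std::size_t i = 0; i < n; ++i)
            sum += w[i] * x[i];
        return sum;
    }

    // Diff-squared alternative: negated sum of squared differences,
    // so weights that are close to the inputs produce a high score.
    // Note that the square (d * d) is still a multiplication,
    // just not a multiplication of a weight by an input.
    float diff_squared(const float* w, const float* x, std::size_t n)
    {
        float sum = 0.0f;
        for (std::size_t i = 0; i < n; ++i) {
            float d = w[i] - x[i];
            sum -= d * d;
        }
        return sum;
    }

Either function can stand in for the per-neuron accumulation in a linear layer; the diff-squared version simply scores similarity by closeness rather than by a dot product.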

Research papers on diff-squared networks:

  1. Xinlin Li, Mariana Parazeres, Adam Oberman, Alireza Ghaffari, Masoud Asgharian, Vahid Partovi Nia, 2023, EuclidNets: An Alternative Operation for Efficient Inference of Deep Learning Models, SN Computer Science, volume 4, https://link.springer.com/article/10.1007/s42979-023-01921-y (This uses the square of the difference, which is really still multiplication.)
  2. Xinlin Li, Mariana Parazeres, Adam Oberman, Alireza Ghaffari, Masoud Asgharian, Vahid Partovi Nia, 2022, EuclidNets: An Alternative Operation for Efficient Inference of Deep Learning Models, Dec 2022, https://arxiv.org/abs/2212.11803 (uses squares and Euclidean distances as weights)
  3. S. Fan, L. Liu, and Y. Luo. 2021, An alternative practice of tropical convolution to traditional convolutional neural networks, In 2021 The 5th International Conference on Compute and Data Analysis, pages 162–168, 2021, https://arxiv.org/abs/2103.02096 (Tropical arithmetic)
  4. Y. Luo and S. Fan. 2021, Min-max-plus neural networks, arXiv preprint arXiv:2102.06358, 2021, https://arxiv.org/abs/2102.06358 (Tropical arithmetic)
  5. Robert Tibshirani, 1996, Regression Shrinkage and Selection via the Lasso, Journal of the Royal Statistical Society. Series B (Methodological), Vol. 58, No. 1 (1996), pp. 267-288, https://www.jstor.org/stable/2346178 (Low-level mathematical paper from 1996 about the additive-squares method.)

For more research papers on diff-squared neural networks, see https://www.aussieai.com/research/zero-multiplication#squares.

 
