Aussie AI

Binary Elementwise Tensor Operations

  • Book Excerpt from "Generative AI in C++"
  • by David Spuler, Ph.D.


Adding two matrices means simply adding each pair of elements, which only works if the two matrices have the same size and shape. The same idea generalizes to adding two tensors of the same size (i.e., all three dimensions match). Hence, we can apply element-wise binary arithmetic to each pair of elements in two tensors, creating a third tensor of the same size:

  • Addition or subtraction
  • Multiplication or division
  • Maximum or minimum

Note that element-wise multiplication of tensor elements is not “tensor multiplication,” just as matrix multiplication isn't just paired multiplications of the elements of two matrices. Element-wise multiplication of matrices is called the “Hadamard product,” and it's so rarely useful that I don't think I was ever taught it in High School. The Hadamard product is not the core operation in AI inference computations, although I've seen a few research papers where it was proposed as an optimization (probably unsuccessfully). Matrix multiplication, with its row-by-column vector dot products, is more complex, and so is its generalization to tensors.

That's how we get to the “tensor product” of two tensors. It's really just nested loops doing matrix multiplications on slices of each tensor, and matrix multiplications are themselves just nested loops doing vector dot products. Like I said, tensors are just three-dimensional arrays doing multiplication and addition.
