Book Excerpt from "Generative AI in C++"
by David Spuler, Ph.D.
Vector Dot Product
Vector dot product is the most basic algorithm in an AI engine. All of the tensor operations and matrix multiplications break down into instances of a dot product calculation.
The dot product is so-named because its mathematical notation is a dot. It is also known as the “scalar product” because its result is a scalar (single number), rather than a vector.
The vector dot product takes two vectors as input and computes a single floating-point number as output.
The algorithm multiplies the corresponding elements of the two vectors and adds the products together.
Here's the code:
// Basic dot product of vectors v1 and v2, both of length n
float aussie_vecdot_basic(float v1[], float v2[], int n)
{
    float sum = 0.0;
    for (int i = 0; i < n; i++) {
        sum += v1[i] * v2[i];
    }
    return sum;
}
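For example, a matrix-vector multiplication reduces to one dot product per matrix row, which is why this little loop dominates inference cost. Here is a minimal sketch of that decomposition, assuming a row-major matrix layout (the function name matvec_by_dot_products is illustrative, not from the book):

// Sketch: matrix-vector multiplication as one dot product per row.
// Assumes 'matrix' is row-major with 'rows' rows, each of length 'n'.
void matvec_by_dot_products(float matrix[], float v[], float out[], int rows, int n)
{
    for (int r = 0; r < rows; r++) {
        // Output element r is the dot product of row r with the vector v.
        out[r] = aussie_vecdot_basic(&matrix[r * n], v, n);
    }
}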
Properties of the dot product include:
- Two vectors as input.
- Scalar output (single number).
- Can be positive or negative.
- Is zero if either vector is all zeros.
- Can also be zero for two non-zero vectors (e.g. if the vectors are “perpendicular” in 2-D or 3-D space).
- Has a geometric meaning related to the “angle” between the two vectors: the dot product equals the product of their magnitudes and the cosine of the angle between them (see the sketch after this list).
- Is an integer if both vectors contain integers.
- Dot product of a vector with itself is the square of the vector's magnitude (equivalently, the vector's squared L2 norm).
- Is very slooow. Dot product-based operations inside matrices and tensors are the main culprit for AI needing all those GPUs.
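To illustrate the angle and magnitude properties above, here is a minimal sketch of vector magnitude and cosine similarity built on the same dot product routine (the helper names vec_magnitude and vec_cosine are illustrative assumptions, not code from the book):

#include <math.h>

// Magnitude (L2 norm): the dot product of a vector with itself is the
// squared magnitude, so the magnitude is its square root.
float vec_magnitude(float v[], int n)
{
    return sqrtf(aussie_vecdot_basic(v, v, n));
}

// Cosine of the angle between two non-zero vectors:
//   dot(v1, v2) = |v1| * |v2| * cos(angle)
float vec_cosine(float v1[], float v2[], int n)
{
    float denom = vec_magnitude(v1, n) * vec_magnitude(v2, n);
    return aussie_vecdot_basic(v1, v2, n) / denom;  // assumes neither vector is all zeros
}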
The dot product differs from the “vector product” of two vectors (also called the “cross product”), which returns a vector and is a completely different mathematical operation. The vector cross product is interesting mathematically in that it computes a vector perpendicular to both inputs in 3-dimensional space, but it's not very useful for AI inference. The dot product is where the action's at in big tensors.