Aussie AI

Batch Normalization

Book Excerpt from "Generative AI in C++"

by David Spuler, Ph.D.
Batch normalization, or "BatchNorm," generalizes simple normalization by adding two extra learned parameters: a "gain" factor and a "bias" factor. The gain is a multiplicative scaling factor, often called "lambda" or "alpha" (the original BatchNorm paper and many frameworks call it "gamma"). The bias is an additive offset, usually called "beta."
There is also a third parameter, usually called "epsilon," a constant of small magnitude whose purpose is practical rather than mathematical. It is added to the variance before taking the square root, mainly to avoid pathological cases such as division by zero, or the floating-point overflow that can result from dividing by a tiny denominator.