Aussie AI
SiLU/Sigmoid Activation Function
Book Excerpt from "Generative AI in C++"
by David Spuler, Ph.D.
SiLU/Sigmoid Activation Function
The SiLU activation function computes the product of x and the sigmoid of x. It is equivalent to the Swish activation function with its beta parameter set to 1. Here is a basic SiLU activation function in C++:
#include <cmath>  // expf

float aussie_SiLU_basic(float x)   // Basic SiLU (inefficient)
{
    // Sigmoid(x) = 1 / (1 + e^(-x))
    // SiLU(x) = x * Sigmoid(x)
    //         = x * 1.0 / (1.0 + expf(-x))
    return x / (1.0f + expf(-x));
}
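For comparison, Swish generalizes SiLU by scaling the sigmoid's input with a beta parameter; setting beta to 1 gives back SiLU. This is a minimal sketch of that generalization (the function name aussie_swish_beta is illustrative, not from the book):

#include <cmath>  // expf

// Swish(x, beta) = x * sigmoid(beta * x); beta = 1.0 reduces to SiLU
float aussie_swish_beta(float x, float beta)
{
    return x / (1.0f + expf(-beta * x));
}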
This basic SiLU implementation is slow because it calls expf for every activation. Its speed can be improved with precomputed lookup tables or mathematical approximations, as sketched below.
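As an illustration of the table-lookup idea, here is a sketch that precomputes SiLU over a fixed input range and answers queries by nearest-index lookup. The names, range, and table size are assumptions for illustration, not the book's implementation:

#include <cmath>  // expf

#define SILU_TABLE_SIZE 4096
#define SILU_RANGE_MIN  (-8.0f)
#define SILU_RANGE_MAX  (8.0f)

static float g_silu_table[SILU_TABLE_SIZE];

// Precompute SiLU over [SILU_RANGE_MIN, SILU_RANGE_MAX] once at startup
void aussie_silu_table_init()
{
    for (int i = 0; i < SILU_TABLE_SIZE; i++) {
        float x = SILU_RANGE_MIN
            + (SILU_RANGE_MAX - SILU_RANGE_MIN) * i / (SILU_TABLE_SIZE - 1);
        g_silu_table[i] = x / (1.0f + expf(-x));
    }
}

// Approximate SiLU by nearest-index lookup (clamps outside the table range)
float aussie_silu_table_lookup(float x)
{
    if (x <= SILU_RANGE_MIN) return g_silu_table[0];
    if (x >= SILU_RANGE_MAX) return g_silu_table[SILU_TABLE_SIZE - 1];
    int i = (int)((x - SILU_RANGE_MIN)
        * (SILU_TABLE_SIZE - 1) / (SILU_RANGE_MAX - SILU_RANGE_MIN) + 0.5f);
    return g_silu_table[i];
}

The accuracy of this sketch depends on the table size and the chosen input range; a wider range or finer spacing trades memory for precision, and inputs outside the range are simply clamped.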