Book Excerpt from "Generative AI in C++"
by David Spuler, Ph.D.
ELU Activation Function
The ELU (Exponential Linear Unit) activation function was introduced by Clevert, Unterthiner & Hochreiter (2016). ELU is related to the RELU function, but unlike RELU, its output can go negative for input values below zero. It also has an extra hyperparameter, alpha, which yields a family of ELU variants: for negative inputs the output decays exponentially toward -alpha, so alpha controls how far the output can go negative. Here's some example C++ code of a very basic ELU implementation according to the paper:
#include <math.h>  // for expf

float aussie_ELU_basic(float x, float alpha_hyperparam)  // Basic ELU activation (inefficient)
{
    // ELU = x                        if x > 0.0
    //     = alpha * (exp(x) - 1)     if x <= 0.0
    if (x <= 0.0f) return alpha_hyperparam * (expf(x) - 1.0f);
    return x;  // x if x > 0.0
}
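As a usage illustration (not from the book's text), here is a minimal sketch that applies this basic ELU function element-wise to a small vector, using alpha = 1.0, the default value suggested in the original paper. The vector helper function and its name are hypothetical, added only for this example:

#include <stdio.h>

// Hypothetical helper: apply ELU to each element of a vector in-place
void aussie_vector_ELU_basic(float v[], int n, float alpha_hyperparam)
{
    for (int i = 0; i < n; i++) {
        v[i] = aussie_ELU_basic(v[i], alpha_hyperparam);  // uses the function above
    }
}

int main()
{
    float vec[5] = { -2.0f, -0.5f, 0.0f, 0.5f, 2.0f };
    aussie_vector_ELU_basic(vec, 5, 1.0f);  // alpha = 1.0 (paper's default)
    for (int i = 0; i < 5; i++) {
        // Negative inputs map to alpha*(exp(x)-1); positive inputs pass through unchanged
        printf("%f\n", vec[i]);
    }
    return 0;
}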