Aussie AI

ELU Activation Function

  • Book Excerpt from "Generative AI in C++"
  • by David Spuler, Ph.D.


The ELU activation function was introduced by Clevert, Unterthiner & Hochreiter (2016). ELU is closely related to the RELU function, but ELU's output can go negative for input values below zero. It also has an extra hyperparameter, alpha, which gives a family of ELU variants: alpha sets the negative value that the function saturates towards as the input becomes very negative. Here's some example C++ code of a very basic ELU implementation, following the paper:

    float aussie_ELU_basic(float x, float alpha_hyperparam)
    {
        // Basic ELU activation (inefficient)
        // ELU = x  if x > 0.0
        //     = alpha * (exp(x) - 1) if x <= 0.0
        if (x <= 0.0f)
            return alpha_hyperparam * (expf(x) - 1.0f);
        return x;  // x if x > 0.0
    }
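
For context, here is a minimal usage sketch that applies ELU element-wise across a small vector of activations, which is how an activation function is typically used inside a layer. The aussie_ELU_vector helper, the test values, and the choice of alpha = 1.0 are illustrative assumptions, not from the paper:

    #include <cstdio>   // printf
    // Assumes aussie_ELU_basic() from the listing above is in the same file.

    void aussie_ELU_vector(float v[], int n, float alpha_hyperparam)
    {
        // Apply ELU element-wise to a vector of activations
        for (int i = 0; i < n; i++) {
            v[i] = aussie_ELU_basic(v[i], alpha_hyperparam);
        }
    }

    int main()
    {
        float vec[] = { -2.0f, -0.5f, 0.0f, 0.5f, 2.0f };
        const int n = sizeof(vec) / sizeof(vec[0]);
        aussie_ELU_vector(vec, n, 1.0f);  // alpha = 1.0 is a common default
        for (int i = 0; i < n; i++) {
            printf("%f\n", vec[i]);  // negative inputs map into (-alpha, 0)
        }
        return 0;
    }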

 
