Aussie AI
Types of Unstructured Pruning
-
Book Excerpt from "Generative AI in C++"
-
by David Spuler, Ph.D.
There are many variations of unstructured pruning algorithms. As we've seen above, magnitude pruning is a first-order method because it considers only the current value of each weight, whereas "movement pruning" is a second-order method because it considers the rate of change of weights during training. Also possible are third-order methods based on the second derivative, which can converge faster but are computationally expensive.
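As a minimal sketch of the first-order approach, magnitude pruning simply zeroes any weight whose absolute value falls below a threshold (the function name and signature here are illustrative, not from a particular library):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Sketch of first-order magnitude pruning: zero out any weight whose
// absolute value falls below a fixed threshold.
int magnitude_prune(std::vector<float>& weights, float threshold) {
    int pruned = 0;
    for (float& w : weights) {
        if (std::fabs(w) < threshold) {
            w = 0.0f;  // prune this weight
            ++pruned;
        }
    }
    return pruned;  // number of weights zeroed
}
```

A second-order (movement) method would instead score each weight by how it is changing during training, pruning weights that are moving toward zero, which requires tracking gradient information rather than just the stored value.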
All of these unstructured pruning methods have meta-parameters: threshold values that alter the pruning characteristics. Magnitude pruning could also use different thresholds for positive and negative values. The magnitude pruning threshold is often expressed as a target sparsity level rather than an absolute magnitude cutoff. These meta-parameters may also change over time: a startup phase of one or two epochs with little or no pruning ("stabilization" or "warm-up"), then a low sparsity target that increases as training progresses. This is called Gradual Magnitude Pruning (GMP).
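Converting a target sparsity level into a magnitude threshold can be done with a partial sort: find the magnitude below which the desired fraction of weights falls. This sketch (an illustrative helper, not the book's code) uses std::nth_element for that:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Derive a magnitude threshold from a target sparsity level,
// e.g. sparsity 0.5 means prune the smallest-magnitude 50% of weights.
float threshold_for_sparsity(const std::vector<float>& weights, double sparsity) {
    std::vector<float> mags;
    mags.reserve(weights.size());
    for (float w : weights) mags.push_back(std::fabs(w));
    std::size_t k = static_cast<std::size_t>(sparsity * mags.size());
    if (k == 0) return 0.0f;              // nothing to prune
    if (k >= mags.size()) k = mags.size() - 1;
    std::nth_element(mags.begin(), mags.begin() + k, mags.end());
    return mags[k];  // weights with |w| below this value get pruned
}
```

Weights with magnitude strictly below the returned value can then be zeroed, giving approximately the requested sparsity.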
How often to prune is another meta-parameter of these algorithms. Theoretically, we could prune at every weight update, but that is inefficient and risks quickly pruning the network down to nothing. In practice, magnitude pruning is typically done gradually with GMP, pruning perhaps once per epoch, rather than as one-shot magnitude pruning. Pruning only once ever, at the end of the training phase, is not really training-based pruning, but is effectively post-training pruning.
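A GMP schedule can be expressed as a function from the current pruning step to a sparsity target: hold at the initial sparsity during warm-up, then ramp up to the final level. The cubic ramp below is one common choice of schedule shape; it is an assumption here, not the book's specific formula:

```cpp
#include <cassert>
#include <cmath>

// Sketch of a gradual sparsity schedule: ramp from an initial to a final
// sparsity over a number of pruning steps, after a warm-up phase.
// The cubic ramp (fast early, slow late) is one common choice.
double gmp_sparsity(double s_init, double s_final,
                    int step, int warmup_steps, int total_steps) {
    if (step <= warmup_steps) return s_init;   // stabilization phase
    if (step >= total_steps) return s_final;   // schedule finished
    double t = double(step - warmup_steps) / double(total_steps - warmup_steps);
    double ramp = 1.0 - std::pow(1.0 - t, 3.0);
    return s_init + (s_final - s_init) * ramp;
}
```

Each epoch (or whatever pruning interval is chosen), the trainer would call this to get the current sparsity target, derive the matching magnitude threshold, and prune.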