Aussie AI

Dynamic NAS

  • Book Excerpt from "Generative AI in C++"
  • by David Spuler, Ph.D.

Dynamic NAS is the use of NAS-like approaches to find optimal hyperparameters for the various dynamic inference options. Every type of adaptive inference or dynamic pruning has meta-parameters, such as numeric threshold values or a choice between multiple decision metrics. Deciding on the best option from all of that shemozzle is the idea of Dynamic NAS research.

Dynamic NAS is not yet a mainstream use of NAS search, but research papers are starting to appear on this extension. NAS has traditionally been applied to finding optimal hyperparameters for static model architectures, without regard to dynamic approaches. This emerging area of research aims to include the hyperparameters of dynamic inference optimizations as part of searching the problem space for an optimal model.
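As an illustrative sketch only, the simplest form of Dynamic NAS is a grid search over a single dynamic-inference meta-parameter, such as an early-exit confidence threshold. The accuracy and latency models below are entirely hypothetical stand-ins for real profiling runs, and the scoring function, names, and weighting parameter are assumptions for illustration, not an established algorithm.

```cpp
// Hedged sketch: grid search over an early-exit confidence threshold,
// one example of a dynamic-inference meta-parameter that Dynamic NAS
// could tune. The cost model is hypothetical, not measured data.
#include <cassert>
#include <cstdio>

struct Candidate {
    double threshold;   // early-exit confidence threshold in [0,1]
    double accuracy;    // simulated accuracy at this threshold
    double latency;     // simulated relative latency at this threshold
};

// Hypothetical models: a higher threshold means layers exit later,
// so accuracy rises but so does latency (both linear for simplicity).
static double simulated_accuracy(double t) { return 0.80 + 0.15 * t; }
static double simulated_latency(double t)  { return 1.0 + 4.0 * t; }

// Score trades accuracy against latency; lambda weights the latency penalty.
static double score(const Candidate& c, double lambda) {
    return c.accuracy - lambda * c.latency;
}

// Exhaustive grid search over thresholds 0.0, 0.1, ..., 1.0,
// returning the candidate with the best accuracy/latency trade-off.
Candidate search_best_threshold(double lambda) {
    Candidate best{0.0, 0.0, 0.0};
    double best_score = -1e9;
    for (double t = 0.0; t <= 1.0001; t += 0.1) {
        Candidate c{t, simulated_accuracy(t), simulated_latency(t)};
        double s = score(c, lambda);
        if (s > best_score) { best_score = s; best = c; }
    }
    return best;
}
```

With a small latency penalty the search favors a high threshold (accuracy wins), and with a large penalty it favors a low threshold (speed wins); a real Dynamic NAS would replace the simulated models with profiled accuracy and latency measurements, and could search several such meta-parameters jointly.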

Research papers on dynamic NAS:

  1. Matteo Gambella, Manuel Roveri, 2023, EDANAS: Adaptive Neural Architecture Search for Early Exit Neural Networks, 2023 International Joint Conference on Neural Networks (IJCNN), pp. 1-8, https://ieeexplore.ieee.org/document/10191876 (NAS applied to early-exit dynamic inference.)
  2. Chakkrit Termritthikun, Yeshi Jamtsho, Jirarat Ieamsaard, Paisarn Muneesawang, Ivan Lee, 2021, EEEA-Net: An Early Exit Evolutionary Neural Architecture Search, Engineering Applications of Artificial Intelligence Volume 104, September 2021, 104397, https://www.sciencedirect.com/science/article/abs/pii/S0952197621002451, https://arxiv.org/abs/2108.06156, Code: https://github.com/chakkritte/EEEA-Net (A 2021 paper on NAS applied to early-exit.)
  3. KT Chitty-Venkata, Y Bian, M Emani, V Vishwanath, Jan 2023, Differentiable Neural Architecture, Mixed Precision and Accelerator Co-search, IEEE Access, DOI: 10.1109/ACCESS.2023.3320133, PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10266308
  4. Linnan Wang, Chenhan Yu, Satish Salian, Slawomir Kierat, Szymon Migacz, Alex Fit Florea, 2022, GPUNet: Searching the Deployable Convolution Neural Networks for GPUs, https://arxiv.org/abs/2205.00841 (A general NAS system that could be applied statically or dynamically.)

For more research on dynamic NAS, see also https://www.aussieai.com/research/nas#dynamic.

Generative AI in C++: the new AI programming book by Aussie AI co-founders:
  • AI coding in C++
  • Transformer engine speedups
  • LLM models
  • Phone and desktop AI
  • Code examples
  • Research citations

Get your copy from Amazon: Generative AI in C++