AI PCs
Book Excerpt from "Generative AI in C++"
by David Spuler, Ph.D.
The main research area in relation to “AI PCs” is the optimization of inference algorithms, so that models can run fast enough. This includes executing AI inference on CPU-only PCs and on PCs with only low-end GPUs. Training on PCs is a lower priority, because it can always be done offline in the cloud.
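For example, the innermost loop of Transformer inference is dominated by vector dot products inside matrix-vector multiplications, and speeding these up is one of the basic optimizations for CPU-only PCs. Below is a minimal sketch assuming an x86-64 CPU with AVX2 support; it is an illustrative example rather than a production kernel, and the function names are hypothetical.

// Illustrative sketch only: a vector dot product, the innermost kernel of
// matrix-vector multiplication in Transformer inference, written two ways:
// a plain C++ loop and an AVX2 SIMD version for CPU-only PCs.
// Assumes an x86-64 CPU with AVX2; function names are hypothetical.
#include <immintrin.h>
#include <cstdio>

float dot_product_basic(const float* a, const float* b, int n) {
    float sum = 0.0f;
    for (int i = 0; i < n; ++i) sum += a[i] * b[i];
    return sum;
}

float dot_product_avx2(const float* a, const float* b, int n) {
    __m256 acc = _mm256_setzero_ps();       // 8 partial sums in one register
    int i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256 va = _mm256_loadu_ps(a + i);
        __m256 vb = _mm256_loadu_ps(b + i);
        acc = _mm256_add_ps(acc, _mm256_mul_ps(va, vb));  // acc += a * b
    }
    float partial[8];
    _mm256_storeu_ps(partial, acc);         // horizontal sum of the 8 lanes
    float sum = partial[0] + partial[1] + partial[2] + partial[3]
              + partial[4] + partial[5] + partial[6] + partial[7];
    for (; i < n; ++i) sum += a[i] * b[i];  // leftover elements
    return sum;
}

int main() {
    float a[16], b[16];
    for (int i = 0; i < 16; ++i) { a[i] = 1.0f; b[i] = 2.0f; }
    printf("basic=%f avx2=%f\n",
           dot_product_basic(a, b, 16), dot_product_avx2(a, b, 16));
    return 0;
}

Compile with a flag such as -mavx2 on GCC or Clang. Real engines combine SIMD like this with multithreading, cache blocking, and quantized arithmetic.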
A desktop PC or laptop is more capable than a phone, so many of the problems with running AI inference on phones are less severe on a PC. Most obviously, a PC can have a decent GPU, which can then be used by AI engines (assuming you turn off your Minecraft server). Concerns about CPU usage, overheating, and battery depletion are also less pressing on a PC than on a phone.
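As a rough illustration of how an engine can take advantage of such a GPU, the sketch below checks at startup whether a CUDA-capable GPU is present and falls back to CPU inference otherwise. It assumes the NVIDIA CUDA runtime is installed (the calls used are cudaGetDeviceCount, cudaGetDeviceProperties, and cudaSetDevice); the surrounding function and dispatch logic are hypothetical.

// Illustrative sketch only: how an AI engine might check at startup whether
// the PC has a usable NVIDIA GPU, falling back to CPU inference otherwise.
// Assumes the CUDA runtime is installed; compile with nvcc or link cudart.
#include <cuda_runtime.h>
#include <cstdio>

bool pick_gpu_or_cpu() {
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess || count == 0) {
        printf("No CUDA GPU found: running inference on CPU\n");
        return false;   // caller should use the CPU code path
    }
    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);      // inspect the first GPU
    printf("Using GPU 0: %s, %.1f GB VRAM\n",
           prop.name, prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
    cudaSetDevice(0);                       // select it for later kernels
    return true;
}

int main() {
    bool use_gpu = pick_gpu_or_cpu();
    // ... dispatch to the GPU or CPU inference path based on use_gpu ...
    return use_gpu ? 0 : 1;
}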
However, execution speed on a PC is still rather sluggish for large models, even on multi-thousand-dollar PCs with powerful GPUs, so there is much research still to be done on inference optimization. Large models are where the action is in terms of AI functionality, so software developers may well keep using cloud-server AI for some time to come. Training and fine-tuning workloads seem even less likely to move down onto desktop PCs. Nevertheless, “AI PCs” are already becoming available to everyday users and developers alike.
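One of the main reasons large models are sluggish on PC hardware is sheer memory size, and quantization is one of the most common inference optimizations for shrinking them. The sketch below shows a generic 8-bit weight quantization with a single per-tensor scale factor; it is illustrative only, not the specific method used by any particular engine.

// Illustrative sketch only: INT8 weight quantization, one common technique
// for shrinking large models so they fit and run faster on PC hardware.
#include <cstdio>
#include <cstdint>
#include <cmath>
#include <algorithm>
#include <vector>

// Quantize FP32 weights to INT8 using a single per-tensor scale factor.
void quantize_int8(const std::vector<float>& w, std::vector<int8_t>& q, float& scale) {
    float maxabs = 0.0f;
    for (float x : w) maxabs = std::max(maxabs, std::fabs(x));
    scale = (maxabs > 0.0f) ? maxabs / 127.0f : 1.0f;   // map [-max, max] to [-127, 127]
    q.resize(w.size());
    for (size_t i = 0; i < w.size(); ++i)
        q[i] = (int8_t)std::lround(w[i] / scale);
}

// Dequantize back to FP32 (in practice the INT8 values feed integer
// matrix kernels directly; this just shows the round trip).
float dequantize(int8_t qv, float scale) { return qv * scale; }

int main() {
    std::vector<float> weights = { 0.12f, -0.5f, 0.33f, -0.07f };
    std::vector<int8_t> q;
    float scale = 0.0f;
    quantize_int8(weights, q, scale);
    for (size_t i = 0; i < weights.size(); ++i)
        printf("w=%+.3f  q=%+4d  back=%+.3f\n",
               weights[i], (int)q[i], dequantize(q[i], scale));
    return 0;
}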