Aussie AI

The Market for AI

  • Book Excerpt from "Generative AI in C++"
  • by David Spuler, Ph.D.

Here are some forward-looking thoughts on what the market for AI may look like. It seems likely that C++ programmers will be needed for a while yet.

It's a marathon, not a sprint. Consumers may continue to adopt genAI quickly, but businesses are unlikely to follow at the same pace. Although genAI is a hot topic in boardrooms, most businesses are still finding their feet in the area, with only exploratory projects launching. Small businesses and professionals (e.g., doctors' offices) will take years to adopt genAI, and larger enterprises will take even longer. There will be some early projects, sure, but the bulk of the B2B AI market will evolve more slowly. Projections for the B2B side of AI span many years, even decades, with a high CAGR. We've already seen this pattern in the business adoption of cloud architectures, which is still ongoing despite having been underway since the early 2000s. The B2B AI market is likely to sustain very strong growth through 2030, and probably into the 2040s and beyond.

B2B market opportunity trumps B2C. The massive ramp-up of consumer engagement with ChatGPT has made the consumer side seem hot. However, it's actually more likely to be the business side that makes the most money (as usual). Predictions of billions, maybe trillions, of dollars of benefit to economies from full AI integration into businesses dwarf the predictions for consumer opportunities.

Training is the big B2B market? Early wisdom was that the high cost of training and fine-tuning would far exceed inference costs. This contention is now in dispute, with some pundits saying that the sheer number of users will push inference costs ahead of training costs. Another factor is the trend toward using someone else's pre-trained LLM, whether it's GPT via the OpenAI API or the open-source Llama models. Hence, there's definitely more inference than training in B2C projects, and inference may also be taking over on the B2B side.
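
To see why user volume can tip the balance, here is a back-of-envelope break-even calculation in C++. Every dollar figure and query count below is a hypothetical placeholder, not real cost data; the point is only the shape of the arithmetic: training is a one-off cost, whereas inference cost scales linearly with the number of queries.

    // Back-of-envelope: when does cumulative inference cost exceed training cost?
    // All dollar figures are hypothetical placeholders for illustration only.
    #include <iostream>

    int main() {
        double training_cost = 5'000'000.0;  // one-off training/fine-tuning cost ($)
        double cost_per_query = 0.002;       // marginal inference cost per query ($)

        // Inference overtakes training after this many queries.
        double breakeven_queries = training_cost / cost_per_query;  // 2.5 billion
        std::cout << "Break-even queries: " << breakeven_queries << "\n";

        // At consumer scale, that total arrives quickly.
        double queries_per_day = 50'000'000.0;  // e.g. 10M users x 5 queries/day
        std::cout << "Days to break even: "
                  << breakeven_queries / queries_per_day << "\n";  // 50 days
        return 0;
    }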

Fine-Tuning vs RAG. Most business AI projects will involve enhancing the model with proprietary data that the business owns. For example, a support chatbot has to learn information about the company's products, or an internal HR chatbot needs to use internal policy documents. There are two main ways to do this: fine-tuning or Retrieval-Augmented Generation (RAG). Current training and fine-tuning methods take a long time, need a lot of GPUs, and cost a great deal. RAG avoids the cost of fine-tuning by leaving the model's weights untouched: it retrieves the relevant documents at query time and inserts them into the prompt, which is why it is becoming so widely used.
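
To make the retrieval step concrete, here is a minimal C++ sketch of the core of a RAG pipeline: rank documents by cosine similarity of their embeddings, then paste the best matches into the prompt. The Doc struct, the toy 3-dimensional embeddings, and the hard-coded query are illustrative assumptions; a real deployment would compute embeddings with an embedding model, store them in a vector database, and send the augmented prompt to the LLM.

    // Minimal RAG retrieval sketch: rank documents by embedding similarity,
    // then build an augmented prompt. Embeddings here are toy 3-dimensional
    // vectors; a real system would compute them with an embedding model.
    #include <algorithm>
    #include <cmath>
    #include <iostream>
    #include <string>
    #include <vector>

    struct Doc {
        std::string text;
        std::vector<float> embedding;  // precomputed document embedding
    };

    // Cosine similarity between two equal-length vectors.
    float cosine(const std::vector<float>& a, const std::vector<float>& b) {
        float dot = 0.0f, na = 0.0f, nb = 0.0f;
        for (size_t i = 0; i < a.size(); ++i) {
            dot += a[i] * b[i];
            na += a[i] * a[i];
            nb += b[i] * b[i];
        }
        return dot / (std::sqrt(na) * std::sqrt(nb) + 1e-8f);
    }

    // Return the k documents most similar to the query embedding.
    std::vector<Doc> retrieve(std::vector<Doc> corpus,
                              const std::vector<float>& query, size_t k) {
        std::sort(corpus.begin(), corpus.end(),
                  [&](const Doc& x, const Doc& y) {
                      return cosine(x.embedding, query) > cosine(y.embedding, query);
                  });
        if (corpus.size() > k) corpus.resize(k);
        return corpus;
    }

    int main() {
        std::vector<Doc> corpus = {
            {"Refunds are processed within 14 days.", {0.9f, 0.1f, 0.0f}},
            {"Our office closes on public holidays.", {0.1f, 0.8f, 0.1f}},
        };
        std::vector<float> query_embedding = {0.8f, 0.2f, 0.0f};

        // Augmented prompt: retrieved context plus the user's question.
        std::string prompt = "Context:\n";
        for (const Doc& d : retrieve(corpus, query_embedding, 1))
            prompt += d.text + "\n";
        prompt += "Question: What is the refund policy?\n";

        std::cout << prompt;  // This prompt would then be sent to the LLM.
        return 0;
    }

The key point for the cost argument is that nothing here touches the model's weights, so the proprietary data can change daily without any GPU-hours spent on retraining.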

Inference vs Training in the B2C market. Even the B2C genAI bots need continual training and fine-tuning to keep up with current events, so there will also be significant training costs (or RAG costs) in the B2C market. However, with millions of users for B2C apps, the cost of inference should overshadow training costs in the long run.

 

