Context and Conversations
Book Excerpt from "Generative AI in C++" by David Spuler, Ph.D.
If you're creating a chatbot, you build a UI that accepts the user's input, sends it off to the AI engine via the network, and then displays the answer back to the user. The user and the engine go back and forth with a stream of requests and responses, thereby creating a conversation.
Oh, really?
What's missing is the “context” of every request that's part of the conversation. You cannot just send the user's latest message off to the engine on its own, because:
AI engines are stateless.
Hence, the default AI engine doesn't remember what else it's already said. Maybe it's because the GPUs have stolen all their RAM.
Instead, it's up to you as the programmer to store and re-submit the entire conversational history with every request. This is a wonderful situation when you're paying per input token.
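As a rough illustration, here is a minimal C++ sketch of that bookkeeping. The Turn and Conversation types, and the flattened role-prefixed prompt format, are illustrative assumptions rather than any particular vendor's API:

    // Minimal sketch of client-side context management: the whole
    // conversation history is stored locally and re-sent every time.
    #include <string>
    #include <vector>

    struct Turn {
        std::string role;   // "user" or "assistant"
        std::string text;
    };

    class Conversation {
    public:
        // Build the next request: every prior turn plus the new user message.
        std::string build_request(const std::string& user_message) {
            m_history.push_back({"user", user_message});
            std::string full_prompt;
            for (const Turn& t : m_history) {
                full_prompt += t.role + ": " + t.text + "\n";
            }
            return full_prompt;   // The whole history goes out (and is billed) each time.
        }

        // Remember what the engine said, so it appears in the next request too.
        void record_reply(const std::string& reply) {
            m_history.push_back({"assistant", reply});
        }

    private:
        std::vector<Turn> m_history;
    };

On each round trip you would call build_request() with the newest user message, send the returned string to the engine, and then pass the engine's answer to record_reply() so that it is included in the next request.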
It seems like the API vendors could handle context management for you, but I'm not aware of any that do yet. The OpenAI API provides helpful ways to structure the historical context in a request, but doesn't yet store it for you.
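For example, the OpenAI chat endpoints accept the history as a "messages" array of role/content pairs that the caller assembles on every request. Below is a hedged sketch of building such a request body in C++, assuming the nlohmann/json library is available; the model name and system prompt are placeholders:

    #include <nlohmann/json.hpp>
    #include <string>
    #include <utility>
    #include <vector>

    using json = nlohmann::json;

    // Assemble an OpenAI-style chat request body: the caller supplies the
    // full history every time, because the server does not store it between calls.
    json build_chat_request(const std::vector<std::pair<std::string, std::string>>& history,
                            const std::string& new_user_message) {
        json messages = json::array();
        messages.push_back({{"role", "system"},
                            {"content", "You are a helpful assistant."}});
        for (const auto& [role, content] : history) {   // prior user/assistant turns
            messages.push_back({{"role", role}, {"content", content}});
        }
        messages.push_back({{"role", "user"}, {"content", new_user_message}});

        return json{
            {"model", "gpt-4"},        // placeholder model name
            {"messages", messages}
        };
    }

Calling dump() on the returned object gives the JSON string to send as the HTTP request body.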