Aussie AI

Our Technology

Consumer AI Apps Platform

Aussie AI develops a consumer applications platform built on advanced generative AI technology, with the goal of creating an application layer that permits rapid development of many types of user applications. This allows us to offer a selection of consumer AI apps to our users. Our strategy is informed by our view that humans are at the top of the AI stack.

How Does It Work?

Who knows? We sure don't. Nobody in the AI industry really understands how ChatGPT comes up with its answers. And we're not even joking.

But we can still use it! Our grand plans to raise a few billion dollars in funding have so far failed, so we haven't trained our own frontier model. Instead, we rely on others to build the Large Language Models (LLMs), and are building our own "layer" above this.

The Aussie AI consumer app platform is a set of features in a layer above those amazing LLMs. The main features of our consumer AI platform include the following:

Commercial and open-source models. We use a variety of LLMs, some of which are hosted on our own servers, and some of which are accessed remotely by API, such as the OpenAI API to access the ChatGPT-related models. Several LLMs are available as free "open source" releases, such as those from Meta, and we use these models via local hosting and third-party commercial hosting providers.
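One way to picture this mix of backends is a simple routing layer that decides whether a request goes to a self-hosted open-source model or a remote commercial API. This is only an illustrative sketch: the class names, the `generate` method, and the routing rule are all assumptions, not a real SDK.

```python
# Illustrative sketch: route a request to a local open-source model or a
# remote commercial API. All names (LocalModel, RemoteModel, pick_model)
# are hypothetical, not a real library.

class LocalModel:
    """Stand-in for a self-hosted open-source LLM (e.g. a Llama-family model)."""
    def generate(self, prompt: str) -> str:
        return f"[local] {prompt}"   # stubbed response

class RemoteModel:
    """Stand-in for a commercial API such as the OpenAI API."""
    def generate(self, prompt: str) -> str:
        return f"[remote] {prompt}"  # stubbed response

def pick_model(sensitive: bool):
    # One plausible routing rule: keep sensitive data on our own servers,
    # and send everything else out to a commercial API.
    return LocalModel() if sensitive else RemoteModel()

print(pick_model(True).generate("hello"))   # → [local] hello
print(pick_model(False).generate("hello"))  # → [remote] hello
```

In a real platform the routing rule could also consider cost, latency, or which model is best at the task.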

Wrapper architectures. Some of our simplest apps take the user's query input, wrap it in a prompt template, and send it off to an LLM. This type of architecture is wonderfully easy to build, makes for great demos to show venture capitalists, but is a little harder to do at scale. It also isn't really a great long-term plan if that's all you're going to do with AI.
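The whole wrapper pattern fits in a few lines. In this sketch, `call_llm` is a hypothetical stand-in for a real API call (e.g. an OpenAI chat completion over the network), and the template text is invented for illustration.

```python
# Minimal "wrapper architecture" sketch: wrap the user's query in a prompt
# template and hand it to an LLM. call_llm() is a hypothetical stub, not
# a real API client.

PROMPT_TEMPLATE = (
    "You are a helpful consumer assistant.\n"
    "Answer briefly and politely.\n\n"
    "User question: {query}\n"
)

def call_llm(prompt: str) -> str:
    # A real app would make a network request here; we just echo the
    # last line of the prompt so the example runs standalone.
    return f"(answer to: {prompt.splitlines()[-1]})"

def answer(query: str) -> str:
    prompt = PROMPT_TEMPLATE.format(query=query)
    return call_llm(prompt)

print(answer("hello"))  # → (answer to: User question: hello)
```

At scale you'd add rate limiting, retries, moderation, and logging around that one `call_llm` line, which is where the "harder to do at scale" part lives.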

Inference optimization. We stay up-to-date on the latest inference optimization research, which is a fancy way of saying how to run AIs fast. This allows us to optimize our own hosted engines, and to use efficiency features in third-party APIs, such as cached tokens and batched AI requests.

Heuristics. Did you know that software used to work before AI? And it still does. I'd rather do my tax return with TurboTax than ChatGPT. Here's the other thing about old-school non-LLM software: it runs like 100 times faster. Hence, one of the ways we keep our consumer AI apps cost-effective is that sometimes we don't use AI (shock, horror). These are called "heuristics" or "rule-based engines" or lots of other weird names.
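The idea is a fast path of old-fashioned rules, with the LLM only as a fallback. The specific rules below (arithmetic matching, canned greetings) are invented for illustration, and `call_llm` is a hypothetical stub.

```python
# Sketch of a heuristic fast path: answer trivially pattern-matched
# queries with plain rules, and only fall back to the LLM for the rest.
# call_llm() is a hypothetical stub; the rules are illustrative only.

import re

def call_llm(query: str) -> str:
    return f"LLM answer to: {query}"  # stand-in for a real API call

def answer(query: str) -> str:
    # Heuristic 1: simple addition, e.g. "what is 2 + 3"
    m = re.search(r"(\d+)\s*\+\s*(\d+)", query)
    if m:
        return str(int(m.group(1)) + int(m.group(2)))
    # Heuristic 2: canned greeting
    if query.strip().lower() in ("hi", "hello"):
        return "Hello! How can I help?"
    # Otherwise: pay for the LLM.
    return call_llm(query)

print(answer("what is 2 + 3"))  # → 5
```

The rule-based paths return in microseconds and cost nothing, which is exactly the "100 times faster" point above.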

Multi-step reasoning. The latest thing that hasn't been of any interest to ordinary people, but is fascinating to AI engineers, is that LLMs are smarter if they think longer. I mean, imagine it being a good idea to slow down and actually think about what you're writing? Who knew? But it's the big trend in the AI industry and it's called multi-step reasoning.
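At the application layer, "thinking longer" can be as simple as making several LLM calls instead of one: plan, reason, then answer. The three-step structure below is one common pattern, not our exact pipeline, and `call_llm` is a hypothetical deterministic stub so the sketch runs standalone.

```python
# Sketch of multi-step reasoning: instead of one-shot prompting, the app
# asks the model to plan, then work through the plan, then answer.
# call_llm() is a hypothetical stub that just echoes its prompt.

def call_llm(prompt: str) -> str:
    return f"<{prompt}>"  # stand-in for a real API call

def multi_step_answer(question: str) -> str:
    # Step 1: ask the model to break the question into sub-steps.
    plan = call_llm(f"List the steps needed to answer: {question}")
    # Step 2: work through the plan (more "thinking" tokens).
    reasoning = call_llm(f"Work through these steps: {plan}")
    # Step 3: distill the reasoning into a final short answer.
    return call_llm(f"Given this reasoning: {reasoning}, answer: {question}")
```

The trade-off is the one the joke points at: three calls cost three times as much and take three times as long, in exchange for better answers on hard questions.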

Training. It's easy to make a small fortune in AI: start with a large fortune, and then do training. We don't really plan to buy a truckload of GPUs and then pretend we know how to make it profitable one day. Instead, we're focused on inference, which is the new new reasoning.

For AI Developers

Our consumer app platform is currently proprietary and used internally, but that may change in the future. In the meantime, we offer a number of free resources to C++ software engineers and other AI professionals:

And now, thank you for listening, and we're back to our regular programming.

Advanced C++ Coding for AI Developers

Generative AI in C++ is the new generative AI programming book by the Aussie AI co-founders:
  • Generative AI coding in C++
  • Transformer engines & LLM models
  • Phone and desktop AI
  • Code examples
  • Research citations
  • Full text online: Table of Contents
  • Buy your copy: Generative AI in C++