How LLMs work. How they’re going to work

Watch this space. This is a first glimpse at how the new generation of AI technology, including ChatGPT, is going to interact with existing data platforms.

1. An AI coordinator interfaces with squishy humans, accepting a request and figuring out how to respond.
2. In so doing, it identifies places where it needs to gather additional facts, then writes and submits specific queries to a system of record.
3. With those real-time query results in hand, the AI formats a response and returns it. (A rough sketch of this loop follows the list.)
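
To make that concrete, here is a minimal sketch of the loop in Python. Everything in it is illustrative: `ask_model` and `query_system_of_record` are hypothetical stand-ins for your model provider and your database, and the message/response shapes are assumptions, not any specific vendor’s API.

```python
# Minimal sketch of the coordinator loop described above.
# `ask_model` and `query_system_of_record` are hypothetical stand-ins,
# not any particular vendor's API.

import json

def ask_model(messages):
    """Call the LLM. Returns either a final answer or a request for facts.

    Hypothetical response shapes:
      {"type": "answer", "text": "..."}            -- done
      {"type": "data_request", "query": "..."}     -- needs facts first
    """
    raise NotImplementedError("wire up your model provider here")

def query_system_of_record(query):
    """Run the model's query against the definitive data source and
    return the rows/documents as plain JSON-serializable data."""
    raise NotImplementedError("wire up your database here")

def handle_request(user_request, max_rounds=5):
    messages = [{"role": "user", "content": user_request}]
    for _ in range(max_rounds):
        reply = ask_model(messages)                       # step 1: coordinator decides what it needs
        if reply["type"] == "answer":
            return reply["text"]                          # step 3: formatted response
        facts = query_system_of_record(reply["query"])    # step 2: gather real-time facts
        messages.append({"role": "tool",
                         "content": json.dumps(facts)})   # feed the facts back to the model
    return "Sorry, I couldn't complete that request."
```

Plugging a real model client and a real database driver into those two stubs is the whole game; the loop itself stays small.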

To some extent, an AI model can ‘learn’ facts about the world through training, but that process cannot replace the original knowledge sources it was trained from. And even the fanciest models can’t be trained on real-time business data. There will always be a need for a definitive system of record. The shapes of data these systems need to hold will increasingly be multi-model and, dare I say, ad hoc.

Which is why you need to be looking TODAY at how well you can accept, curate, organize, combine, and query complex data. The exact contours of how AI systems and data platforms will converse are hard to predict, so your platform needs enough agility to be ready for anything.

As it happens, MarkLogic is an excellent tool for this. IMHO its Optic API forms a solid foundation for the kind of hybrid AI platform Wolfram describes in the linked article. The just-released MarkLogic 11 makes it easier than ever to get started. Check it out.
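
For flavor, here is one hedged way step 2 could be wired up against MarkLogic: posting an Optic query, written in the Optic Query DSL, to the REST rows endpoint. Treat it as a sketch only; the host, port, credentials, the `Sales.orders` view, and the exact DSL content type are assumptions to check against your own environment and MarkLogic’s documentation.

```python
# Sketch only: fetching rows from MarkLogic over its REST API, so the
# coordinator's query_system_of_record() could be backed by Optic.
# Host, port, credentials, view name, and content type are assumptions
# to verify against your own environment and MarkLogic version.

import requests
from requests.auth import HTTPDigestAuth

# Hypothetical Optic Query DSL text against a made-up "Sales.orders" view.
OPTIC_DSL = """
op.fromView('Sales', 'orders')
  .where(op.eq(op.col('status'), 'open'))
  .select(['orderId', 'customer', 'total'])
  .limit(25)
"""

def query_system_of_record(dsl=OPTIC_DSL):
    resp = requests.post(
        "http://localhost:8000/v1/rows",      # default REST app server (assumption)
        data=dsl,
        headers={
            # DSL content type -- verify against your version's docs
            "Content-Type": "application/vnd.marklogic.querydsl+javascript",
            "Accept": "application/json",
        },
        auth=HTTPDigestAuth("my-user", "my-password"),
    )
    resp.raise_for_status()
    return resp.json()
```

Because the rows come back as plain JSON, they can be handed straight back to the model in step 3.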

(P.S. Wolfram Alpha is amazing. Imagine that kind of power, but over all the data that matters to your business…)

Originally posted on LinkedIn. 100% free-range human-written.