Will AI Oversaturate? (2000s Calling… They Want Their Hype Cycle Back)

If the “XML hype cycle” doesn’t spark vivid memories, go ask someone who’s been in IT for 25 years. 🙂

XML was to be the solution to all of the world’s data problems. C-suite execs who wouldn’t know a parser from a parsnip were routinely quoted in press releases, singing the praises of their own foresight in choosing a document format. Everyone talked about how much easier data interchange would become, all while promoting their own proprietary markup vocabulary. Companies embraced XML as a front-page marketing plank.

Generative AI is similarly getting touted as a universal solution, but more on that in a minute.

It didn’t take long for trouble to creep into the XML kingdom. Some core ideas got left behind as an overly complicated solution for combining vocabularies compromised the foundations, and nearly all subsequent work became similarly over-complicated. On top of this, “standards” processes gone awry produced an alphabet soup of Web Services specifications, something arguably already outside the sweet spot XML had targeted in the first place. Fortunately, these adjuncts didn’t stick around long.

There’s no central coordinating body for generative AI, as there was for XML; the foundations are more diffuse. A similar technology risk, though, might arise from regulation. Legislatures are famously slow to adapt to change (see, for example, social media), and now the changes are coming so fast that even hard-core technologists have trouble keeping up. So the odds of some kind of regulation coming are about 100%, and the odds of such rules bringing unintended complexification are almost as high. Indeed, some may argue that this is a good thing. 🙂

One obvious difference, which a few readers are probably shouting at their screens just about now, is the scale of the technology. Generative AI is many orders of magnitude bigger than XML ever was. But the shape of the curve looks similar.

In the late ’90s, I spent a lot of hours reading through XML specifications and brushing up on low-level technology basics like parsing, grammars, functional style, and S-expressions. I also dug into the big-picture stuff, like exploring how applications could use the technology. Here in the age of generative AI, I’m following a similar course. I’ve studied optimizers and coded neural networks, and I’m poring over the internals of the Transformer architecture on which ChatGPT and similar systems are based. At the same time, I’m exploring and prototyping big-picture applications.

Hype cycles inevitably peak, after which the real work begins. Not every hype-fueled company or product will be with us on the long road to the plateau of productivity, but that’s OK. One thing’s for sure—the world will never be the same.

This goes double for the impact we’ll see on hiring and promotion. To keep on top of these trends, consider joining my council of problem solvers.

Originally posted on LinkedIn. 100% free-range human written.
