Imagine asking a sophisticated AI model for tips on making the perfect pizza, only to be told to use glue to help the cheese stick. Or watching it fumble through basic arithmetic problems that a middle schooler could solve with ease. These are the limitations and quirks of generative AI and the large language models (LLMs) that underpin it. They are surfacing because AI models are running out of good training data, causing their capabilities to plateau.

This is a cycle of innovation that repeats throughout history: for a long time, an almost undetectable amount of knowledge and craft builds up around an idea, like an invisible gas. Then, a spark. An explosion of innovation ensues but, of course, eventually stabilizes. This pattern is called an S-Curve. For example:
The AI revolution is following this curve. In a 1950 paper, Alan Turing was one of the first computer scientists to explore how to build a thinking machine, starting the slow buildup of knowledge. Seventy years later, the spark: A 2017 research paper, Attention Is All You Need, leads to OpenAI’s development of ChatGPT, which convincingly mimics human conversation, unleashing a global shock wave of innovation based upon generative AI technology.
Full opinion: The first wave of AI innovation is over. Here’s what comes next.