

AI isn’t getting smarter, it’s getting more power hungry – and expensive

It’s well known that artificial intelligence models such as GPT-5.2 improve their benchmark performance as more compute is added. The phenomenon is captured by “scaling laws,” the AI rule of thumb that accuracy improves predictably as computing power grows. But how much effect does computing power have relative to the other things that OpenAI, Google, and others bring, such as better algorithms or different training data? To find out, researcher Matthias Mertens and colleagues at the Massachusetts Institute of Technology examined data on 809 large language models and estimated how much of each model’s benchmark performance was attributable to the amount of computing power used to train it.
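To make the “scaling laws” idea concrete: benchmark error is often modeled as a power law of training compute, error ≈ a · compute^(−b), which is a straight line in log-log space, so the exponent b can be recovered with a simple least-squares fit. The sketch below uses synthetic data with a made-up exponent purely for illustration; it is not the MIT team’s dataset or method.

```python
import numpy as np

# Illustrative scaling-law sketch on SYNTHETIC data (not the MIT study's data).
# Model: error = a * compute^(-b), i.e. log(error) = log(a) - b * log(compute).

rng = np.random.default_rng(0)
compute = np.logspace(18, 24, 20)   # hypothetical training FLOPs, 1e18..1e24
true_a, true_b = 50.0, 0.1          # made-up power-law parameters
error = true_a * compute ** (-true_b) * np.exp(rng.normal(0.0, 0.02, 20))

# A linear fit in log-log space recovers the scaling exponent b as -slope.
slope, intercept = np.polyfit(np.log(compute), np.log(error), 1)
fitted_b = -slope
print(f"fitted scaling exponent b ~ {fitted_b:.2f}")
```

Isolating the compute term this way is the kind of attribution the researchers performed at scale: once the compute-driven trend is fitted, whatever performance it fails to explain can be credited to algorithms, data, or other factors.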

Full report: Frontier models such as OpenAI’s GPT depend mostly on increasing computing power rather than smarter algorithms, according to a new MIT report. Here’s why that matters.