One question that ChatGPT can’t quite answer: how much energy do you consume? “As an AI language model, I don’t have a physical presence or directly consume energy,” it’ll say, or: “The energy consumption associated with my operations is primarily related to the servers and infrastructure used to host and run the model.”

Google’s Bard is even more audacious. “My carbon footprint is zero,” it claims. Asked about the energy consumed in its creation and training, it responds: “not publicly known”.

AI programs can seem incorporeal, but they are powered by networks of servers in data centers around the world, which require large amounts of energy to run and large volumes of water to keep cool. Because AI programs are so complex, they require more energy than other forms of computing. The trouble is, it’s incredibly hard to nail down exactly how much. As they compete to build ever-more sophisticated AI models, companies like OpenAI – which created ChatGPT – Google and Microsoft will not disclose just how much electricity and water it takes to train and run their AI models, what sources of energy power their data centers, or even where some of their data centers are.
Full story: As the AI industry booms, what toll will it take on the environment?