It only takes a few minutes in a microwave to explode a potato you haven’t ventilated, but for an AI model to make a five-second video of a potato explosion takes as much energy as running that microwave for over an hour: enough for more than a dozen potato explosions.
A new study from MIT Technology Review has laid out just how hungry AI models are for energy. A basic chatbot reply might use as little as 114 joules or as much as 6,700 joules, the equivalent of running a standard microwave for between half a second and eight seconds. But when things get multimodal, the energy costs skyrocket: a single five-second AI-generated video takes about 3.4 million joules, more than an hour of microwave time.
It’s not a new revelation that AI is energy-intensive, but the Technology Review’s work lays out the math in stark terms. The researchers devised what might be a typical session with an AI chatbot, where you ask 15 questions, request 10 AI-generated images, and throw in requests for three different five-second videos.
A minute after you ask for it, you can watch a realistic fantasy movie scene that appears to have been filmed in your backyard, but you won’t notice the enormous amount of electricity you’ve demanded to produce it. You’ve requested roughly 2.9 kilowatt-hours, or three and a half hours of microwave time.
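The session math above is easy to sanity-check yourself. The sketch below uses the article's figures for a high-end chatbot reply (6,700 joules) and a five-second video (3.4 million joules); the per-image cost and the microwave wattage are not given in the article, so both are assumptions chosen only for illustration.

```python
# Back-of-envelope tally of the article's "typical session":
# 15 chatbot replies, 10 images, 3 five-second videos.

CHATBOT_REPLY_J = 6_700      # upper end of the article's 114-6,700 J range
IMAGE_J = 2_000              # ASSUMPTION: per-image cost, not in the article
VIDEO_J = 3_400_000          # per five-second video, per the article
MICROWAVE_W = 800            # ASSUMPTION: typical microwave power draw

total_j = 15 * CHATBOT_REPLY_J + 10 * IMAGE_J + 3 * VIDEO_J
total_kwh = total_j / 3_600_000                  # 1 kWh = 3.6 million J
microwave_hours = total_j / (MICROWAVE_W * 3_600)

print(f"Session total: {total_kwh:.2f} kWh")
print(f"Microwave equivalent: {microwave_hours:.1f} hours")
```

Under these assumptions the session lands near the article's roughly 2.9 kWh, with the three videos accounting for almost all of it.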
What makes the AI costs stand out is how painless it feels from the user’s perspective. You’re not budgeting AI messages like we all did with our text messages 20 years ago.
AI energy rethink
Sure, you’re not mining bitcoin, and your video at least has some real-world value, but that’s a really low bar to clear when it comes to ethical energy use. The rise in energy demands from data centers is also happening at a ridiculous pace.
Data centers had plateaued in their energy use before the recent AI explosion, thanks to efficiency gains. However, the energy consumed by data centers has doubled since 2017, and around half of it will be for AI by 2028, according to the report.
This isn’t a guilt trip, by the way. I can claim professional demands for some of my AI use, but I’ve employed it for all kinds of recreational fun and to help with personal tasks, too. I’d write an apology note to the people working at the data centers, but I would need AI to translate it for the language spoken in some of the data center locations. And I don’t want to sound heated, or at least not as heated as those same servers get. Some of the largest data centers use millions of gallons of water daily to stay frosty.
The developers behind the AI infrastructure understand what’s happening. Some are trying to source cleaner energy options; Microsoft, for one, is looking to make deals with nuclear power plants. AI may or may not be integral to our future, but I’d like it if that future weren’t full of extension cords and boiling rivers.
On an individual level, your use or avoidance of AI won’t make much of a difference, but encouraging better energy solutions from the data center owners could. The most optimistic outcome is developing more energy-efficient chips, better cooling systems, and greener energy sources. And maybe AI’s carbon footprint should be discussed like any other energy infrastructure, like transportation or food systems. If we’re willing to debate the sustainability of almond milk, surely we can spare a thought for the 3.4 million joules it takes to make a five-second video of a dancing cartoon almond.
As tools like ChatGPT, Gemini, and Claude get smarter, faster, and more embedded in our lives, the pressure on energy infrastructure will only grow. If that growth happens without planning, we’ll be left trying to cool a supercomputer with a paper fan while we chew on a raw potato.