We did the math on AI’s energy footprint. Here’s the story you haven’t heard.
The emissions from individual AI text, image, and video queries seem small—until you add up what the industry isn’t tracking and consider where it’s heading next.
I use AI almost every day, so I expect the energy consumption of AI in datacenters to grow year after year. However, I reckon AI offsets some of that consumption by reducing the use of more power-hungry laptops and desktop PCs, because you can use AI on your phone instead of going to your laptop or desktop. I have a dedicated PC for running AI locally, 64 cores and tons of memory; it's hardly in use now because I find it more convenient to go online. Compare the amount of electricity my phone uses against that of my AI PC and you get one or two orders of magnitude in energy savings.
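The "one or two orders of magnitude" claim can be sanity-checked with rough numbers. The wattages below are assumptions for illustration, not measurements, and this per-device comparison ignores the datacenter energy consumed on the phone's behalf:

```python
# Back-of-envelope: phone vs. 64-core local AI workstation power draw.
# Both wattages are assumed figures, not measured values.
PHONE_WATTS = 5          # assumed: phone running a cloud AI app
WORKSTATION_WATTS = 400  # assumed: many-core PC under local inference load

ratio = WORKSTATION_WATTS / PHONE_WATTS
print(f"Workstation draws ~{ratio:.0f}x the power of the phone")
```

With these assumptions the ratio is about 80x, i.e. roughly two orders of magnitude at the device level.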
Google announced that the number of tokens they process went up 50x in one year. In the same year, however, their Ironwood TPU (v7) delivered large leaps over the 2024 Trillium (v6): 5x compute, 6x HBM capacity, and 2x power efficiency. Its performance is estimated to be within 5% of an Nvidia B200, despite TPUv7's single primary compute die design, at 4,614 TFLOP/s peak. In a few years, virtually everyone on earth is going to use AI daily. It will become as essential a resource as the internet has been.
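Putting the two figures side by side shows why efficiency gains alone don't cancel out growth. A minimal sketch, assuming the reported 50x token growth and 2x generational power-efficiency gain apply uniformly and everything else stays equal:

```python
# Rough scaling check: token volume up 50x, hardware 2x more
# power-efficient per generation. Both figures from the comment above;
# the "all else equal" assumption is an obvious simplification.
TOKEN_GROWTH = 50      # reported year-over-year token growth
EFFICIENCY_GAIN = 2    # Ironwood (v7) vs. Trillium (v6) power efficiency

net_energy_growth = TOKEN_GROWTH / EFFICIENCY_GAIN
print(f"Net energy growth: ~{net_energy_growth:.0f}x")
```

Under these assumptions, total energy use would still grow roughly 25x in a year, which suggests efficiency improvements trail demand by a wide margin.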