AI’s Growing Energy Appetite: Can Innovation Make It Greener?

Artificial intelligence might feel like magic, but behind the curtain it runs on very real electricity and water. Every query to a large language model or image generator triggers calculations in power-hungry data centers. OpenAI’s CEO Sam Altman recently revealed some eye-opening stats: an average ChatGPT query uses about 0.34 watt-hours of electricity and 0.000085 gallons of water. That’s roughly the electricity an oven consumes in just over one second (or an LED bulb in a couple of minutes), and about one-fifteenth of a teaspoon of water per query. On an individual level that sounds trivial: hardly more than a blink of an eye or a few drops. But multiply it by millions or billions of AI queries and tasks, day after day, and you start to see a hidden energy hunger with real environmental implications.
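If you want to check those equivalences yourself, the arithmetic is simple enough to script. Here is a quick back-of-envelope sketch in Python; the per-query figures are Altman’s, while the ~1 kW oven element and ~10 W LED bulb are my assumed wattages, so treat the outputs as rough estimates rather than precise measurements.

```python
# Back-of-envelope check of the per-query figures.
# Assumptions (mine, not Altman's): ~1 kW oven element, ~10 W LED bulb,
# 768 teaspoons per US gallon.
ENERGY_WH = 0.34        # watt-hours per ChatGPT query (Altman's figure)
WATER_GAL = 0.000085    # gallons of water per query (Altman's figure)

oven_seconds = ENERGY_WH / 1000 * 3600   # 1,000 W oven -> seconds of use
led_minutes = ENERGY_WH / 10 * 60        # 10 W LED -> minutes of use
teaspoons = WATER_GAL * 768              # gallons -> teaspoons

print(f"Oven equivalent: {oven_seconds:.1f} seconds")       # ~1.2 s
print(f"LED equivalent:  {led_minutes:.1f} minutes")        # ~2.0 min
print(f"Water: about 1/{1 / teaspoons:.0f} of a teaspoon")  # ~1/15 tsp
```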
AI may be virtual, but its environmental footprint is painfully real.
Most users never see this resource consumption; it happens out of sight in distant server farms, so it’s easy to overlook. In fact, when I’ve discussed AI’s energy footprint in classes or talks, many people (even tech-savvy students) are surprised by how much power these “virtual” tools quietly devour. Yet as AI becomes more ingrained in our lives, its environmental impact is something we need to bring front and center in the conversation. The question isn’t whether AI uses a lot of energy (it does); the question is how we respond, through innovation, policy, and perhaps a new kind of social contract ensuring AI earns its keep on this planet.
The Soaring Footprint: From Queries to Power Plants
Those fractions of a watt-hour per query add up. Tech giants’ data centers, the digital factories powering cloud computing and AI, already consume around 4–5% of all electricity in the United States. As AI workloads have ramped up, that share is climbing fast. A Department of Energy–funded report found that data center energy use more than doubled from 2017 to 2023 (as AI took off), reaching ~176 terawatt-hours in 2023, or 4.4% of U.S. power. By 2028, data centers could be using anywhere from 6.7% to as much as 12% of all U.S. electricity, roughly double or triple today’s level in just five years. For context, that would mean our servers drawing on the order of one-tenth of the nation’s entire electricity supply to keep our AI and digital lives running. Globally, one analysis predicts AI specifically could consume nearly half of all data center electricity by 2025, possibly overtaking the infamously huge energy appetite of Bitcoin mining.
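Those projections are easy to sanity-check. The report’s own 2023 numbers imply a U.S. total of roughly 4,000 TWh (176 TWh ÷ 4.4%); the sketch below holds that total flat through 2028, which is my simplifying assumption. Since total generation is projected to grow, the absolute 2028 figures would land higher still, closer to the report’s double-or-triple range.

```python
# Rough scaling of the DOE-funded report's figures.
# Assumption (mine): total US generation stays flat at the ~4,000 TWh
# implied by the report's 2023 numbers (176 TWh = 4.4% of the grid).
dc_2023_twh = 176
us_total_twh = dc_2023_twh / 0.044          # ~4,000 TWh implied total

for share in (0.067, 0.12):                 # 2028 projection band
    dc_2028_twh = us_total_twh * share
    print(f"{share:.1%} of the grid -> {dc_2028_twh:.0f} TWh "
          f"({dc_2028_twh / dc_2023_twh:.1f}x the 2023 level)")
# 6.7% -> ~268 TWh (1.5x); 12% -> ~480 TWh (2.7x), before grid growth.
```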
Such demand is spurring unprecedented moves in the energy sector. In a headline-grabbing deal last year, Microsoft signed a 20-year agreement to restart a nuclear reactor at Three Mile Island, the site of the 1979 meltdown, just to secure 835 MW of clean power for its AI cloud data centers. It’s the first time a shuttered U.S. nuclear plant will be un-mothballed, and it highlights how seriously companies are bracing for the power surge coming from AI. Constellation Energy (the plant’s owner) noted that the additional output will feed Microsoft’s “energy-hungry” AI data centers across multiple states. The fact that a tech company is effectively jump-starting a dormant nuclear facility (at an estimated $1.6 billion investment) speaks volumes: scaling AI may require tapping every trick in the book for more power, preferably carbon-free. As Microsoft’s energy VP put it, this is part of an effort to decarbonize the grid while meeting massive new capacity and reliability needs.
The intelligence of the future may be limited not by algorithms, but by amperage.
It’s not only electricity. Data centers also guzzle water for cooling. The 0.000085 gallons per ChatGPT query from Altman’s blog post might sound negligible, but over millions of queries it translates into substantial water usage: a million prompts would use ~85 gallons of water, and ChatGPT likely handles far more than that on a busy day. A Washington Post analysis found that a 100-word AI-generated email (via GPT-4) might indirectly consume “a little more than one bottle” of water once you factor in cooling needs. The exact numbers vary by location and cooling technology, but the point stands: AI’s resource footprint extends beyond electricity to include water and carbon emissions as well. One recent investigation suggested the big cloud companies may be underreporting their data center emissions by a factor of 7.6, with actual carbon output potentially 662% higher than what’s officially disclosed. This gap in transparency makes it hard for the public to even grasp the true environmental cost of our AI habits.
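Before moving on, the water arithmetic is worth making concrete. The million-query line in the sketch below reproduces the ~85-gallon estimate above; the billion-query line is purely an assumed order of magnitude for a busy day, not a reported statistic.

```python
# Scaling the per-query water figure. The billion-query volume is an
# assumed order of magnitude for illustration, not a reported number.
WATER_GAL_PER_QUERY = 0.000085

for queries in (1_000_000, 1_000_000_000):
    gallons = WATER_GAL_PER_QUERY * queries
    print(f"{queries:>13,} queries -> {gallons:,.0f} gallons of water")
# 1 million queries -> 85 gallons; 1 billion -> 85,000 gallons.
```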
All of this might sound a bit doom-and-gloom: AI as power hog. But that’s only one side of the story. The other side is innovation racing to curb that footprint, finding ways for AI to do more with less energy. In other words, if the status quo is that intelligence = high electricity bills, the emerging challenge is to rewrite that equation.