Clean Air Council


Buying the AI Hype? Let’s Unpack Its Environmental Impact 

July 29, 2025 – It seems like no matter how you’re using technology these days, artificial intelligence (AI) is lurking nearby: in a search result, in a navigation menu, or even popping up as a chatbot with a human name. Tech companies are promising a brighter future fueled by AI, and investors are buying the hype, literally: during the first quarter of 2025, 57.9% of global venture capital investments went to AI and machine learning.

But what are users getting in return? As society seemingly jumps into the deep end of the AI pool, we need to understand the harms it’s inflicting on the world.

AI Needs Huge Data Centers to Run, And They Are Major Polluters

The data centers that power AI models and their tools (sometimes called hyperscale data centers) require an immense amount of power because every prompt or task requires supercomputers to complete countless calculations. That’s far more demanding than traditional computing, such as running a search engine. As MIT News reports, “researchers have estimated that a ChatGPT query consumes about five times more electricity than a simple web search.”  

This demand for electricity is rising at four times the rate of overall consumption and, according to one report, is expected to equal that of the entire country of Japan by next year. The power plants supplying that electricity often burn fossil fuels, emitting harmful pollution along the way. Research by Morgan Stanley projects that global data center emissions will reach 2.5 billion metric tons of CO2 equivalent by 2030 – that’s like adding an extra 116 million gasoline cars to the road each year.
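As a rough back-of-the-envelope check on that cars comparison (the assumptions here are mine, not Morgan Stanley’s: the EPA’s commonly cited figure of about 4.6 metric tons of CO2 per passenger car per year, and treating the 2.5 billion tons as cumulative over roughly six years through 2030):

```python
# Back-of-the-envelope check of the data-center-emissions-to-cars comparison.
# Assumptions (not from the article): EPA estimates a typical passenger car
# emits ~4.6 metric tons of CO2 per year; the 2.5 billion tons is treated
# as cumulative over roughly 2024-2030 (about 6 years).
TOTAL_CO2E_TONS = 2.5e9        # projected global data center emissions
YEARS = 6                      # approximate span through 2030
TONS_PER_CAR_PER_YEAR = 4.6    # EPA average for a passenger vehicle

annual_emissions = TOTAL_CO2E_TONS / YEARS
car_equivalents = annual_emissions / TONS_PER_CAR_PER_YEAR
print(f"~{car_equivalents / 1e6:.0f} million cars per year")
```

Under these assumptions the result is roughly 90 million cars per year, the same order of magnitude as the 116 million figure; the exact number depends on the per-car emissions factor assumed.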

Take, for example, the case of Memphis, Tennessee and Elon Musk’s xAI. The Tennessee Lookout reports, “The facility’s behemoth methane gas turbines increase Memphis’s smog by 30-60% as they belch planet-warming nitrogen oxides and poisonous formaldehyde around the clock, pollutants linked to respiratory and cardiovascular disease.” 

Despite touting commitments to reduce the carbon emissions that are baking our planet, major tech companies have actually been increasing their emissions as their AI programs expand. Between 2020 and 2023, the emissions of Amazon, Alphabet (Google), Meta, and Microsoft rose by an average of 150%. 

They Strain Our Already-Strained Energy Grid

The Energy Department has reported that AI could consume as much as 12% of the U.S. electricity supply by 2028. Meanwhile, the general rise in consumption has already strained much of the world’s energy infrastructure while we struggle to build out renewable alternatives that protect the planet. Meeting the sudden surge in demand from these data centers not only sets back the climate goals of tech companies, it sets back progress around the world. 

“It’s not sustainable to keep building at the rate [Google is] building because they need to scale their compute within planetary limits,” one researcher told The Guardian. “We do not have enough green energy to serve the needs of Google and certainly not the needs of Google and the rest of us.”

Similarly, another researcher told MIT News, “The demand for new data centers cannot be met in a sustainable way. The pace at which companies are building new data centers means the bulk of the electricity to power them must come from fossil fuel-based power plants.”

This surge in growth is already impacting how the rest of us receive energy. A 2024 Bloomberg report found data centers are “distorting the normal flow of electricity for millions of Americans,” which stresses power grids and can damage home appliances and power equipment. The scale of these operations means that when things go wrong, they can go really wrong, putting entire regions at risk of power outages.  

They Soak Up Water Like a Giant Sponge

Energy isn’t the only resource AI depends on. If you’ve ever heard a laptop fan turn on or felt a phone warm up after hours of scrolling, you already understand the basics of why data centers also require immense amounts of water. When supercomputers process user prompts and train new models, they heat up. To cool them, many data centers run cold water through pipes. 

Site selection makes the problem worse. As a Stanford hydrologist told The New York Times, “Because electricity is more costly for data centers than water, companies often prioritize building their facilities in places with cheap power, even if the area is drought stricken. That has exacerbated water shortages across the world.”

Their water consumption makes these sites particularly bad neighbors. The New York Times also reports that in areas with data centers, “local wells have been damaged, the cost of municipal water has soared and the county’s water commission may face a shortage of the vital resource.” In Newton County, Georgia, for example, the damage is so severe that the county is on track to be in a water deficit by 2030. 

Supercomputers Aren’t for “Super Users.” They’re Meeting Our Demand

Make no mistake: this massive energy consumption is not only fueling industrial and commercial-scale AI projects. Every use of generative AI, including through public-facing tools, contributes to this waste. 

The International Energy Agency recently studied the impacts of individual user actions. They estimate that the energy required to generate a large amount of text is equivalent to running an LED bulb for an hour, and that the energy required to create an eight-second video is equivalent to charging a laptop twice. Another study indicated that text generation (like what ChatGPT does) “used 10 times as much energy compared with simple classification tasks like sorting emails into folders.”
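To put those comparisons in concrete units, here is a small sketch using assumed typical ratings, a 10 W LED bulb and a 60 Wh laptop battery; neither figure comes from the IEA study:

```python
# Rough energy conversions for the IEA comparisons above.
# Assumed ratings (not from the IEA study): a 10 W LED bulb,
# a 60 Wh laptop battery.
LED_WATTS = 10
LAPTOP_BATTERY_WH = 60

text_generation_wh = LED_WATTS * 1           # LED bulb running for one hour
video_generation_wh = LAPTOP_BATTERY_WH * 2  # charging a laptop twice

print(f"Text: ~{text_generation_wh} Wh, video: ~{video_generation_wh} Wh")
print(f"One 8-second video is roughly {video_generation_wh / text_generation_wh:.0f}x one text generation")
```

Under these assumptions, a single eight-second video costs about as much energy as a dozen text generations.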

It’s no secret that generative AI models frequently “hallucinate” by making up information or otherwise answering queries incorrectly. If you do want the right answer, it turns out sustainability is a significant trade-off. One study found that more accurate (and thus more complex) models produced as much as three times more emissions.

And public AI tools are popular. A 2023 study found that 1 in 4 Americans say they use AI several times a day and another 28% say they use it about once a day. ChatGPT has 400 million weekly active users. That is a lot of energy consumption, and a lot of pollution.  
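To get a feel for that scale, here is a hypothetical estimate. The 400 million weekly users figure is from the reporting above, but the queries-per-user and energy-per-query values are assumptions for illustration only:

```python
# Illustrative scale estimate for ChatGPT's user base.
# Assumed (hypothetical) figures: ~10 queries per weekly user and
# ~3 Wh per query. Only the 400M weekly users figure is reported.
WEEKLY_USERS = 400e6
QUERIES_PER_USER_PER_WEEK = 10   # assumption
WH_PER_QUERY = 3                 # assumption (roughly 5x a web search)

weekly_wh = WEEKLY_USERS * QUERIES_PER_USER_PER_WEEK * WH_PER_QUERY
weekly_gwh = weekly_wh / 1e9
print(f"~{weekly_gwh:.0f} GWh per week, ~{weekly_gwh * 52:.0f} GWh per year")
```

Even with these conservative assumptions, the total reaches hundreds of gigawatt-hours per year, roughly the annual electricity use of tens of thousands of U.S. homes.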

In the end, fighting the climate crisis is a systemic and collective issue requiring systemic and collective solutions. Plus, tech companies are making it harder and harder to opt out of AI. But as with any system, individual choice does play a role and widespread adoption of resource-intensive, polluting products instead of existing, efficient tools is a big part of the problem. 

The “AI revolution” is not inevitable, despite what public figures would like us to think. Any major shift in technology requires public buy-in. It is up to us all to demand a future that allows us to take advantage of progress without compromising the environment.
