You use ChatGPT, Claude, or Gemini every day without a second thought. But every conversation you have consumes roughly a bottle of water, and in aggregate, AI now generates CO2 on a scale that rivals the emissions of entire countries. Spoiler alert: the 2025 figures are frankly alarming, and nobody is really talking about it.
1. The alarming numbers
Honestly, I was floored when compiling this data. A study by the VU Amsterdam Institute for Environmental Studies, published in Patterns, estimates that AI systems will produce between 32.6 and 79.7 million tons of CO2 in 2025.
To put this in perspective: the upper figure exceeds the annual emissions of New York City (52.2 Mt), Singapore (53 Mt), and Norway (31.5 Mt). We are talking about a technology that most people consider "immaterial."
The thing is, it is only accelerating. The projections for 2030 are even more alarming:
| Indicator | 2024 | 2025 (estimated) | 2030 (projection) |
|---|---|---|---|
| Share of global electricity demand | 1.5% | ~2% | 3% (975 TWh) |
| Installed AI power capacity | 9.4 GW | 23 GW | ~50+ GW |
| Annual growth | - | +15%/year | 4x faster than other sectors |
To put the 23 gigawatts expected by the end of 2025 into context: drawn continuously, that works out to roughly the annual electricity consumption of the United Kingdom. The entire country.
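If you want to check that comparison yourself, here is the back-of-the-envelope conversion from continuous power draw to annual energy. A minimal sketch: the 23 GW figure comes from the table above, and the UK comparison rests on the assumption that its demand sits somewhere in the 250-300 TWh/year range.

```python
# Back-of-the-envelope: convert a continuous power draw (GW) into
# annual energy (TWh). 1 TWh = 1,000 GWh.
HOURS_PER_YEAR = 365 * 24  # 8,760 hours

def gw_to_twh_per_year(gigawatts: float) -> float:
    """Energy consumed in a year if `gigawatts` were drawn continuously."""
    return gigawatts * HOURS_PER_YEAR / 1_000

print(f"{gw_to_twh_per_year(23):.0f} TWh/year")
# ~201 TWh/year -- in the same ballpark as the UK's annual electricity
# consumption (roughly 250-300 TWh, an assumption on my part).
```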
2. The invisible water: AI's hidden thirst
Carbon emissions are already serious. But wait until you see the water figures.
AI consumes between 312.5 and 764.6 billion liters of water per year. That volume is comparable to the world's entire annual bottled water consumption. Yes, you read that right.
And this is where it becomes concrete for you: a typical conversation with ChatGPT consumes about 500 milliliters of water. One bottle. Multiply by the hundreds of millions of daily users, and you get entire lakes evaporating to run language models.
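To make "entire lakes" concrete, here is the rough multiplication. The user count is an illustrative assumption (OpenAI does not publish an exact daily figure); only the half-liter figure comes from the study cited above.

```python
# Rough scale estimate: daily and annual water footprint of chatbot use.
ML_PER_CONVERSATION = 500     # the ~one-bottle figure cited above
DAILY_USERS = 300_000_000     # illustrative assumption: "hundreds of millions"

liters_per_day = ML_PER_CONVERSATION * DAILY_USERS / 1_000
liters_per_year = liters_per_day * 365

print(f"{liters_per_day / 1e6:,.0f} million liters/day")    # ~150 million
print(f"{liters_per_year / 1e9:.1f} billion liters/year")   # ~54.8 billion
# Chatbots alone, at one conversation per user per day -- already a
# meaningful slice of the 312-765 billion liter range cited above.
```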
Some figures that struck me:
- Training GPT-3 alone required 700,000 liters of water
- Microsoft saw its water consumption increase by 34% in 2022 (6.4 billion liters), mainly due to its AI investments with OpenAI
- Nearly one-third of new data centers are being built in areas that will experience acute water shortages by 2050
The truly cynical part? AI will soon be in direct competition with agriculture and domestic use for access to fresh water in certain regions. Game over for sustainable planning.
3. Why AI electricity is so polluting
We tend to see digital technology as something clean, dematerialized. This is a complete illusion.
A generative AI query consumes about 30 times more electricity than a standard Google search. And generative AI is expected to account for at least 50% of data center electricity consumption - a five- to tenfold increase over 2022 levels.
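Here is what that multiplier means per query. The ~0.3 Wh baseline for a classic web search is a commonly cited approximation, not a measured figure; only the 30x factor comes from the studies above.

```python
# Per-query energy, derived from the 30x multiplier above.
BASELINE_SEARCH_WH = 0.3   # assumed classic-search baseline (Wh)
AI_MULTIPLIER = 30

ai_query_wh = BASELINE_SEARCH_WH * AI_MULTIPLIER   # ~9 Wh per AI query
queries_per_kwh = 1_000 / ai_query_wh

print(f"~{ai_query_wh:.0f} Wh per generative query")
print(f"~{queries_per_kwh:.0f} generative queries per kWh")  # ~111
# For comparison, the same kWh would cover ~3,333 classic searches
# at the assumed baseline.
```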
But the worst part is where these data centers are built. A study by Harvard and UCLA reveals that 95% of American data centers are located in areas where the electricity supply is 48% more carbon-intensive than the national average.
This is no coincidence. Tech giants seek the cheapest electricity - and often, it is also the dirtiest.
And let us not forget GPU manufacturing. Extracting the necessary rare earths drives large-scale ecosystem destruction. In China, areas near extraction sites show significantly reduced biodiversity and much slower ecological recovery rates.
4. Pros and cons of the AI race
The Pros
- Multiplied productivity: real gains across many sectors
- Accelerated research: medical discoveries, climate modeling
- Energy optimization: AI can optimize power grids (2-3 percentage points of efficiency gained)
- Democratized access: powerful tools accessible to everyone
- Technological innovation: new chips in development promise up to 15x better efficiency
The Cons
- Massive carbon footprint: up to 80 Mt of CO2/year in 2025
- Alarming water consumption: comparable to the entire global bottled water industry
- Lack of transparency: OpenAI, Meta, and others do not publish their AI-related emissions
- Rebound effect: every optimization invites more consumption
My advice
To be honest, we cannot stop using AI. But we can apply pressure. Favor companies that publish their environmental data (Google has started). Use lighter models when they are sufficient - Claude Haiku consumes far less than Opus for a simple question. And above all, put pressure on policymakers: more than 230 environmental groups are calling for a moratorium on new data centers. Change will come from regulation, not from the goodwill of tech giants.
5. Emerging solutions (finally)
The good news? Serious solutions are beginning to emerge.
On the algorithm side
The Zeus framework from the University of Michigan has demonstrated reductions of up to 75% in training energy consumption. Compression techniques (quantization, pruning) shrink models without sacrificing much accuracy: DistilBERT uses 40% fewer parameters than BERT with nearly identical accuracy.
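To show what quantization looks like in practice, here is a minimal sketch using PyTorch's built-in post-training dynamic quantization. The toy model and its dimensions are purely illustrative, not any production system.

```python
import os
import tempfile

import torch
import torch.nn as nn

# Toy two-layer model; the dimensions are arbitrary.
model = nn.Sequential(nn.Linear(768, 3072), nn.ReLU(), nn.Linear(3072, 768))

# Post-training dynamic quantization: Linear weights are stored as 8-bit
# integers and dequantized on the fly at inference. No retraining needed.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

def size_mb(module: nn.Module) -> float:
    """Serialized size of a module's weights, in megabytes."""
    with tempfile.NamedTemporaryFile(suffix=".pt", delete=False) as f:
        path = f.name
    torch.save(module.state_dict(), path)
    mb = os.path.getsize(path) / 1e6
    os.remove(path)
    return mb

print(f"fp32: {size_mb(model):.1f} MB -> int8: {size_mb(quantized):.1f} MB")
# Expect roughly a 4x reduction for the Linear weights.
```

Dynamic quantization is the low-effort entry point; pruning and distillation (the approach behind DistilBERT) cut further but require retraining.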
On the infrastructure side
Microsoft has reported an 80% improvement in Water Usage Effectiveness through closed-loop cooling systems. Immersion cooling (servers submerged in non-conductive fluids) reduces cooling needs by over 90%.
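Water Usage Effectiveness is simply liters of water consumed per kilowatt-hour of IT energy, with lower being better. The sketch below shows the arithmetic behind an 80% improvement; the inputs are invented for illustration, since Microsoft does not publish raw figures at this granularity.

```python
# Water Usage Effectiveness (WUE) = site water use (L) / IT energy (kWh).
# The inputs below are made up purely to illustrate the metric.
def wue(water_liters: float, it_energy_kwh: float) -> float:
    return water_liters / it_energy_kwh

before = wue(water_liters=1.8e9, it_energy_kwh=1.0e9)   # 1.80 L/kWh
after = wue(water_liters=0.36e9, it_energy_kwh=1.0e9)   # 0.36 L/kWh

print(f"before: {before:.2f} L/kWh, after: {after:.2f} L/kWh")
print(f"improvement: {1 - after / before:.0%}")          # 80%
```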
On the energy side
Sam Altman has backed a $20 million investment in Exowatt, a startup aiming to power data centers like OpenAI's with solar. Google is targeting 100% carbon-free electricity by 2030. Amazon reports it has already matched 100% of its electricity consumption with renewables.
On the regulatory side
The EU has incorporated mandatory environmental impact assessments into its AI Act. In the United States, 7 states have legislated that data centers must pay for grid expansion costs. Maine has imposed a limit: companies cannot dedicate more than 25% of their electrical capacity to data centers.
Frequently Asked Questions
How much energy does a ChatGPT query consume?
A generative AI query consumes about 30 times more electricity than a standard Google search, plus roughly half a liter of water per conversation for server cooling.
Can AI become environmentally friendly?
Technically yes. New chips and optimized algorithms enable efficiency gains of 50 to 75%. However, the rebound effect (more efficiency = more usage) risks canceling out these gains without strict regulation.
Which AI companies are the most transparent?
Google and recently OpenAI have started publishing their energy consumption data. Meta, on the other hand, does not disclose its AI-related emissions.
What can I do at my level?
Use lighter models when sufficient, limit unnecessary queries, and support regulatory initiatives. Public pressure matters.
Conclusion
AI stands at a crossroads. Its current energy architecture is fundamentally unsustainable: up to 80 million tons of CO2 per year, hundreds of billions of liters of water, mining extraction that devastates ecosystems.
Three levers are essential: mandatory emissions transparency (like existing ESG standards), massive investments in optimization and renewables, and binding regulations before AI consumes unsustainable shares of our resources.
A technology built on the systematic destruction of the environment cannot be the architecture of the future. And to be honest, if we do not act now, technological solutions - even revolutionary ones - will no longer be enough.
About the author: The EverydayAITech team has been testing and analyzing artificial intelligence tools since 2023. Our mission: democratizing AI by offering practical and honest guides, without unnecessary technical jargon.