AI now feels almost normal; it has become such a huge part of our daily lives. From search engines answering in seconds to apps creating photos or music in the blink of an eye, all of it seems futuristic and exciting. But while we’re busy enjoying these benefits, there’s a reality we usually skip over: the hidden cost behind all this progress, the growing problem of AI carbon emissions.
Running AI isn’t as light and invisible as it looks on our screens. Every single query, chatbot response, or generated image takes energy, and that energy mostly comes from power-hungry data centers. Those centers leave behind AI carbon emissions that are quietly adding up.
And yes, AI might feel magical and smart, but it also brings harm with it, leaving a heavy footprint on planet Earth. That is the part of the story we need to discuss today.
How Does AI Produce Carbon Emissions?
To really understand the issue, we have to look at where these emissions are coming from. AI carbon emissions aren’t just one single thing. They have layers that stretch from the very beginning of building a model to the moment we ask it a question.
Training Models
This is one of the biggest sources of emissions. Training massive neural networks like GPT-4 or Google’s Gemini means feeding them billions or even trillions of data points.
To do this, companies rely on supercomputers powered by thousands of GPUs (graphics processing units). These machines run for weeks or even months nonstop, pulling in enormous amounts of electricity. It’s a process that’s far more energy-hungry than most people realize.
Inference
Once a model is trained, the work doesn’t end. Every single time we type a question or request into an AI system, it uses power again. This process is called “inference.” With millions of users sending prompts daily, these small amounts of energy quickly add up. For a platform that serves the entire world, the emissions from inference alone are staggering.
Cooling Data Centers
Another hidden part of the footprint comes from keeping servers cool. These machines get hot when they work around the clock, and to prevent overheating, data centers use both electricity and water.
MIT News (2025) reported that about 2 liters of water are needed per kilowatt-hour of data center electricity just to cool servers. That means AI doesn’t just use electricity. It also uses huge amounts of water.
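To get a feel for what that ratio means in practice, here’s a rough back-of-envelope sketch in Python. The 100 MW facility size and one-day runtime are purely illustrative assumptions on my part, not figures from the MIT report:

```python
# Rough water-use estimate based on the ~2 L/kWh cooling figure above.
# The 100 MW facility and 24 h runtime are illustrative assumptions.

LITERS_PER_KWH = 2.0          # cooling water per kWh of electricity (MIT News, 2025)

facility_power_mw = 100       # hypothetical data center drawing 100 MW
hours = 24                    # one day of round-the-clock operation

energy_kwh = facility_power_mw * 1_000 * hours   # MW -> kW, times hours
water_liters = energy_kwh * LITERS_PER_KWH

print(f"Electricity used: {energy_kwh:,.0f} kWh")   # 2,400,000 kWh
print(f"Cooling water:    {water_liters:,.0f} L")   # ~4,800,000 L per day
```

At that rate, a single hypothetical facility of this size would run through millions of liters of cooling water every day.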
Hardware Production
The story doesn’t start when you click “send.” Before AI can even exist, powerful hardware like GPUs, chips, and servers has to be manufactured.
Mining rare earth metals, shipping machines around the globe, and running factories all carry emissions of their own. These are big pieces of the puzzle that are often left out of the conversation because they are invisible to the end user.
So when we talk about AI carbon emissions, it is not just about the electricity used in one single chat. The chain starts on the factory floor, runs through data centers that never take a break, and continues every single time we interact with these systems.
Training vs. Inference – Which Pollutes More?
There’s a common belief that training is the “big monster” of AI emissions, while inference is just a small part of the story. At first glance, this makes sense. Training requires weeks or months of nonstop computing power, which sounds like the ultimate energy drain. But the reality is not that simple.
Training costs
Training a large AI model is like building the engine of a giant machine. Back in 2019, researchers discovered that training a single big model released about 626,000 pounds of CO₂. To put that in perspective, that’s the same as the lifetime emissions of five average cars (Columbia University).
Later, when OpenAI trained GPT-3, it consumed a jaw-dropping 1,287 megawatt-hours of electricity and produced 502 metric tons of CO₂. That’s about the same as driving 112 cars for an entire year. Training is undeniably huge, and the numbers prove it.
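If you want to sanity-check that car comparison, the math is simple. Using the commonly cited EPA figure of about 4.6 metric tons of CO₂ per average passenger car per year (an outside assumption, not from the article’s sources), a quick calculation lands in the same ballpark as the 112 quoted above:

```python
# Sanity check: is 502 metric tons of CO2 really ~112 car-years?
# The 4.6 t/car/year figure is a commonly cited EPA estimate (assumption).

gpt3_training_co2_tons = 502      # reported CO2 from training GPT-3
co2_per_car_per_year = 4.6        # metric tons for an average passenger car

car_years = gpt3_training_co2_tons / co2_per_car_per_year
print(f"Equivalent to ~{car_years:.0f} cars driven for a year")  # ~109 cars
```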
Inference costs
But here’s the twist: inference is actually becoming the larger problem. Once a model is built, it doesn’t just sit idle. Every question we type into ChatGPT, every image generated, every little request uses energy.
Microsoft and Google now say that inference is responsible for about 60% of ongoing AI energy use, while training is only 40%. Think about that. The majority of emissions are now coming from the daily queries, not the initial training. What makes it worse is the scale: millions of people are using AI tools every second.
Each query might seem small, but combined, they add up to a massive carbon footprint. In fact, a single chatbot request can use 10–100 times more energy than a basic Google search (Columbia University).
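To see how a “small” query turns into a big footprint, here’s a hedged sketch. The 0.3 Wh per web search is an older, commonly cited Google figure, and the daily query volume is a purely illustrative assumption, not a reported statistic:

```python
# How per-query energy scales with volume.
# 0.3 Wh per search is an old, commonly cited Google figure (assumption);
# the 100 million queries/day volume is illustrative, not a reported number.

search_wh = 0.3                        # energy of a basic web search
chatbot_wh_low, chatbot_wh_high = search_wh * 10, search_wh * 100  # 10-100x

queries_per_day = 100_000_000          # hypothetical daily chatbot queries

low_mwh = chatbot_wh_low * queries_per_day / 1e6    # Wh -> MWh
high_mwh = chatbot_wh_high * queries_per_day / 1e6

print(f"Per query: {chatbot_wh_low:.0f}-{chatbot_wh_high:.0f} Wh")
print(f"Per day:   {low_mwh:,.0f}-{high_mwh:,.0f} MWh")  # 300-3,000 MWh/day
```

Even under these made-up volumes, the daily total rivals the entire one-time cost of training GPT-3.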
So while training is like one giant, dramatic event, inference is more like a constant drip that never ends. It’s easy to focus on the big numbers from training, but the ongoing energy drain of inference is what really keeps the emissions piling up. And as AI grows more popular, inference is only going to expand, making it the silent but bigger polluter over time.
The Rising Curve – AI in 2025 vs. Pre-AI Era
Before the boom of generative AI, data centers were already known for consuming a lot of electricity, but the growth was steady and somewhat predictable. Companies like Google, Amazon, and Microsoft had been running cloud services for years, and while energy use was high, it didn’t spike overnight. But once tools like ChatGPT, Google Gemini, Anthropic’s Claude, and MidJourney became mainstream, everything changed.
According to MIT (2025), the demand for data centers in North America almost doubled in a single year, jumping from 2,688 megawatts in 2022 to 5,341 megawatts in 2023. Most of this demand came directly from generative AI workloads. It’s an explosion; the industry has never seen growth like this before.
The global numbers tell the same story. Data centers used about 460 terawatt-hours (TWh) of electricity in 2022, comparable to the total annual electricity consumption of Saudi Arabia. But projections show that by 2026 this figure could pass 1,050 TWh, which is about the same as the energy use of an entire country like Japan or Russia (MIT).
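Those two data points imply a striking growth rate. Here’s the quick calculation (my own arithmetic on the figures above, not a number from the MIT report):

```python
# Implied compound annual growth rate from 460 TWh (2022) to 1,050 TWh (2026).

start_twh, end_twh = 460, 1_050
years = 2026 - 2022

cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"Implied growth: ~{cagr:.0%} per year")  # ~23% per year
```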
The IMF (2024) also warned that AI alone could add 1.3 to 1.7 gigatons of CO₂ emissions by 2030. To put that in perspective, it’s like adding another entire country’s worth of emissions into the world’s already struggling climate system.
Water, Hardware & E-waste – The Overlooked Dimension
When people talk about the impact of AI, the focus is often on electricity use and carbon emissions. But the truth is, AI also places a heavy burden on water, hardware, and waste—dimensions that are often ignored. AI may look like “cloud data” floating around, but behind the screen, it consumes real, physical resources.
Take water, for example. Data centers need massive cooling systems to keep servers from overheating. By 2027, AI is expected to use between 4.2 and 6.6 billion cubic meters of water every year, which is more than half of what the UK consumes in a year. Training GPT-3 alone required about 700,000 liters of water, enough to make hundreds of thousands of plastic water bottles.
The demand for specialized hardware is also climbing fast. In 2023, data centers received 3.85 million GPU shipments, compared to 2.67 million in 2022. These chips don’t just appear out of thin air. They require mining rare materials, heavy manufacturing, and complex global supply chains. Each stage adds more emissions and environmental damage.
Then comes e-waste. Old servers, chips, and hardware eventually pile up as toxic waste. By 2030, AI could generate 1.2 to 5 million metric tons of e-waste, which would be nearly 12% of all global electronic waste. These discarded parts often end up in landfills or get shipped to developing countries, creating pollution and health risks.
Microsoft, Google & Big Tech’s Reality Check
Big Tech companies often present themselves as champions of sustainability, promising to reach net-zero emissions and run fully on renewable energy. But when it comes to AI, even the largest players are finding it hard to keep those promises. The demand for AI is growing so quickly that it is pushing its emissions higher, despite years of green initiatives.
Take Microsoft as an example. In 2024, the company admitted that its carbon emissions had gone up by 23.4% since 2020. A big reason for this increase was the explosion of AI workloads, which alone pushed its energy use up by 168%. This is a huge jump for a company that has invested heavily in wind, solar, and other renewable projects.
Google is facing a similar challenge. The company revealed that since 2019, its emissions have increased by 48%, and most of this rise comes directly from the electricity needed to power AI models. Despite years of progress in cutting emissions from other operations, AI has essentially undone much of that work.
And it’s not just about individual companies. A report by the International Telecommunication Union (ITU) found that AI-driven technology companies as a whole have seen their emissions jump by 150% since 2020. In 2023 alone, the sector consumed 581 terawatt-hours of electricity, about 2.1% of all global electricity use. To put that into perspective, that’s almost as much power as Canada uses in a year.
Efficiency Gains – A Silver Lining That Isn’t Enough
There is some good news in the middle of all these concerns. AI companies are working hard to make their systems more efficient, and each individual query or prompt now uses less water and energy than it did in the past. In 2025, Google shared that a single Gemini AI prompt consumes just 0.24 watt-hours of electricity and 0.26 milliliters of water, which marks a 33x reduction in energy use and a 44x drop in emissions per prompt compared with the previous year. That is an impressively large improvement.
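Working backwards from those numbers gives a feel for how big the jump is. This is just arithmetic on the figures above; the “previous year” value is implied, not separately reported:

```python
# Back out the implied previous-year Gemini footprint from the 33x claim.
# Only the 0.24 Wh figure and the 33x factor come from the article;
# the derived value is my own arithmetic, not a published number.

energy_now_wh = 0.24     # per prompt, 2025
improvement = 33         # reported energy reduction factor

energy_before_wh = energy_now_wh * improvement
print(f"Implied prior energy per prompt: ~{energy_before_wh:.1f} Wh")  # ~7.9 Wh
```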
The same pattern shows up in costs, too. The price of inference fell from $20 to just $0.07 per million tokens, according to Stanford HAI’s AI Index 2025. That change reflects not only cheaper hardware but also smarter, less energy-hungry chips that can do more with less. On the surface, it might look like a win for the companies and the planet.
Here’s the catch: as AI gets cheaper and more efficient, its usage grows dramatically. Millions of people now use AI every day for school, work, and entertainment, and rather than reducing overall emissions, the efficiency gains are being wiped out by this massive increase in demand.
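A tiny bit of arithmetic shows why this rebound effect matters so much. The usage-growth factor below is a made-up illustration, not a measured statistic:

```python
# Jevons-style rebound: per-prompt efficiency vs. total usage growth.
# The 33x efficiency gain is from Google's report above; the 50x usage
# growth is a hypothetical figure chosen purely to illustrate the point.

efficiency_gain = 33     # each prompt uses 33x less energy than before
usage_growth = 50        # suppose total prompt volume grows 50x

net_change = usage_growth / efficiency_gain
print(f"Net energy use: {net_change:.2f}x the original")  # ~1.52x - still up!
```

In other words, if usage grows faster than efficiency improves, total emissions keep rising even as each prompt gets “greener.”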
Prompt Complexity – Why Some Questions Burn More Carbon?
When we think about AI emissions, it’s easy to assume that every question has the same cost. But research shows that’s not true at all. Some prompts, especially the complex ones, take a lot more energy to process. A 2025 study published in Frontiers found that reasoning-heavy prompts can release up to 50 times more CO₂ than short, straightforward queries.
For example, a simple fact-checking question or a quick definition uses far less energy than a long, step-by-step reasoning task. The study noted that prompts on subjects like abstract algebra produced 6 times more emissions than something basic like high school history. Similarly, bigger reasoning models, such as Cogito with 70 billion parameters, produced about 3 times more emissions than smaller models.
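As a rough illustration of how those multipliers stack up, here’s a hedged sketch. Treating the factors as independent and multiplicative is my own simplification, not the methodology of the Frontiers study:

```python
# Illustrative only: combine the study's reported multipliers to compare
# a "cheap" query with an "expensive" one. Assuming the factors simply
# multiply is a simplification on my part, not how the study models it.

BASELINE_G_CO2 = 1.0          # hypothetical emissions of a simple query

subject_factor = 6            # abstract algebra vs. high school history
model_factor = 3              # 70B reasoning model vs. a smaller model

expensive = BASELINE_G_CO2 * subject_factor * model_factor
print(f"Hard question, big model: ~{expensive:.0f}x the baseline")  # ~18x
```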
What this tells us is that user behavior matters. The type of questions we ask, and how often we ask them, directly affects the carbon footprint. Nobody is saying we should stop asking complicated questions, but being aware of the hidden cost can make us more mindful about how we interact with AI tools.
The Big Picture – AI’s Carbon Cost vs. Its Climate Promise
On one hand, AI adds millions of tons of CO₂ to the atmosphere; on the other, the same technology is being used to design climate solutions. AI is playing the role of both the problem and the problem-solver.
A 2025 report from the LSE Grantham Institute estimates that AI could help cut 3.2 to 5.4 billion tons of CO₂ per year by 2035.
That’s a huge number, equal to removing hundreds of millions of cars from the road. How? By helping us run smarter energy grids, predicting power demand more accurately, and cutting waste in industries like shipping and transport. AI is also being used to track carbon emissions better, which makes it easier for governments and companies to stay accountable.
Some businesses are already testing this. For example, Sustamize uses AI to improve carbon tracking in supply chains, while Supermicro applies AI to cut energy use in data centers themselves. In other words, the same technology that creates the damage has the potential to balance it out.
Right now, though, emissions are growing faster than solutions. That makes one thing clear: we need to push harder on the positive side if AI is truly going to be part of the climate solution.
Conclusion
AI carbon emissions are no longer just a tech concern. They are a climate concern. From energy-hungry training and daily inference to water use, hardware production, and e-waste, the footprint of AI keeps growing. Google’s emissions have jumped 48% since 2019 and Microsoft’s have risen more than 23% since 2020, while global data center demand is on track to exceed 1,050 TWh by 2026. Even though efficiency per prompt has improved, rising usage cancels out those savings, leaving the net impact still heavy.
At its core, AI is not just “virtual.” It’s deeply physical, tied to electricity, water, and rare materials. And its carbon shadow is something the world can’t afford to ignore.