I've been ignoring a question for months. Not because I don't think it matters, but because I was fairly sure I wouldn't like the answer.
The environmental impact of AI. What does my usage actually cost the planet? Is every prompt I type melting a glacier somewhere?
I should say upfront: I'm not coming at this from a position of not caring. I'm a Green Party agent. We run two EVs. I compost, I recycle, I do the things. The environment isn't something I think about occasionally - it's baked into how I vote, how I live, what I drive, why I'm training as a Forest School leader. Which is exactly why this question has been nagging at me. Because if there's a genuine tension between my green values and my daily AI usage, I need to look at it honestly rather than hoping it goes away.
I've spent the last seven months building my entire workflow around AI, and while I'd thought about the environmental cost in passing, I'd never actually sat down and properly worked out the numbers. The tension between "this tool makes me dramatically more productive" and "this tool is powered by a building that drinks electricity like I drink tea" is real, and I'd been avoiding it.
So I did what any reasonable person would do. I asked the AI to help me calculate the environmental impact of using the AI.
Yes, I see the irony. Move on.
The Calculation
I wanted a number. Not a vague "it's probably fine" or "it's definitely terrible" - an actual, tangible number I could hold up and look at honestly. So I dug into the research, ran my usage through some energy consumption estimates, and tried to work out what my thirty days of fairly heavy AI use actually costs the planet.
The answer: roughly equivalent to driving sixty-six miles.
Sixty-six miles. To put that in terms my daily life actually understands, that's about one and a half school runs: the full morning drop-off round trip, then driving back for pickup but not making it home. A month of heavy AI usage, and I haven't even completed two round trips to school.
For context, US data centres consumed 183 terawatt-hours of electricity in 2024 - over four per cent of the country's total consumption, roughly equivalent to the annual demand of Pakistan.[1] By 2030, that's projected to grow by 133 per cent.[2] A dedicated AI query has been estimated to use up to ten times more electricity than a traditional web search - though that distinction is increasingly academic now that Google has bolted AI responses onto most searches, quietly inflating the energy cost of something billions of people do without thinking about it.[3] The numbers at scale are genuinely eye-watering.
But my personal slice of that? Sixty-six miles. A rounding error on a rounding error.
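For anyone who wants to sanity-check the method rather than take the number on faith, the back-of-envelope version looks roughly like this. Every constant below is an illustrative assumption, not the exact figure I used: published per-query energy estimates vary by an order of magnitude, grid intensity varies by country and hour, and "queries per day" is a guess at what heavy agentic use looks like. Treat the output as a method demonstration, not a measurement.

```python
# Sketch of a back-of-envelope AI carbon calculation.
# All constants are illustrative assumptions; real estimates vary widely.

QUERIES_PER_DAY = 700        # assumed heavy, agentic-workflow usage
DAYS = 30                    # one month
WH_PER_QUERY = 3.0           # assumed watt-hours per AI query
GRID_KG_CO2_PER_KWH = 0.4    # assumed average grid carbon intensity
CAR_KG_CO2_PER_MILE = 0.4    # roughly an average petrol car

energy_kwh = QUERIES_PER_DAY * DAYS * WH_PER_QUERY / 1000
co2_kg = energy_kwh * GRID_KG_CO2_PER_KWH
equivalent_miles = co2_kg / CAR_KG_CO2_PER_MILE

print(f"{energy_kwh:.0f} kWh -> {co2_kg:.1f} kg CO2 -> about {equivalent_miles:.0f} miles driven")
```

With these made-up inputs you land in the low sixties of miles; nudge any one assumption and the answer moves, which is exactly why it's an order-of-magnitude exercise rather than an audit.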
The Bit Where I Nearly Stopped Thinking About It
This is the point where it would be very easy - and very convenient - to say "well, sixty-six miles is nothing, guilt absolved, back to work." And honestly, that was my instinct for about ten minutes.
But that's not actually an argument. "My individual contribution is tiny" is the exact same logic that people use to justify not recycling, not voting, and not switching off lights. The whole point of collective impact is that everyone's individual contribution is tiny. That's what makes it collective.
So I sat with the discomfort for a bit. And I kept digging.
The Boycott Argument
The obvious green answer is: stop using it. If enough people refuse, that sends a signal. Companies respond to demand. Collective abstention is how you force change. That's the argument I keep turning over in my head, because it's the one that lines up most neatly with how I think about every other environmental issue.
And it's not wrong in principle. There's real precedent. Consumer boycotts have shifted corporate behaviour on palm oil, fast fashion, single-use plastics. Collective "we're not having this" has genuine power when the signal is visible, attributable, and the alternative is clear.
The problem is that AI usage doesn't work like a boycott target. Not even slightly.
If a million people quietly stop using Claude or ChatGPT tomorrow, the companies see churn. They attribute it to competition, pricing, product issues. There is no mechanism connecting your abstention to an environmental message. You're not standing outside a shop with a placard. You're just... not logging in. The signal is invisible and unattributable.
More fundamentally, the infrastructure is being built regardless. Microsoft, Google, Meta, and Amazon committed over $350 billion to data centre spending in 2025 alone.[4] That buildout is happening whether you personally use AI today or not. The demand isn't what's driving the carbon problem - the speculative investment in future demand is.
And here's the thing that really bothers me: if I stopped, who would notice? I'd lose the productivity. The data centres would keep running. And the people most likely to voluntarily abstain for environmental reasons - people like me, who actually care about this stuff - are exactly the people who should be advocating loudly for renewable-powered data centres instead. Removing yourself from the conversation doesn't change the conversation.
What I Can Do With Sixty-Six Miles
Let me flip it around. What does that sixty-six miles of carbon actually buy me?
In the last thirty days, I've used AI to ship features across Task Board, TestPlan, and CoSurf that would have taken me weeks longer without it. I've debugged Blazor component lifecycle issues in minutes instead of hours. I've written MCP server integrations, reviewed pull requests, caught architectural problems before they became expensive, and built an entire test suggestion engine for TestPlan that generates coverage recommendations from a codebase analysis.
As a CTO running development teams across multiple products, the productivity gain isn't marginal. It's the difference between shipping and not shipping. Between growing and stagnating. Between competing and falling behind.
Is that worth sixty-six miles of driving? I do more than that in school runs in a week. And those are in an EV, so maybe the comparison isn't entirely fair - but the point stands. We spend carbon on things we value. The question is whether the value justifies the cost.
The System Problem
Here's where I land, and I know not everyone will agree with me.
I believe that the environmental impact of AI is a genuine, serious, real problem. I also believe that it can only be solved at the system level. Not by individuals choosing not to use it, but by the infrastructure being powered differently.
And the good news - if you can call it that - is that this is already happening, driven largely by economics rather than conscience. Renewables are the fastest-growing energy source for data centres. The IEA reports renewable energy production for data centres growing at 22 per cent per year, expected to cover nearly half of additional demand by 2030.[5] Microsoft signed a $10 billion renewable energy deal with Brookfield Asset Management, deploying over 10.5 gigawatts of renewable capacity from 2026 - equivalent to ten nuclear power plants.[6] They also signed the world's first fusion energy purchase agreement. Google acquired renewable energy developer Intersect Power for $4.75 billion, gaining multiple gigawatts of projects co-located with data centres.[7]
The same AI query run on a renewables-powered grid versus a coal-powered grid is a completely different environmental proposition. The key metric isn't total energy consumption - it's carbon intensity per kilowatt-hour. And that number is moving in the right direction, fast.
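A toy comparison makes the point. The grid intensity values below are rough, illustrative figures in kg CO2 per kWh (not measurements from any specific grid or provider), and the monthly energy figure is an assumed round number in the same ballpark as heavy personal usage:

```python
# Same energy, different grid: why carbon intensity per kWh matters more
# than raw consumption. All values are illustrative assumptions.

MONTHLY_ENERGY_KWH = 63.0  # assumed monthly AI energy use

GRID_INTENSITY = {          # assumed kg CO2 per kWh
    "coal-heavy grid": 0.9,
    "gas-heavy grid": 0.45,
    "renewables-heavy grid": 0.05,
}

for grid, kg_per_kwh in GRID_INTENSITY.items():
    print(f"{grid}: {MONTHLY_ENERGY_KWH * kg_per_kwh:.1f} kg CO2")
```

With these assumed numbers, identical usage is roughly eighteen times dirtier on the coal-heavy grid than the renewables-heavy one - same queries, same kilowatt-hours, wildly different footprint.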
But - and this is the honest bit - it's not solved. The IEA forecasts that by 2030, approximately 40 per cent of additional data centre energy will still come from gas and coal.[8] Around 60 per cent of data centre energy today comes from fossil fuels.[9] The transition is happening, but it's not happening fast enough to make the problem go away on its own.
The Stronger Green Position
What I genuinely believe is that the most effective environmental position isn't "don't use AI." It's "use AI, and be loud about how it should be powered."
Demand that Anthropic, OpenAI, and Google publish real-time carbon intensity data per query. Support regulation requiring data centres to meet renewable energy thresholds. Push back when companies make vague "net zero by 2030" pledges with no specifics behind them. Choose providers who are genuinely co-locating with renewables over those burning gas.
That's targeted, visible, attributable pressure. It's the opposite of quietly not logging in and hoping someone notices.
If nobody pushes back in any way and usage grows unchecked with zero accountability, the industry has no incentive to prioritise clean infrastructure over fast infrastructure. The question is whether abstention is the right form of pushback. And for a diffuse, invisible, unattributable action like not opening an app, it almost certainly isn't.
The Uncomfortable Honesty
I'm not going to pretend this is a clean conclusion. It isn't.
There's a version of this argument where I'm just rationalising my own usage because it's convenient. I know that. The "it's a system problem, not my problem" framing is exactly what oil companies spent decades promoting to shift responsibility from producers to consumers, and then from consumers back to nobody in particular. I'm aware of the parallel.
But I also think there's a meaningful difference between "don't drive your car" and "demand that car manufacturers build electric vehicles." One of those strategies has actually worked. The other gave us thirty years of individual guilt and no systemic change.
My sixty-six miles of AI carbon is real. The 183 terawatt-hours of data centre consumption is real. The projected growth is real. But so is the renewable transition, the economic pressure driving it, and the fact that vocal, organised, policy-level demand for clean infrastructure is a more powerful lever than a million people quietly closing their laptops.
Use the tool. Demand better infrastructure. And stop pretending that individual guilt is a substitute for systemic accountability.
I'm not fully at peace with this, if I'm honest. Part of me still feels the pull of "just stop using it" - it's simpler, it's cleaner, and it doesn't require me to trust that corporations will do the right thing, which historically is not a great bet. But I keep coming back to the same conclusion: the goal is less carbon, cleaner infrastructure, a liveable planet. And I think I do more for that goal with a keyboard and a voice than with a closed laptop and a clear conscience.
Now if you'll excuse me, I've got a school run to do. Sixty-six miles of guilt, give or take.
References
1. IEA - Energy and AI: Energy Demand from AI (April 2025)
2. Pew Research Center - What We Know About Energy Use at US Data Centers (October 2025)
3. Epoch AI - How Much Energy Does ChatGPT Use?; Google Cloud - Measuring the Environmental Impact of AI Inference (August 2025)
4. IEEE ComSoc - AI Spending Boom: Big Tech to Invest $400 Billion in 2025
5. IEA - Energy and AI: Energy Supply for AI (April 2025)
6. CNBC - Microsoft and Brookfield to Develop More Than 10.5 GW of Renewable Energy (May 2024)
7. Energy Storage News - Google Completes $4.75 Billion Intersect Power Acquisition
8. IEA - Energy and AI: Energy Supply for AI (April 2025)
9. Carbon Brief - AI: Five Charts That Put Data Centre Energy Use Into Context