Sustainable AI: A Beginner's Guide

AI's energy footprint is growing fast. Data centers could account for 12% of US electricity by 2028, yet most people using AI daily have no idea what their usage actually costs, because the industry has made that information nearly impossible to find. This post breaks down what sustainable AI really means, why the gap between the most and least efficient options is bigger than you'd expect, and why a few deliberate choices about the tools you use can actually matter.

Sustainability

Mar 2, 2026

What Is Sustainable AI, and Why Should You Care?

A single ChatGPT query uses roughly five times the electricity of a standard web search. That number is easy to read and easy to forget, until you multiply it by the billions of queries happening every day, and then by the pace at which AI adoption is growing.

Data centers already account for around 4.4% of the United States' total electricity consumption. By 2028, that figure could reach 12%. Globally, some projections put AI's share of total electricity demand at up to 21% by 2030. These aren't fringe estimates; they come from institutions like Brookings and Rutgers tracking the infrastructure buildout in real time.

AI is useful enough that most people will keep using it regardless. The more interesting question is whether we can be smarter about how we use it.


The Problem Is Bigger Than Your Electricity Bill

When people think about AI's environmental footprint, they usually think about electricity. That's part of it, but only part. Training a large AI model requires not just massive compute, but also significant water for cooling: approximately 2 liters per kilowatt-hour of energy consumed, according to MIT researchers. GPU production draws on rare-earth minerals. Infrastructure expansion strains power grids that are already under pressure.

And the electricity itself isn't all created equal. The same model, running the same workload, can have a carbon footprint that varies by a factor of 30 depending on where in the world it's running and what energy sources power the grid there. A query processed in a data center running on solar and wind has a fundamentally different impact than one processed on a grid heavy with natural gas or coal, even if the bill looks identical.
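To make that factor-of-30 claim concrete, here is a minimal sketch of the arithmetic: emissions are simply energy used multiplied by the carbon intensity of the grid it came from. The intensity figures below are illustrative placeholders, not measurements of any real grid.

```python
# Illustrative grid carbon intensities in grams of CO2e per kWh.
# These are hypothetical round numbers for the sake of the arithmetic.
GRID_INTENSITY_G_PER_KWH = {
    "hydro/solar-heavy grid": 30,
    "natural-gas-heavy grid": 450,
    "coal-heavy grid": 900,
}

def emissions_grams(energy_kwh: float, grid: str) -> float:
    """CO2e in grams for a workload consuming energy_kwh on a given grid."""
    return energy_kwh * GRID_INTENSITY_G_PER_KWH[grid]

# The exact same 0.5 kWh workload, three very different footprints:
for grid in GRID_INTENSITY_G_PER_KWH:
    print(f"{grid}: {emissions_grams(0.5, grid):.0f} g CO2e")
```

With these illustrative numbers, the cleanest grid beats the dirtiest by exactly 900 / 30 = 30x, which is the kind of spread the research describes.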


What Sustainable AI Actually Means

Sustainable AI is the practice of developing and deploying AI systems in ways that minimize environmental harm without abandoning the utility that makes AI worth using in the first place. That's a different goal from simply using less AI; it's about using it more thoughtfully.

In practice, this breaks down into a few dimensions that are easy to think about separately.

  • Energy efficiency is the most obvious one. Not every task needs a frontier model. MIT researchers studying generative AI energy use found that large training clusters consume seven or eight times more energy than a typical computing workload. Models that use extended reasoning can use 30 to 70 times more energy than a standard model for the same apparent output. Routing routine tasks to smaller, lighter models, or running them locally on efficient hardware, can make a dramatic difference with little or no loss in output quality for most everyday use cases.

  • Where the energy comes from matters just as much as how much of it is used. Cloud providers running on verified renewable energy have genuinely lower carbon intensity than those relying on fossil fuel-heavy grids. The gap between the best and worst options here is enormous.

  • Transparency is the piece the industry has been slowest to embrace. If you don't know what your AI usage actually costs in terms of energy and carbon, you can't make informed decisions about it. Most major providers offer no emissions data whatsoever. You get a bill in tokens and dollars. The environmental costs are externalized and invisible.
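The routing idea in the first bullet can be sketched in a few lines of code. Everything here is hypothetical: the model tiers, the per-token energy factors, and the complexity labels are illustrative placeholders, not measured values from any provider.

```python
# Hypothetical per-token energy cost, in joules, for three model tiers.
# The 30x gap between "cloud-standard" and "cloud-reasoning" mirrors the
# low end of the 30-70x range cited for extended-reasoning models.
ENERGY_PER_TOKEN_J = {
    "local-small": 0.05,      # small model on efficient local hardware
    "cloud-standard": 0.5,    # standard cloud model
    "cloud-reasoning": 15.0,  # extended-reasoning cloud model
}

def route(task_complexity: str) -> str:
    """Pick the lightest model tier that plausibly handles the task."""
    return {
        "routine": "local-small",      # drafting, summarizing, Q&A
        "moderate": "cloud-standard",  # harder synthesis
        "hard": "cloud-reasoning",     # genuine multi-step reasoning
    }[task_complexity]

def task_energy_joules(task_complexity: str, tokens: int) -> float:
    """Total energy for a task under this (hypothetical) routing policy."""
    return tokens * ENERGY_PER_TOKEN_J[route(task_complexity)]
```

Under these made-up numbers, a 1,000-token routine task handled locally costs 50 J, while sending the same task to a reasoning model would cost 15,000 J. The point isn't the specific figures; it's that a default-to-the-lightest-tier policy changes the total by orders of magnitude.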


What This Looks Like in Practice

Sustainable AI isn't an abstract policy goal; it's a set of concrete decisions about which tools to use and how to use them.

Running a model locally on modern hardware is a good example. Apple Silicon chips, which power current Macs, are remarkably efficient for AI inference — up to 20 times more energy-efficient per token than routing the same request through a large cloud provider. For everyday tasks like drafting text, summarizing documents, or answering questions, a capable local model often produces comparable output to a cloud model at a fraction of the energy cost.

When cloud processing is necessary, the choice of provider matters. Some cloud AI services are beginning to publish real energy and emissions data alongside their API responses. GreenPT, for instance, returns actual energy metrics with every request, which is the kind of transparency that makes it possible to track and reduce your footprint meaningfully. Others offer no such data, making informed comparison nearly impossible.

This is where Weave comes in. Sustainability tracking is built into the application from the ground up, not bolted on as an afterthought. For local inference, Weave reads actual hardware energy counters from your device: real measured joules, converted to kilowatt-hours, multiplied by the carbon intensity of your local electricity grid. No estimates or approximations. For cloud requests where real data isn't available, Weave uses token counts and model-specific energy factors calibrated against published research to produce the best available estimate, and publishes the full methodology behind those calculations. The goal is to make the invisible visible, and to surface better options when they exist.
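The two accounting paths described above can be sketched roughly like this. The function names and the token energy factor are hypothetical placeholders for illustration, not Weave's actual implementation.

```python
J_PER_KWH = 3_600_000  # 1 kWh = 3.6 million joules

def measured_footprint_g(energy_joules: float, grid_g_per_kwh: float) -> float:
    """Local path: measured joules from hardware energy counters,
    converted to kWh, multiplied by local grid carbon intensity."""
    return (energy_joules / J_PER_KWH) * grid_g_per_kwh

def estimated_footprint_g(tokens: int, kwh_per_token: float,
                          grid_g_per_kwh: float) -> float:
    """Cloud path: token count times a model-specific energy factor
    (calibrated against published research), then grid intensity.
    An estimate, not a measurement."""
    return tokens * kwh_per_token * grid_g_per_kwh
```

The structural point is the distinction itself: the local path starts from real measured joules, while the cloud path is an estimate whose quality depends entirely on how good the per-token energy factor is, which is why publishing the methodology matters.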


The Bigger Picture

AI's energy footprint is not a reason to stop using AI. It's a reason to use it with more awareness than most people currently do.

The good news is that the gap between the most and least sustainable options is large enough that small shifts in behavior (choosing a smaller model for routine tasks, running inference locally when possible, using providers that run on clean energy) can have a meaningful impact. The challenge is that making those choices requires information that the industry has not historically been eager to provide.

That's changing, slowly. And in the meantime, the most useful thing anyone can do is start paying attention to which tools they're defaulting to, where those tools are running, and what they actually cost beyond the subscription fee.

The energy consumed by AI is real. So is the opportunity to do something about it.