AI Can Be a Climate Ally – If We Build It That Way


AI can illuminate our way forward across tech and other industries. [Photo credit: BlackJack3D/ iStock]

Last year, people typed 365 billion prompts into ChatGPT. Google took more than a decade to reach that level of usage. It's a staggering adoption rate, and it raises an obvious question: isn't all this use draining our energy resources? But here's the thing: AI's environmental impact isn't predetermined. It's engineered. And depending on how we engineer it, AI can either accelerate climate solutions or strain our systems further.

History is full of stories about transformative technologies. Consider vaccines: they required vast R&D investments but revolutionized public health. Synthetic fertilizers use immense amounts of energy yet support nearly half the world’s food supply. The key to progress lies in how we scale wisely and design the underlying system.  

Artificial intelligence and data center capacity. [Photo credit: Overearth/ iStock]

With AI, a common concern is that tools like ChatGPT devour energy and drain water supplies, making them an environmental indulgence. The concern is legitimate in aggregate: in 2023, AI data centers consumed roughly as much energy as entire countries like Germany or France. But placed in context, the environmental impact of individual AI use can be modest compared to its transformational benefits, provided it is designed with care and applied to solving climate change.

One estimate suggests that generating a single ChatGPT-4o response consumes roughly 0.3 watt-hours, about the energy it takes to power a lightbulb for five minutes. To match the energy required to run a standard air conditioner for an hour, you'd need to enter around 10,000 prompts. A flight from New York to London? In emissions terms, that's the equivalent of hundreds of thousands of prompts. Even AI's water use, which has drawn scrutiny, is small next to everyday consumption: you'd need to interact with ChatGPT tens of thousands of times to use as much water as it takes to manufacture a single pair of jeans. That said, the aggregate environmental impact can become significant if AI is not scaled sustainably and equitably.
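The air-conditioner comparison can be checked with simple arithmetic. A minimal sketch, assuming round numbers: 0.3 Wh per prompt (the estimate cited above) and a 3 kW draw for a standard air conditioner (the wattage is our illustrative assumption, not a figure from this article):

```python
# Back-of-envelope check of the "10,000 prompts ≈ one AC-hour" comparison.
# Assumptions (illustrative round numbers):
#   - one ChatGPT prompt ≈ 0.3 Wh (the estimate cited above)
#   - a standard air conditioner draws ≈ 3,000 W (our assumption)

PROMPT_WH = 0.3        # energy per prompt, in watt-hours
AC_WATTS = 3_000       # assumed air-conditioner power draw, in watts

ac_hour_wh = AC_WATTS * 1                    # one hour of AC use, in Wh
prompts_per_ac_hour = ac_hour_wh / PROMPT_WH # prompts with the same energy

print(f"Prompts per AC-hour: {prompts_per_ac_hour:,.0f}")
# → Prompts per AC-hour: 10,000
```

Under those assumptions the numbers line up: 3,000 Wh divided by 0.3 Wh per prompt gives exactly 10,000 prompts.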

Ironically, the same technology that's raising environmental concerns could become a critical part of the solution. At the Bezos Earth Fund, we see AI not just as a consumer of energy but as a potential catalyst for cutting emissions across nearly every sector, especially where progress has been stubbornly slow. AI has the power to optimize our most energy-intensive systems: by 2035, it could reduce energy use in light industry by 8%. In transportation, AI systems can cut energy consumption by 20%. And in scientific discovery, AI can help identify new materials, effectively doubling research efficiency.

Optimizing solutions by utilizing AI. [Photo credit: NooMUboN/ iStock]

We need to stop thinking of AI as a rogue actor and start recognizing it as part of the broader climate and energy system—one that can help accelerate progress if we integrate it wisely. That means treating AI’s energy footprint not as an unavoidable cost, but as a design challenge for the entire system—something we can manage and optimize for both climate goals and innovation.  

The Bezos Earth Fund is investing in that future: backing catalytic technologies and new inventions that address climate challenges, and supporting the building of systems needed to scale AI sustainably.

This starts with systemic upgrades. First, wherever possible, data center infrastructure must be powered with renewables instead of fossil fuels, and data centers should be designed and sited with care so they expand digital capacity without exacerbating energy scarcity, raising costs, or deepening inequities in vulnerable communities. Second, we need better data transparency: today we lack consistent data on the energy sources and usage patterns of AI models, a gap that blocks smart policy and infrastructure design. Third, dramatic improvements in the energy efficiency of computing (computer chips have become one hundred times more energy efficient since 2008) must continue apace. Finally, we need better practices for integrating AI into the energy system. Just as electric vehicles can charge during off-peak hours, AI workloads can shift in response to grid signals. And waste heat from data centers can be reused for practical purposes, such as heating nearby buildings, a practice already underway in parts of Europe and gaining traction in the U.S.
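The idea of shifting AI workloads in response to grid signals can be sketched in a few lines. This is a simplified illustration, not any operator's actual scheduler: we assume a 24-hour carbon-intensity forecast (gCO2/kWh per hour, hypothetical values here) and pick the cleanest window for a deferrable training job:

```python
# Minimal sketch of grid-aware workload shifting (illustrative only).
# Assumption: an hourly carbon-intensity forecast is available from
# some external source; the values below are made up for the example.

def pick_greenest_window(forecast, job_hours):
    """Return the start hour of the consecutive job_hours window
    with the lowest average carbon intensity."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(forecast) - job_hours + 1):
        avg = sum(forecast[start:start + job_hours]) / job_hours
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start

# Hypothetical 24-hour forecast in gCO2/kWh: cleaner grid overnight,
# a dip around midday solar, a dirty evening peak.
forecast = [300, 280, 250, 240, 260, 300, 350, 400,
            380, 320, 250, 180, 150, 160, 200, 280,
            360, 420, 450, 430, 400, 370, 340, 310]

start = pick_greenest_window(forecast, job_hours=4)
print(f"Schedule the 4-hour training job at hour {start}")
# → Schedule the 4-hour training job at hour 11
```

Real deployments would pull live signals from grid operators and weigh deadlines and capacity, but the core move is the same: run flexible compute when the grid is cleanest.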

Leveraging AI for the world we want to live in. [Photo credit: NanoStockk/ iStock]

None of these changes requires us to stop or slow our use of AI. They simply require us to rethink the infrastructure beneath it and build smarter.

We’ve done this before. In the 2010s, widespread adoption of LED lighting helped flatten electricity demand in the U.S. despite more devices and population growth. With the right approach, we can do the same for AI. 

AI is not an existential threat to our planet—if we scale it sustainably. That’s why the most important question isn’t, “Should we be using AI?” It’s, “Are we building it for the world we want to live in?” 
