Artificial intelligence (AI) is already consuming as much energy as a small nation, and this is just the beginning. The electricity needed to power data centers is projected to double by 2026. But you can play a role in curbing that growth.

Brian Calvert, an environmental journalist based in Pacifica, California, focuses on environmental ethics and climate change. He is an associate editor of the quarterly environmental magazine Earth Island Journal and a former editor-in-chief of High Country News. He has also received the Ted Scripps Fellowship at CU Boulder’s Center for Environmental Journalism.

In January, the International Energy Agency (IEA) released its forecast for global energy use for the next two years. For the first time, it included projections for electricity consumption related to data centers, cryptocurrency, and artificial intelligence.

According to the IEA, these uses accounted for nearly 2 percent of global energy demand in 2022. This demand could double by 2026, equating to the total electricity consumption of Japan.

In our digital age, many processes that govern our lives are hidden within computer code. Machines monitor us behind the scenes, billing us when we cross toll bridges, guiding us across the internet, and delivering music we didn’t even know we wanted. All of this requires materials to build and operate — plastics, metals, wiring, water — and all of these come with costs. These costs necessitate trade-offs.

Energy is the most significant of these trade-offs. As the world warms towards increasingly dangerous temperatures, we need to conserve as much energy as possible to reduce the amount of climate-heating gases we emit.

This is why the IEA’s figures are so crucial, and why we need to demand more transparency and greener AI in the future. It’s also why we need to be mindful consumers of new technologies, understanding that every piece of data we use, save, or generate has a tangible cost.

One of the fastest-growing areas of energy demand is a form of machine learning known as generative AI. It requires a lot of energy for training and producing responses to queries. Training a large language model like OpenAI’s GPT-3, for example, uses nearly 1,300 megawatt-hours (MWh) of electricity, equivalent to the annual consumption of about 130 US homes. The IEA states that a single Google search takes 0.3 watt-hours of electricity, while a ChatGPT request takes 2.9 watt-hours. If ChatGPT were integrated into the 9 billion searches performed each day, the electricity demand would increase by 10 terawatt-hours a year — the amount consumed by about 1.5 million European Union residents.
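
Those figures can be sanity-checked with quick arithmetic. The Python sketch below uses the per-query and training numbers quoted above; the US-household average is a rough assumption, so treat the output as order-of-magnitude only.

```python
# Back-of-envelope check of the figures above. The per-query and training
# numbers are the IEA/article figures; the household average is a rough
# assumption (a US home uses roughly 10 MWh of electricity per year).
google_search_wh = 0.3     # IEA estimate: energy per Google search
chatgpt_query_wh = 2.9     # IEA estimate: energy per ChatGPT request
searches_per_day = 9e9     # daily searches in the IEA scenario

extra_wh_per_day = (chatgpt_query_wh - google_search_wh) * searches_per_day
extra_twh_per_year = extra_wh_per_day * 365 / 1e12
print(f"Added demand: ~{extra_twh_per_year:.1f} TWh per year")  # ~8.5, same order as the ~10 TWh cited

gpt3_training_mwh = 1300   # ~1,300 MWh to train GPT-3
us_home_mwh_per_year = 10  # assumed average annual US household consumption
print(f"GPT-3 training ~ {gpt3_training_mwh / us_home_mwh_per_year:.0f} US homes for a year")
```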

Brian Calvert recently had a conversation with Sasha Luccioni, the lead climate researcher at an AI company called Hugging Face. This company provides an open-source online platform for the machine learning community that supports the collaborative, ethical use of artificial intelligence. Luccioni has been researching AI for over a decade, and she understands how data storage and machine learning contribute to climate change and energy consumption — and how they are set to contribute even more in the future.

She was asked what we can do to be better consumers of this energy-hungry technology. Here’s a condensed version of their conversation:

Brian Calvert

AI seems to be omnipresent. I’ve attended meetings where people humorously suggest that our machine overlords might be eavesdropping. What exactly is artificial intelligence? Why is it garnering so much attention? And why should we be concerned about it now, rather than in some far-off future?

Sasha Luccioni

Artificial intelligence has been a field since the 1950s, and it has experienced various “AI winters” and “AI summers.” Whenever a new technique or approach is developed, there’s a lot of excitement, which inevitably leads to disappointment, triggering an AI winter.

We’re currently experiencing a bit of an AI summer with generative AI. We should definitely remain critical and consider whether we should be using AI, or specifically generative AI, in applications where it wasn’t used before.

Brian Calvert

What do we know about the energy costs of this hot AI summer?

Sasha Luccioni

It’s challenging to quantify. With an appliance, you plug it into your socket, and you know what energy grid it’s using and roughly how much energy it’s consuming. But with AI, it’s distributed. When you’re doing a Google Maps query, or you’re talking to ChatGPT, you don’t really know where the process is running. And there’s really no transparency with regard to AI deployment.

From my research, I’ve found that switching from a non-generative, traditional AI approach to a generative one can use 30 to 40 times more energy for the exact same task. So, it’s adding up, and we’re definitely seeing the big-picture repercussions.
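
Comparisons like this can be reproduced with open tooling. Below is a minimal sketch using the open-source codecarbon library to meter a block of work; the two workloads are hypothetical stand-ins for a traditional model and a generative one, so the printed ratio is illustrative rather than a measurement of real models.

```python
# Minimal sketch of a per-task energy comparison using the open-source
# codecarbon library. The two workloads below are hypothetical stand-ins;
# in a real study you would wrap calls to an extractive model and a
# generative model performing the same task.
from codecarbon import EmissionsTracker

def measure(label, workload, n_queries=100):
    tracker = EmissionsTracker(project_name=label, log_level="error")
    tracker.start()
    for _ in range(n_queries):
        workload()
    kg_co2eq = tracker.stop()  # stop() returns estimated emissions in kg CO2eq
    print(f"{label}: ~{kg_co2eq:.8f} kg CO2eq for {n_queries} queries")
    return kg_co2eq

# Hypothetical stand-in workloads; replace with real model inference calls.
classic = measure("traditional-model", lambda: sum(range(100_000)))
generative = measure("generative-model", lambda: sum(range(4_000_000)))
if classic > 0:
    print(f"Energy ratio (generative / traditional): ~{generative / classic:.0f}x")
```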

Brian Calvert

So, in material terms, we’ve got a lot of data, we’re storing a lot of data, we’ve got language models, we’ve got models that need to learn, and that takes energy and chips. What kind of things need to be built to support all this, and what are the real-world environmental impacts this adds to our society?

Sasha Luccioni

Static data storage [like thumb drives] doesn’t, relatively speaking, consume that much energy. But the thing is that nowadays, we’re storing more and more data. You can search your Google Drive at any moment. So, connected storage — storage that’s connected to the internet — does consume more energy, compared to nonconnected storage.

Training AI models consumes energy. Essentially, you’re taking whatever data you want to train your model on and running it through your model thousands of times. It’s going to be something like a thousand chips running for a thousand hours. Every generation of GPUs — the specialized chips for training AI models — tends to consume more energy than the previous generation.

They’re more powerful, but they’re also more energy intensive. And people are using more and more of them because they want to train bigger and bigger AI models. It’s kind of this vicious circle. When you deploy AI models, you have to have them always on. ChatGPT is never off.
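
For a sense of scale, here is a rough estimate of Luccioni’s “a thousand chips running for a thousand hours”; the per-GPU power draw and the data-center overhead factor are assumed values, not measurements.

```python
# Rough scale estimate for "a thousand chips running for a thousand hours."
# The per-GPU draw and the overhead factor are assumptions: modern training
# GPUs draw on the order of 300-700 W under load.
n_gpus = 1000
hours = 1000
gpu_kw = 0.5  # assumed average power draw per GPU, in kilowatts
pue = 1.2     # assumed power usage effectiveness (cooling and other overhead)

energy_mwh = n_gpus * hours * gpu_kw * pue / 1000
print(f"One training run: ~{energy_mwh:,.0f} MWh")  # ~600 MWh at these assumptions
```

At these assumed values, a single run lands in the same order of magnitude as the roughly 1,300 MWh cited earlier for training GPT-3.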

Brian Calvert

Then, of course, there’s also a cooling process. We’ve all felt our phones heat up, or had to move off the couch with our laptops — which are never truly on our laps for long. Servers at data centers also heat up. Can you explain a little bit how they are cooled down?

Sasha Luccioni

With a GPU, or with any kind of data center, the more intensely it runs, the more heat it’s going to emit. And so in order to cool those data centers down, there are different kinds of techniques. Sometimes it’s air cooling, but mostly it’s essentially circulating water. And so as these data centers get more and more dense, they also need more cooling, and so that uses more and more water.

Brian Calvert

We have an AI summer, and we have some excitement and some hype. But we also have the possibility of things scaling up quite a bit. How might AI data centers be different from the data centers that we already live with? What challenges will that present from an ecological or environmental perspective going forward?

Sasha Luccioni

Data centers need a lot of energy to run, especially the hyperscale ones that AI tends to run on. And they need to have reliable sources of energy.

So, often they’re built in places where you have nonrenewable energy sources, like natural gas-generated energy or coal-generated energy, where you flip a switch and the energy is there. It’s harder to do that with solar or wind, because there are often weather factors and things like that. And so what we’ve seen is that the big data centers are built in places where the grid is relatively carbon intensive.

Brian Calvert

What kinds of practices and policies should we be considering to either slow AI down or green it up?

Sasha Luccioni

I think that we should be providing information so that people can make choices, at a minimum. Eventually being able to choose a model, for example, that is more energy efficient, if that’s something that people care about, or that was trained on noncopyrighted data. Something I’m working on now is kind of an Energy Star rating for AI models. Maybe some people don’t care, but other people will choose a more efficient model.
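
To make the idea concrete, here is a hypothetical sketch of what such a rating could look like: models are binned by energy per 1,000 queries, and every name, number, and threshold is invented for illustration.

```python
# Hypothetical sketch of an "Energy Star"-style rating for AI models,
# binning them by measured energy per 1,000 queries. All names, numbers,
# and thresholds below are invented for illustration.
models_wh_per_1k_queries = {
    "model-a": 120.0,
    "model-b": 480.0,
    "model-c": 2900.0,
}

def stars(wh_per_1k):
    # Fewer watt-hours per 1,000 queries earns more stars (arbitrary cutoffs).
    if wh_per_1k < 200:
        return "*****"
    if wh_per_1k < 1000:
        return "***"
    return "*"

for name, wh in sorted(models_wh_per_1k_queries.items(), key=lambda kv: kv[1]):
    print(f"{name}: {wh:7.1f} Wh per 1k queries  {stars(wh)}")
```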

Brian Calvert

What should I think about before upgrading my data plan? Or why should I hold off on asking AI to solve my kid’s math homework? What should any of us consider before getting more gadgetry or getting more involved with a learned machine?

Sasha Luccioni

In France, they have this term, “digital sobriety.” Digital sobriety could be part of the actions that people can take as 21st-century consumers and users of this technology. I’m definitely not against having a smartphone or using AI, but asking yourself, “Do I need this new gadget?” “Do I really need to use ChatGPT for generating recipes?” “Do I need to be able to talk to my fridge or can I just, you know, open the door and look inside?” Things like that, right? If it ain’t broke, don’t fix it with generative AI.

9 COMMENTS

  1. AI is a game-changer, no doubt, but this energy consumption is scary. We’re talking about a whole nation’s worth of electricity by 2026? Hmm. Can’t progress come without trashing the planet?

  2. Love the “Energy Star rating” idea for AI models! Imagine comparing search engines based on how eco-friendly they are. Transparency is key here. Let consumers choose the AI that fits their values, not just their needs.

  3. This “digital sobriety” movement sounds a bit extreme. Are we supposed to chuck our phones and live in caves now? AI can be a force for good, like helping with climate change research. We just gotta find a balance.

  4. I’m all for mindful AI use. Do I really need a recipe from a fancy language model when there are cookbooks collecting dust on my shelf? A little digital detox and some critical thinking can go a long way.

  5. The focus on data center cooling is interesting. Energy use is bad enough, but what about security vulnerabilities? These massive AI data centers could be a hacker’s dream, and a data breach could have a domino effect on energy grids and other critical infrastructure.

  6. These big tech companies pushing cloud-based AI are gonna be swimming in cash while the environment suffers. They’ll be happy to sell us their “green AI” solutions later, for a hefty premium, of course. This whole thing feels like a giant cash grab disguised as environmental concern.

  7. Let’s not get bogged down in fear-mongering. AI is here to stay, and it has the potential to solve some of our biggest problems. We can address the energy consumption issue through innovation and responsible development. Focus on the solutions, not just the challenges.

  8. The article mentions data center reliance on non-renewable energy sources. What about decentralized AI solutions running on the blockchain? Could that be a more sustainable approach? Spreading the processing power out seems more eco-friendly than giant data centers.

  9. This whole conversation is missing the point. AI is becoming a self-fulfilling prophecy. The more energy we pour into it, the more it tells us we need even more for “climate change research” or some other worthy-sounding cause. It’s a bottomless pit, and we’re the frogs slowly getting boiled.