December 23, 2024

AI is becoming a bigger and bigger problem for the climate. Can “digital sobriety” help?

The world is on the cusp of an AI revolution. But beneath the excitement of these innovations lies a significant and growing problem: energy consumption.

A recent report from the International Energy Agency reveals that electricity consumed by AI and cryptocurrency operations now accounts for two percent of global electricity production — and it’s about to get much worse. Over the next three years, the energy required to power AI data centers is projected to keep climbing steeply, to the point where AI’s electricity consumption will rival that of a medium-sized country.

It’s not exactly clear where this power is supposed to come from, either.

No country currently has the infrastructure in place to meet the escalating energy demands of AI with clean energy alone. This gap between demand and sustainable supply is pushing many energy companies, particularly in the United States, toward drastic measures. Several utility providers are now considering extending the operational lifespan of coal-based power plants. Naturally, this will exacerbate the already dire climate crisis.

But there’s more.

Image credits: cottonbro studio/Pexels.

AI is coming, and it’s hungry

AI has already become a ubiquitous part of daily life, with platforms like ChatGPT, Midjourney, and Microsoft’s Copilot serving millions of users. In fact, ChatGPT alone reached 200 million weekly users within two years of its release. These tools offer unprecedented access to information and expression, but this all comes at a cost.

The negative impact of AI’s rising power demands is already evident. Two of the world’s biggest AI players, Google and Microsoft, recently reported that their emissions have risen by 48 and 29 percent respectively over the past several years, driven by the growing computing power needed to support AI operations.

“There is no basis to believe AI’s presence will reduce energy use; all the evidence indicates it will massively increase energy use due to all the new data centers. We know there will be small gains in efficiency in data centers, but the simple math is that carbon emissions will go up,” the co-chair of the Climate Disinformation Coalition at Friends of the Earth told The Guardian.

At this point, however, shutting down AI data centers and getting people to stop using programs like ChatGPT is simply not an option.

Artificial intelligence seems inevitable. It has the potential to make our lives better, increase human productivity, and bring significant advancements in sectors like education and healthcare. However, at the same time, it’s equally important to monitor AI’s impact on the environment. We must implement strategies and innovations to control and reduce AI-driven emissions, and ensure that companies and people use AI responsibly. 


“It is indeed crucial to explain to people what generative AI can and cannot do, and at what cost,” Sasha Luccioni, an AI and climate researcher at Hugging Face, told France24.

We should be aware that every time we ask ChatGPT something, it has an impact on the planet.

The hidden cost of your AI queries

As noted above, ChatGPT reached 200 million weekly users within two years of its release, and many people spend hours interacting with the chatbot, asking it anything and everything that comes to mind.

However, most people don’t realize that each answer comes at a cost. When you ask an AI chatbot a question, the data center powering it consumes roughly 10 times more electricity than a conventional search on Google or another search engine.

“One query to ChatGPT uses approximately as much electricity as could light one light bulb for about 20 minutes. So, you can imagine with millions of people using something like that every day, that adds up to a really large amount of electricity,” Jesse Dodge, a research scientist at the Allen Institute for AI, told NPR.
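To put that in perspective, here is a rough back-of-envelope calculation in Python. The bulb wattage, the resulting per-query energy, and the daily query volume are illustrative assumptions, not official figures:

```python
# Back-of-envelope estimate of daily electricity use from chatbot queries.
# All inputs below are illustrative assumptions, not measured values.

BULB_WATTS = 10        # a typical LED bulb (assumption)
MINUTES_LIT = 20       # "one light bulb for about 20 minutes"
WH_PER_QUERY = BULB_WATTS * MINUTES_LIT / 60   # ~3.3 Wh per query

QUERIES_PER_DAY = 10_000_000   # hypothetical daily query volume

daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1_000_000
print(f"~{WH_PER_QUERY:.1f} Wh per query -> ~{daily_mwh:.0f} MWh per day")
# ~3.3 Wh per query -> ~33 MWh per day,
# on the order of a thousand US households' daily electricity use
```

Even with conservative inputs, the total climbs into the tens of megawatt-hours per day once usage reaches the millions.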

Similarly, when you use Midjourney to create an AI-generated image, the servers powering the program consume roughly enough electricity to charge your phone from 0 to 100 percent.

The environmental toll doesn’t stop there. AI data centers require vast amounts of water to keep servers cool. For example, generating a simple 100-word message on a platform like GPT-4 consumes over half a liter of water. Multiply that by millions of queries, and the numbers become staggering. One estimate suggests that if just 10 percent of U.S. professionals wrote a 100-word message weekly using AI, it would require over 430 million liters of water annually. That’s equivalent to the daily household water demand of an entire state like Rhode Island.
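That annual figure is easy to sanity-check. The short sketch below reproduces the arithmetic; the 16 million users (roughly 10 percent of U.S. professionals) and the per-message water figure are assumptions taken from the scenario above:

```python
# Rough reproduction of the "over 430 million liters per year" estimate.
# Inputs are assumptions drawn from the scenario described above.

users = 16_000_000          # ~10% of U.S. professionals (assumption)
liters_per_message = 0.52   # ~half a liter per 100-word GPT-4 message
messages_per_week = 1
weeks_per_year = 52

annual_liters = users * liters_per_message * messages_per_week * weeks_per_year
print(f"~{annual_liters / 1_000_000:.0f} million liters per year")
# ~433 million liters per year
```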

Also, ChatGPT is just one of many AI chatbots. Google’s Gemini, Meta’s Llama, and Microsoft’s Bing AI also boast millions of active users who ask billions of questions monthly. Their data centers likewise consume large amounts of electricity and water. For instance, the annual water consumption of Llama 3 is likely to be around 22 million liters.

The drastic increase in water and electricity demand from AI data centers could deepen the greenhouse gas and water crises at the same time, pushing us toward a climate emergency beyond anything we have imagined.

So, what can be done?

Time to practice digital sobriety

Enter the concept of “digital sobriety.” This idea, gaining traction in parts of Europe, advocates for a more mindful approach to technology use. The goal is not to abandon digital tools but to use them judiciously, reducing unnecessary interactions with AI and minimizing its environmental footprint.

Preventing companies from building more AI data centers is nearly impossible in today’s world. AI chatbots already serve millions of users, and tech giants are investing billions to enhance their models. Health researchers are pushing for AI’s use in disease detection, and the number of businesses integrating AI into their operations is roughly doubling each year. With such widespread demand and investment, the expansion of AI infrastructure seems inevitable.

Moreover, AI isn’t going to stay limited to your computers and smartphones. Many companies will try to package almost every product with an AI program to maximize their profits. The next generation of smart home appliances, industrial equipment, and even toothbrushes is expected to ship with interactive AI features. However, it’s up to you to decide whether you really need AI in everything.

“I’m definitely not against having a smartphone or using AI, but asking yourself, Do I need this new gadget? Do I really need to use ChatGPT for generating recipes? Do I need to be able to talk to my fridge or can I just, you know, open the door and look inside? Things like that, right? If it ain’t broke, don’t fix it with generative AI,” Luccioni said.

For example, people can still use a regular search engine for most everyday queries about cooking, travel, or pets; asking an AI chatbot instead rarely adds much value. Such small steps can meaningfully reduce a person’s carbon footprint while companies and policymakers figure out how to make AI operations sustainable.

“The idea here is not to oppose AI, but rather to choose the right tools and use them judiciously. In France, they have this term, ‘digital sobriety’; it could be part of the actions that people can take as 21st-century consumers and users of this technology,” Luccioni said.

These simple shifts in behavior may seem small, but when multiplied across millions of users, they could make a substantial difference. The principle of digital sobriety encourages consumers to think critically about their use of AI technologies, cutting down on unnecessary digital consumption and, in turn, reducing their carbon footprint.

It requires a team effort

Ultimately, the responsibility for mitigating AI’s environmental impact shouldn’t only be on consumers. The corporations driving this revolution should bear the brunt of the responsibility. Companies like Google, Microsoft, and others must prioritize the development of greener data centers, incorporating renewable energy sources and improving energy efficiency, ensuring they only grow their AI ventures when it is sustainable to do so.

Of course, self-regulation alone won’t be enough, so we need policymakers to step up and ensure that AI’s growth is aligned with global climate goals. This might involve stricter regulations on the carbon footprints of tech giants, incentives for renewable energy use, and investment in sustainable infrastructure to support the future of AI.

The path forward is not simple, but it is necessary. AI has the potential to transform the world for the better—but only if we recognize and address the very real environmental challenges it brings with it.