
AI and data centres drink up billions of litres of water, experts say. This is why


Artificial intelligence is making things easier in some industries, like health care and astronomy, but in other ways, AI can be harmful.

It's difficult to determine whether these technologies do more harm than good: some AI systems use a lot of energy and emit a significant amount of carbon, while others can aid industries that already harm the environment, for example by helping them find and extract resources faster.

But do those harms outweigh the benefits, such as AI's use in the fight against climate change?

The impact of AI is not broadly understood and more research is needed, according to a professor who spoke to CTVNews.ca about the technology involved in AI models like Google's Bard and ChatGPT.

UNCLEAR HOW BAD SOME AI MODELS ARE

Teresa Heffernan, a professor at Saint Mary's University and an AI researcher, said those two models, popular for their abilities to interpret and answer questions and complete text-based tasks, are considered the most energy-intensive large language models.

While a user's interactions with these types of programs may seem minor in terms of environmental impact, the English language and literature professor says, "Large language models require massive amounts of computing energy, both in the training, (and) actually in the use."

And, Heffernan told CTVNews.ca in an interview, "there's a lack of transparency" when it comes to data.

Experts have attempted to calculate the impact of the use of LLMs.

A report published Monday by the Canadian Institute for Advanced Research (CIFAR) looked at how much carbon dioxide is produced when training these types of AI.

The training process is one of the most energy-intensive parts, the report says, because of the way this type of AI learns.

"LLMs will learn to predict the most probable word based on an input text, e.g., 'the cat sat on the _____' will result in the LLM returning 'chair' or 'mat,'" the report authored by Sasha Luccioni reads.

Over the past five years, specialized computer chips used for training LLMs have become more powerful, increasing the speed of learning, the report shows, but also using more resources.

Because electricity grids do not exclusively use renewable electricity, there are carbon emission costs associated with the training process.

Luccioni looked at three factors when calculating how carbon intensive some LLMs are:

  • the model training time (measured in hours);
  • the power used by hardware to train; and
  • the carbon intensity from the energy grid.

"Multiplying these three factors provides an initial estimate of what is called 'dynamic power consumption,'" the report reads.

According to Luccioni's research, OpenAI's GPT-3, which was trained on Microsoft's infrastructure, emitted 502 tonnes of CO2 during training. This is equivalent to the electricity-based emissions released by 304 homes over the course of a year.

Gopher, a 2021 LLM by DeepMind, emitted 352 tonnes of CO2, the research suggested.

And Luccioni's research showed that training is not the only process that requires a large amount of energy. Each time a query is answered by an LLM, it has a carbon impact.

A smaller model studied by Luccioni, called Bloom, produced 19 kilograms of CO2 per day during its deployment.

"While this may not seem like a huge amount compared to the 50 tonnes (Bloom) emitted during model training, it adds up when LLMs like Bloom are deployed in user-facing applications like Web search and navigation, which can get queried millions of times a day," the report states.

And, she pointed out, this does not account for other environmental costs, such as cooling the buildings and housing the computers and the server network.

Another environmental impact of these programs is their depletion of fresh water reserves.

"When (an LLM is) churning through all this data, it's actually creating all this heat," Heffernan said. "In order for the thing not to melt down, you need to cool it. So that's where the use of water is going."

One study by Cornell University estimates that Google's data centres in the U.S. consumed 12.7 billion litres of fresh water for cooling the sites in 2021. Training GPT-3 in Microsoft's U.S. data centres consumed about 700,000 litres of fresh water, the same study estimated.

Cornell researchers also looked at similar AI programs that allow users to ask questions of a chatbot, and estimated that one such "conversation" with ChatGPT involving roughly 20 to 50 words amounted to the equivalent of a 500-millilitre bottle of water for cooling purposes.
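
Scaled up, that per-conversation estimate adds up quickly. A short sketch follows, in which the 500-millilitre figure is the study's estimate and the daily conversation volume is invented for illustration:

```python
# Water use scales with the number of conversations. The per-conversation
# figure comes from the Cornell estimate; the daily volume is hypothetical.
litres_per_conversation = 0.5         # about one 500 ml bottle per short exchange
conversations_per_day = 10_000_000    # hypothetical daily volume

litres_per_day = litres_per_conversation * conversations_per_day
print(f"About {litres_per_day / 1_000_000:.1f} million litres of cooling water per day")
```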

CTVNews.ca reached out to all the companies named in the report to ask how they plan to move forward sustainably. Only Microsoft provided a written answer in time for publication.

"As part of our commitment to create a more sustainable future, Microsoft is investing in research to measure the energy use and carbon impact of AI while working on ways to make large systems more efficient, in both training and application," a Microsoft spokesperson told CTVNews.ca in an email on Friday. "We will continue to monitor our emissions, accelerate progress while increasing our use of clean energy to power datacenters, purchasing renewable energy and other efforts to meet our sustainability goals of being carbon negative, water positive and zero waste by 2030."

MORE TO AI THAN JUST LLMS

The study focused on LLMs, but there are other types of AI that also have an environmental impact.

David Rolnick, a professor of computer science at McGill University, says predictive AI models are very different from LLMs, both in what they can do and in how much energy they use.

"It's not like one application of AI is making climate change better and making climate change worse at the same time," he said."If you use AI to understand where deforestation is happening, that's unambiguously making climate change better," Rolnick told CTVNews.ca in an interview.

But, he said, there are ways similar technology can do more harm.

"If you use AI, in oil and gas exploration to accelerate the discovery and extraction… that is arguably making climate change worse," he listed as an example.

Rolnick says to think of AI like a hammer.

"The impact of a hammer is how you use it. It's not how you make it," he said. "There are climate impacts from the making of AI algorithms and that depends on the algorithm. Most AI algorithms don't use very much energy."

AI that helps humans understand where deforestation is happening, or predict where the next earthquake could strike, uses algorithms and data from a select sample rather than scraping data from many sources.

"Many of the AI algorithms in practice — most of what we do in my own lab — they're very small, they could run on a laptop," Rolnick said. "LLMs are not that. We should definitely be thinking about that. But when we're talking about AI in fighting the climate crisis, we're not talking about LLMs, we're talking about other algorithms."

Most AI is not "flashy" like LLMs, Rolnick said, but it does have a huge impact on society.

"Whether that is the recommender systems that are deciding what ads you see online, or the control algorithms that are making factories run more efficiently, or the satellite imagery systems that are understanding where, how crops are growing, where buildings are around the world, to understanding the risk of natural disasters, or the algorithms that are used across the financial industry to forecast prices," he said.

AI is integrated in many industries already, and a lot of the algorithms are less energy-intensive than LLMs, he said.

"I think the entire field of computer sciences has let out a collective sigh over the past year as the world has grown enamoured and afraid of one particular set of AI technologies that are not the most impactful," he said.  
