ChatGPT running GPT-4 uses approximately 519 milliliters of water, slightly more than one 16.9-ounce bottle, to write a single 100-word email, according to original research from The Washington Post and the University of California, Riverside. This heavy resource use can worsen human-caused drought conditions, particularly in already dry climates.
The Washington Post’s reporting is based on the research paper “Making AI Less ‘Thirsty’: Uncovering and Addressing the Secret Water Footprint of AI Models” by Mohammad A. Islam of UT Arlington and Pengfei Li, Jianyi Yang, and Shaolei Ren of the University of California, Riverside. Reporters Pranshu Verma and Shelly Tan and their editors based their water-footprint and electricity estimates on public information, as detailed in the article.
How much water and electricity are needed for ChatGPT?
The Washington Post and the University of California, Riverside examined the electricity needed to run generative AI servers and the water needed to keep those servers cool. How much water and electricity a given data center uses varies with the local climate; data centers in Washington state and Arizona have particularly heavy water draws.
In areas where electricity is cheaper or more plentiful than water, data centers might be cooled through an electric system instead of with water-filled cooling towers, for example.
Other findings include:
If one in 10 working Americans (about 16 million people) write a single 100-word email with ChatGPT weekly for a year, the AI will require 435,235,476 liters of water. That number is roughly equivalent to all of the water consumed in Rhode Island over a day and a half.
Sending a 100-word email with GPT-4 takes 0.14 kilowatt-hours (kWh) of electricity, which The Washington Post points out is equivalent to leaving 14 LED light bulbs on for one hour.
If one in 10 working Americans write a single 100-word email with ChatGPT weekly for a year, the AI will draw 121,517 megawatt-hours (MWh) of electricity. That’s the same amount of electricity consumed by all Washington D.C. households for 20 days.
Training GPT-3 took 700,000 liters of water.
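The aggregate figures above can be sanity-checked with a back-of-envelope calculation. The sketch below uses round inputs (16 million users, 52 weeks, the per-email figures reported in the article); the Post's exact inputs evidently differ slightly, since this rough math lands within a few percent of the reported 435,235,476 liters and 121,517 MWh rather than matching them exactly.

```python
# Back-of-envelope check of the scale of the reported totals.
# These are round-number assumptions, not the Post's exact inputs.
WATER_PER_EMAIL_L = 0.519    # ~519 mL of water per 100-word GPT-4 email
ENERGY_PER_EMAIL_KWH = 0.14  # reported electricity per email
USERS = 16_000_000           # ~1 in 10 working Americans
WEEKS = 52                   # one email per week for a year

emails = USERS * WEEKS
water_liters = emails * WATER_PER_EMAIL_L
energy_mwh = emails * ENERGY_PER_EMAIL_KWH / 1_000  # kWh -> MWh

print(f"{water_liters:,.0f} L of water")  # ~432 million liters
print(f"{energy_mwh:,.0f} MWh")           # ~116,000 MWh
```

The small gap between these estimates and the published numbers suggests the Post used more precise figures for workforce size and per-email consumption.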
In a statement to The Washington Post, OpenAI representative Kayla Wood said the ChatGPT maker is “constantly working to improve efficiency.”
SEE: Tech giants can obscure the greenhouse gas emissions of AI projects by factoring in market-based emissions.
How much electricity does it take to generate an AI image?
In December 2023, researchers from Carnegie Mellon University and Hugging Face found that it takes 2.907 kWh of electricity per 1,000 inferences to generate an AI image; this amount differs depending on the size of the AI model and the resolution of the image. Specifically, the researchers tested the energy consumption of the inference phase, which occurs every time the AI responds to a prompt, since previous research had focused on the training phase.
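For a per-image sense of scale, the study's per-1,000-inference figure converts directly to watt-hours per image:

```python
# Convert the reported figure to a per-image energy cost.
kwh_per_1000_inferences = 2.907  # Carnegie Mellon / Hugging Face finding

# kWh per 1,000 inferences -> Wh per single inference (image)
wh_per_image = kwh_per_1000_inferences * 1_000 / 1_000

print(f"{wh_per_image:.3f} Wh per image")  # 2.907 Wh per image
```

That is an average across the models tested; as the researchers note, the cost varies with model size and image resolution.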
While The Washington Post’s reporting focused on the high cost of a relatively small AI prompt (an email), the cost of using AI for more rigorous tasks only increases from there. Image generation created the most carbon emissions out of all of the AI tasks the Carnegie Mellon University and Hugging Face researchers tested.
Over-reliance on AI can have negative impacts on both the Earth and the bottom line
Resource-hungry AI trades present-day profit for worsening drought and increasing pressure on the electricity grid. Generative AI can also drive customers away: an August ad for Google Gemini generated backlash from consumers, and a July survey from Gartner found that 64% of 5,728 respondents would prefer companies not use AI in customer service.
Organizations should find ways to incentivize long-term thinking about the technology employees choose to use day-to-day. Creating an environmental policy, and sticking to it, can increase customers’ trust in a business and support profitability over the long term.
“Many of the benefits from generative AI are speculative and may appear in the future as businesses rapidly explore diverse use cases that might spark broad adoption,” said Penn Engineering Professor Benjamin Lee in an email to TechRepublic. “But many of the costs from generative AI are real and incurred immediately as data centers are built, GPUs are powered, and models are deployed.”
“Businesses should be confident that, historically, a broadly used technology becomes more and more efficient as computer scientists repeatedly and incrementally optimize software and hardware efficiency over years of consistent research and engineering,” said Lee. “The problem with generative AI is that use cases, software applications, and hardware systems are all evolving rapidly. Computer scientists are still exploring the technology and there is no clear target for their optimizations.”
One way to mitigate the environmental impact of AI is to run data centers on renewable or carbon-free energy sources such as wind, solar, hydroelectric, or nuclear power, said Akhilesh Agarwal, COO of supplier management firm apexanalytix, in an email to TechRepublic.
“It is crucial that companies deploying AI technologies remain mindful of the potential environmental costs if they fail to invest in sustainable practices, as unchecked AI growth could exacerbate global resource consumption issues,” said Agarwal.
On the other hand, AI can “optimize processes, reduce inefficiencies, and even contribute to sustainability efforts,” Agarwal said, and its impact should be measured against the carbon output of a human workforce performing the same tasks.