Sending a Single Email with ChatGPT is Like Drinking a Bottle of Water

According to research conducted by The Washington Post in collaboration with the University of California, Riverside, ChatGPT powered by GPT-4 consumes approximately 519 milliliters of water—slightly more than a standard 16.9-ounce bottle—to generate a 100-word email. This significant water usage can exacerbate drought conditions, particularly in arid regions.

This reporting is informed by the paper titled “Making AI Less ‘Thirsty’: Uncovering and Addressing the Secret Water Footprint of AI Models,” authored by Mohammad A. Islam from the University of Texas at Arlington, along with Pengfei Li, Jianyi Yang, and Shaolei Ren from UC Riverside. Journalists Pranshu Verma and Shelly Tan, along with their editing team, utilized publicly available data to estimate the water footprint and electricity usage mentioned in their article.

### Water and Electricity Usage for ChatGPT
The examination by The Washington Post and UC Riverside addressed the electricity required to operate generative AI servers and the water needed to cool them. Consumption of both varies with the climate around a given data center; for instance, data centers in Washington state and Arizona draw particularly large amounts of water.

In regions where electricity is more affordable or abundant than water, cooling systems might instead rely on electrical methods rather than water-filled cooling towers.

Other notable findings include:

– If one in ten working Americans (around 16 million people) sent one 100-word email with ChatGPT every week for a year, the total demand would reach 435,235,476 liters of water, roughly what the entire state of Rhode Island consumes in a day and a half.
– Generating a 100-word email with GPT-4 requires 0.14 kilowatt-hours (kWh) of electricity, comparable to running 14 LED light bulbs for one hour.
– If the same 10% of working Americans sent such emails weekly for a year, electricity consumption would total 121,517 megawatt-hours (MWh), equivalent to the power used by every household in Washington, D.C. for 20 days.
– Training GPT-3 consumed an estimated 700,000 liters of water.
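As a quick sanity check (a back-of-the-envelope sketch, not a calculation from the article), multiplying the rounded per-email figures by 16 million weekly senders over 52 weeks lands close to the article's more precise totals; the small gap suggests the reporters used unrounded per-email values.

```python
# Back-of-the-envelope check of the article's annual totals, assuming
# the rounded per-email figures: 519 mL of water and 0.14 kWh of
# electricity, for 16 million people sending one email a week.
senders = 16_000_000
emails_per_year = 52  # one email per week

water_liters = senders * emails_per_year * 0.519           # liters
electricity_mwh = senders * emails_per_year * 0.14 / 1000  # kWh -> MWh

print(f"{water_liters:,.0f} liters")  # 431,808,000 (article: 435,235,476)
print(f"{electricity_mwh:,.0f} MWh")  # 116,480 (article: 121,517)
```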

In a statement to The Washington Post, Kayla Wood, a representative from OpenAI, indicated that the company is continuously striving to enhance efficiency.

### Energy Requirements for AI Image Generation
A study by Carnegie Mellon University and Hugging Face in December 2023 revealed that generating an AI image requires approximately 2.907 kWh of electricity per 1,000 inferences, with consumption varying by AI model size and image resolution. The research focused on the energy consumption during the inference phase, which occurs each time the AI generates a response, contrasting with previous studies that concentrated on the training phase.
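For orientation, the study's per-1,000-inference rate converts to a per-image figure as follows (a simple unit conversion on the reported average, not a number stated in the study itself):

```python
# Converting the CMU / Hugging Face figure of 2.907 kWh per 1,000
# image-generation inferences into a per-image energy cost.
kwh_per_1000_inferences = 2.907

kwh_per_image = kwh_per_1000_inferences / 1000  # kWh per single image
wh_per_image = kwh_per_image * 1000             # same value in watt-hours

print(f"{kwh_per_image:.6f} kWh per image")  # 0.002907
print(f"{wh_per_image:.3f} Wh per image")    # 2.907
```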

The costs highlighted by The Washington Post center on a relatively simple AI output, a 100-word email. Expenses escalate significantly for more intensive tasks: in the Carnegie Mellon and Hugging Face tests, image generation produced the highest carbon emissions of any task measured.

### Implications of AI’s Resource Demands
The resource-intensive nature of AI technologies poses challenges for both environmental sustainability and business profitability. Over-reliance on AI can lead to heightened drought conditions and additional strain on the electrical grid. Furthermore, excessive AI implementation could alienate consumers, as evidenced by backlash against Google Gemini’s advertising campaign. A survey from Gartner revealed that 64% of 5,728 respondents preferred avoiding AI in customer service interactions.

Organizations should promote responsible technology use by fostering long-term thinking in employees’ technology choices. Implementing and adhering to an environmental policy can improve customer trust and support sustainable profit growth.

“Many of the advantages provided by generative AI remain uncertain and may take time to materialize as companies explore various applications leading to wider adoption,” commented Benjamin Lee, a professor at Penn Engineering. “Conversely, the costs associated with generative AI are immediate as data centers are constructed, GPUs activated, and models rolled out.”

Lee further noted that while historical patterns indicate that widely adopted technologies become more efficient over time due to ongoing optimization, the rapidly changing landscape of generative AI presents a challenge, as the technology is still in the exploratory phase without a clear optimization trajectory.

To alleviate the environmental impact of AI, Akhilesh Agarwal, COO of supplier management firm apexanalytix, suggests utilizing renewable energy sources such as wind, solar, hydroelectric, or nuclear power for data centers. Agarwal emphasized the importance of adopting sustainable practices to mitigate the potential environmental repercussions of unchecked AI expansion.

On the positive side, AI has the capacity to optimize processes, minimize inefficiencies, and contribute to sustainability initiatives. Its benefits should be weighed against the carbon footprint of traditional human labor performing similar tasks.
