{"id":155571,"date":"2023-04-15T13:31:44","date_gmt":"2023-04-15T13:31:44","guid":{"rendered":"https:\/\/culture.org\/?p=155571"},"modified":"2023-04-15T13:31:44","modified_gmt":"2023-04-15T13:31:44","slug":"ai-water-footprint-the-thirsty-truth-behind-language-models","status":"publish","type":"post","link":"https:\/\/culture.org\/special-interest\/ai-water-footprint-the-thirsty-truth-behind-language-models\/","title":{"rendered":"AI Water Footprint: The Thirsty Truth Behind Language Models"},"content":{"rendered":" \r\n\r\n\r\n \r\n\r\n
While large language models (LLMs) like OpenAI’s ChatGPT and Google’s Bard have revolutionized the tech landscape, new research highlights their enormous water footprint.

The training process for AI models like GPT-3 and GPT-4 requires immense amounts of energy and cooling, resulting in considerable water consumption.

ChatGPT’s Growing Water Consumption

Researchers from the University of California, Riverside and the University of Texas at Arlington published a pre-print paper titled “Making AI Less ‘Thirsty.’”

They estimated that GPT-3 consumed 185,000 gallons (700,000 liters) of water during its training.

The water consumption would be even higher for newer models like GPT-4, which rely on larger datasets and more parameters.

According to the study, an average user’s conversation with ChatGPT is equivalent to pouring out a large bottle of fresh water.

As these AI models become more popular, their water consumption could have a significant impact on water supplies, especially amid the mounting environmental challenges in the US.

Cooling Data Centers and the AI Water Footprint

Data centers use massive amounts of water to cool down server rooms and maintain an ideal temperature for the equipment.

Cooling towers, the most common cooling solution for warehouse-scale data centers, consume a significant amount of water.

The researchers estimate that roughly a gallon of water is consumed for every kilowatt-hour expended in an average data center.

Data centers typically rely on clean freshwater sources to avoid corrosion and bacterial growth and to control humidity.
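To make that rule of thumb concrete, here is a minimal sketch of the arithmetic, using hypothetical numbers rather than figures from the paper: water use scales roughly linearly with the energy a data center draws.

```python
# Rough illustration of the "about a gallon of water per kilowatt-hour" rule of
# thumb. The energy figure below is hypothetical and only demonstrates the
# arithmetic; it is not an estimate from the paper.

GALLONS_PER_KWH = 1.0       # assumed average data-center water intensity
LITERS_PER_GALLON = 3.785

def estimated_water_gallons(energy_kwh: float,
                            gallons_per_kwh: float = GALLONS_PER_KWH) -> float:
    """Estimate gallons of water consumed for a given energy draw in kWh."""
    return energy_kwh * gallons_per_kwh

if __name__ == "__main__":
    energy_kwh = 50_000  # hypothetical training workload
    gallons = estimated_water_gallons(energy_kwh)
    liters = gallons * LITERS_PER_GALLON
    print(f"{energy_kwh:,} kWh -> ~{gallons:,.0f} gallons (~{liters:,.0f} liters)")
```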
Addressing the AI Water Footprint Problem

The researchers suggest several ways to reduce AI’s water footprint, including adjusting when and where AI models are trained.

Training models during cooler hours, or in data centers with better water efficiency, can help reduce water consumption.

Chatbot users can also engage with AI models during “water-efficient hours,” much as appliances are run during off-peak hours.
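As a minimal sketch of what such water-aware scheduling could look like, assuming a hypothetical hourly water-intensity forecast (not data that operators currently publish in this form), a training job could simply be deferred to the hour with the lowest expected water use per kilowatt-hour:

```python
# Minimal sketch of "water-efficient hours" scheduling. The forecast values are
# hypothetical; a real scheduler would also weigh deadlines, electricity prices,
# and carbon intensity.

from typing import Dict

def pick_water_efficient_hour(forecast_liters_per_kwh: Dict[int, float]) -> int:
    """Return the hour of day (0-23) with the lowest forecast water intensity."""
    return min(forecast_liters_per_kwh, key=forecast_liters_per_kwh.get)

if __name__ == "__main__":
    # Hypothetical forecast: cooler overnight hours lose less water to evaporation.
    forecast = {0: 1.9, 4: 1.7, 8: 2.4, 12: 3.1, 16: 3.4, 20: 2.2}
    print("Start training at hour:", pick_water_efficient_hour(forecast))
```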
Federated learning strategies, in which multiple users collaborate on training AI models using their local devices, could also help decrease on-site water consumption.

Integrating information from electricity providers, along with advances in energy storage, could further help distribute the training load based on clean energy availability.

Transparency and Accountability

The researchers call for greater transparency from AI model developers and data center operators in disclosing when and where AI models are trained.

Such information would be valuable to both the research community and the general public.

Acknowledging when and where models are trained could also help address concerns about AI’s water footprint.

As AI continues to advance, it is crucial for the tech industry to develop environmentally sustainable practices that minimize the water footprint of these revolutionary models.