By the time you arrived at this page, AI had probably already played a role in your day. Maybe you pulled up ChatGPT to write an email, fix faulty Python code or diagnose the cause of your headache. Perhaps you used Adobe’s AI feature to extract the specific piece of information you needed from a 175-page report. Or, at the very least, AI algorithms likely directed you to this page, whether through Google’s AI-assisted search engine or backend recommendation processes (if you searched directly for IWMI features, you may not have used AI, but you did become our new favorite reader). This is all to say that AI has become accessible and undeniably integrated into our daily technological experiences.

For researchers, it is no different. They are embedding AI throughout the research process to identify patterns, make predictions and automate decisions, processing more data, faster, than humans and previous technologies ever could.

But with great opportunity also comes great risk. Alongside AI’s rapid expansion comes growing apprehension about its environmental toll, particularly its substantial water consumption. Every time you use AI, you consume vast amounts of virtual water hidden throughout the production and operation of the systems behind it, and in more ways than one.

Data centers use a vast amount of virtual water throughout their operations. Photo: Gorodenkoff/Shutterstock

Part I: How water is used throughout data center systems 

Water is first introduced when manufacturing AI hardware. AI relies on silicon chips called Graphics Processing Units (GPUs), which are specialized for data processing, storage and high-performance computing. Assembling a GPU requires rinsing and cleaning the silicon “wafer” with ultra-pure water to prevent bacterial growth, clogs and corrosion once the chip is ultimately packed together with over 100,000 others in a data center. On average, a single microchip consumes 8,300 liters of water over the course of manufacturing.
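Combining those two figures gives a rough sense of the water embedded in a facility’s hardware before it processes a single prompt. A minimal back-of-envelope sketch in Python, assuming the cited per-chip average and a hypothetical 100,000-chip facility:

```python
# Back-of-envelope estimate of the water embedded in a data center's chips.
# Both inputs come from the figures cited above; they are illustrative
# averages, not measurements of any specific facility.

LITERS_PER_CHIP = 8_300           # average manufacturing water per microchip
CHIPS_PER_DATA_CENTER = 100_000   # order-of-magnitude GPU count for a large facility

embedded_water_liters = LITERS_PER_CHIP * CHIPS_PER_DATA_CENTER
print(f"Embedded manufacturing water: {embedded_water_liters / 1e6:.0f} million liters")
# -> roughly 830 million liters before the facility answers a single prompt
```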

Beyond the water footprint of manufacturing GPUs, data centers continually consume water to keep these chips functioning properly. Without mitigation, global water use associated with data centers could rise more than sevenfold by mid-century, with cooling-related operational consumption driving the majority of that demand. When you send a prompt to an AI model, the chips perform complex calculations to interpret your request and generate a response, which builds up significant heat. Data centers therefore need cooling systems to keep the hardware from overheating. To cool the system, a coolant is piped over the processing chips. Once the coolant exits the servers, water absorbs the heat from the coolant so it can be recirculated. Some of this water cools naturally and is reused, while fresh water replaces what evaporates in the process. In other cases, data centers use dry coolers, which avoid on-site water consumption but increase energy demand, adding pressure on water resources indirectly through electricity generation in what is known as scope-2 water consumption.

This indirect water use is significant. Data centers currently account for 1–2% of global electricity demand, a share expected to grow as AI models become more complex. The amount of water embedded in electricity varies widely by energy source: solar, wind and geothermal require very little, while hydropower and fossil fuels demand far greater volumes. In Africa, for example, hydropower generates just 13% of electricity yet accounts for 70% of the power sector’s water withdrawals and 97% of its water consumption. Fossil fuels make up more than 80% of global primary energy consumption, and their use in power generation alone accounts for over half of all water withdrawals in the energy sector. Moving toward low-water sources like wind and solar could cut withdrawals by half in the U.K., by a quarter in the U.S., Germany and Australia, and by 10% in India, helping to reduce the indirect water footprint of data centers. Yet as AI models grow, the demand for energy, and therefore water, continues to rise.
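To see how the grid mix drives scope-2 water use, here is a minimal Python sketch. The intensity values are hypothetical placeholders, not measured figures; only their ordering (wind and solar low, fossil thermal and hydropower high) follows from the paragraph above:

```python
# Sketch of scope-2 (indirect) water consumption: the water used to generate
# the electricity a data center draws. Intensities are hypothetical
# placeholders; real values vary widely by plant, region and season.

WATER_INTENSITY_L_PER_KWH = {  # hypothetical consumptive-use intensities
    "wind": 0.01,
    "solar_pv": 0.1,
    "fossil_thermal": 1.5,
    "hydropower": 10.0,
}

def scope2_water_liters(energy_kwh: float, grid_mix: dict[str, float]) -> float:
    """Indirect water = energy drawn x water intensity of each source in the mix."""
    assert abs(sum(grid_mix.values()) - 1.0) < 1e-6, "mix shares must sum to 1"
    return sum(
        energy_kwh * share * WATER_INTENSITY_L_PER_KWH[source]
        for source, share in grid_mix.items()
    )

# Example: 1 GWh drawn from two hypothetical grids.
fossil_heavy = {"fossil_thermal": 0.8, "hydropower": 0.1, "solar_pv": 0.05, "wind": 0.05}
low_water = {"wind": 0.5, "solar_pv": 0.4, "fossil_thermal": 0.1}
for name, mix in [("fossil-heavy", fossil_heavy), ("low-water", low_water)]:
    print(f"{name}: {scope2_water_liters(1_000_000, mix):,.0f} L per GWh")
```

Under these illustrative numbers, the fossil-heavy mix consumes roughly ten times the water of the low-water mix for the same gigawatt-hour, which is the logic behind shifting data centers toward low-water energy sources.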

More intensive calculations demand more energy and, consequently, more cooling water. The most advanced AI training requires specialized cooling systems that can consume up to 5.5 liters of water per kilowatt-hour. Training GPT-3 alone consumed approximately 700,000 liters of water through these cooling mechanisms.
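Given a cooling intensity in liters per kilowatt-hour, the water cost of a training run is a simple product of energy and intensity. A minimal sketch using the 5.5 L/kWh ceiling cited above; the 500 MWh run is a hypothetical example, not GPT-3’s actual energy use:

```python
# Rough estimate of cooling water for a training run:
#   liters = energy (kWh) x cooling intensity (L/kWh).
# The 5.5 L/kWh ceiling is cited in the text; the energy figure below is
# a hypothetical example only.

COOLING_INTENSITY_L_PER_KWH = 5.5   # upper bound for advanced AI training

def training_cooling_water(energy_kwh: float,
                           intensity: float = COOLING_INTENSITY_L_PER_KWH) -> float:
    """Water consumed by cooling, assuming a fixed liters-per-kWh intensity."""
    return energy_kwh * intensity

# Hypothetical 500 MWh training run at the cited ceiling:
print(f"{training_cooling_water(500_000):,.0f} liters")  # -> 2,750,000 liters
```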

While one ChatGPT query seems negligible, ChatGPT processes over 2.5 billion queries daily. A ChatGPT conversation of 10-50 questions consumes about 500 milliliters of water, the volume of a standard water bottle. And while that engagement with ChatGPT is intentional, consider all the other AI queries built into systems such as search engines, which run without you explicitly requesting AI and add further load on data centers.
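Scaled up, those figures are easier to grasp. A back-of-envelope sketch using only the numbers cited above, treating the 10-50-question range as bounds:

```python
# Scaling the per-conversation figure to ChatGPT's reported daily load.
# Both inputs come from the paragraph above; the queries-per-conversation
# range brackets the estimate.

QUERIES_PER_DAY = 2_500_000_000   # reported daily query volume
LITERS_PER_CONVERSATION = 0.5     # ~500 mL per 10-50-question conversation

for queries_per_conv in (10, 50):
    daily_liters = QUERIES_PER_DAY / queries_per_conv * LITERS_PER_CONVERSATION
    print(f"{queries_per_conv} queries/conversation -> "
          f"{daily_liters / 1e6:,.1f} million liters/day")
# -> between roughly 25 and 125 million liters per day, from chat queries alone
```

Even at the conservative end of that range, chat queries alone account for tens of millions of liters every day.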

Nearly all the water consumption attributed to major tech companies occurs at data centers, and their usage is increasing. Microsoft’s water consumption jumped 34% between 2021 and 2022, to nearly 6.4 billion liters. Google saw a 20% increase in the same period, and in 2023 Google’s worldwide operations consumed 24.2 billion liters, 95% of which (23.1 billion liters) was used by its data centers. Projections suggest AI-driven data centers could consume 4.2 to 6.6 trillion liters of water globally by 2027. That is slightly more than the amount of surface water and groundwater used worldwide to irrigate bananas and plantains.

Part II: The effects of, and responses to, water consumption at data center sites

While these numbers are hard to grasp, the effects of unsustainable water consumption are not, especially for data center-hosting communities in drought-ridden areas. In Querétaro, Mexico, residents have faced drought for years, as water-intensive sectors including agriculture, manufacturing and municipal supply have consumed vast amounts of water for decades. More recently, the arrival of a data center has intensified these existing pressures. A local shop owner reported that the community’s health wanes during water shortages without basic sanitation, and that she has been forced to shut down her business for days on end. A blackberry farmer watched his crops shrivel, mirroring the slow retreat of the nearby Zimapán dam he depended on. All the while, the data center sustained its operations.

In the U.S., Bloomberg News found that approximately two-thirds of new data centers are in areas already experiencing high water stress. Arizona has become the fastest-growing data center market in the U.S., yet much of the state is classified as experiencing “extreme drought.” This pattern holds worldwide, with 270 data centers in India, 80 in Turkey, 57 in the UAE and 56 in South Africa, all countries facing national water scarcity.

As data centers continue to expand in already water-scarce regions, climate change will heighten the strain they place on surrounding communities. To prevent further water stress, the placement of data centers must be better aligned with local resource availability. Suitability frameworks can help achieve this balance by guiding where data centers are built to safeguard both ecosystems and people. These frameworks typically take a systemic approach, considering local water supplies, usage patterns, and existing reuse and recycling systems. They also evaluate the distribution of benefits, ensuring that host communities are not only protected from depletion but are also compensated for their contributions and share in the advantages of the facilities they host.  
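As a toy illustration of the kind of screening such frameworks perform, the Python sketch below scores a candidate site against water-related criteria. The criteria, weights and cutoff here are entirely hypothetical, chosen only to show the structure, and are not drawn from any specific framework:

```python
# Toy multi-criteria suitability score for a candidate data center site.
# Criteria, weights and the 0.5 cutoff are hypothetical illustrations only.

SITE_CRITERIA_WEIGHTS = {          # hypothetical weights, summing to 1
    "water_availability": 0.4,     # 0 (scarce) .. 1 (abundant)
    "competing_demand": 0.3,       # 0 (heavy competing use) .. 1 (little)
    "reuse_infrastructure": 0.2,   # 0 (none) .. 1 (mature recycling systems)
    "community_benefit": 0.1,      # 0 (no benefit sharing) .. 1 (strong)
}

def suitability(scores: dict[str, float]) -> float:
    """Weighted sum of normalized criterion scores, each in [0, 1]."""
    return sum(SITE_CRITERIA_WEIGHTS[c] * scores[c] for c in SITE_CRITERIA_WEIGHTS)

# A hypothetical water-stressed candidate site:
candidate = {"water_availability": 0.2, "competing_demand": 0.3,
             "reuse_infrastructure": 0.6, "community_benefit": 0.5}
score = suitability(candidate)
print(f"suitability: {score:.2f}")  # 0.34 -> below a 0.5 cutoff, site screened out
```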

While optimally placing data centers can reduce stress, so too can making data centers and manufacturing processes more water-efficient. The tech sector has begun exploring innovative solutions. Microsoft has experimented with underwater data centers for natural cooling, Amazon has investigated space-based facilities, and Google’s use of DeepMind to optimize cooling cut the energy its data centers spend on cooling by 40%. New water recycling systems at Google are also beginning to reuse data center water across continuous cooling cycles.

There is reason for optimism that these energy and water challenges can be resolved, though the pace of innovation lags far behind AI’s expanding appetite. Solutions must be implemented rapidly and shared across sectors, as computer intelligence alone won’t sustain the water systems on which both technology and communities depend. 

To this point, technological progress has advanced with little structured water resource planning. It is important for us to reclaim agency in shaping how technology interacts with our world. By integrating AI and human intelligence, we can combine their strengths to generate insights that deepen our understanding of water scarcity and guide the design of systems capable of balancing the complex, shifting demands of both nature and technology. Through this approach, we can reorient our priorities to safeguard and restore the ecosystems on which we all depend. 

AI is not disappearing anytime soon. The moment you close this browser, you’ll return to a world saturated with it. While today’s AI systems have intensified environmental pressures, we have the potential to elevate these systems to reach a level of sustainability that we might never have achieved on our own. By aligning technological intelligence with natural systems, we have the opportunity to create technological progress that strengthens, rather than sacrifices, the world that makes it possible.