
The Cornell University study highlighted the significant water footprint of AI data centers[2]. According to the research, global AI demand may withdraw up to 6.6 billion cubic meters of water in 2027, roughly equivalent to the total annual water withdrawal of Denmark or half that of the United Kingdom. The study also notes that this water consumption comes on top of data centers' energy usage, which is already a significant environmental concern. The researchers warn that growing demand for AI technologies could exacerbate water scarcity in some regions and contribute to the global water crisis.
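To get a feel for the scale of that 6.6 billion cubic meter projection, the figure can be broken down per day. This is a minimal back-of-envelope sketch: the annual total comes from the text, while the daily breakdown and the Olympic-pool comparison (a standard 2,500 m³ illustrative unit) are my own arithmetic, not figures from the study.

```python
CUBIC_METERS_2027 = 6.6e9  # projected upper-bound AI water withdrawal, m^3/year (from the text)

def daily_withdrawal_m3(annual_m3: float, days: int = 365) -> float:
    """Average water withdrawal per day, in cubic meters."""
    return annual_m3 / days

def olympic_pools_per_day(annual_m3: float, pool_m3: float = 2_500) -> float:
    """Express the daily volume in 2,500 m^3 Olympic-size pools
    (an illustrative unit, not one the study itself uses)."""
    return daily_withdrawal_m3(annual_m3) / pool_m3

print(f"{daily_withdrawal_m3(CUBIC_METERS_2027):,.0f} m^3 per day")
print(f"~{olympic_pools_per_day(CUBIC_METERS_2027):,.0f} Olympic pools per day")
```

At the projected upper bound, that works out to roughly 18 million cubic meters, or on the order of 7,000 Olympic pools, every day.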

Small modular nuclear reactors (SMRs) developed by NuScale Power aim to meet the energy needs of AI data centers by providing a continuous supply of carbon-free electricity[6]. Each NuScale Power Module can generate 77 megawatts of electricity, which could allow data centers to operate independently of the grid[4]. The company is working with data center developer Standard Power to supply SMRs capable of generating nearly 2 gigawatts of electricity, enough to power a midsize city. NuScale's SMR technology is designed to be safer and more flexible than traditional nuclear power plants, since the number of modules can be scaled to match a data center's energy needs[6].
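The module-count arithmetic implied by those figures can be sketched directly. The 77 MW per-module output and the roughly 2 GW Standard Power target come from the text; the sizing function itself is illustrative, not NuScale's actual planning method.

```python
import math

MODULE_MW = 77  # nameplate output of one NuScale Power Module (from the text)

def modules_needed(load_mw: float, module_mw: float = MODULE_MW) -> int:
    """Smallest whole number of modules that covers the load."""
    return math.ceil(load_mw / module_mw)

print(modules_needed(2_000))  # modules for the ~2 GW Standard Power target
```

Covering a 2 gigawatt load would take 26 modules, which is why the flexibility to add or remove modules matters for matching a data center's demand.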

A typical AI server built around Nvidia's H100 GPUs uses significantly more electricity than previous generations of servers. Each server box, loaded with eight H100 GPUs and two CPUs, consumes about 8 to 10 kilowatts of electricity, a substantial increase over earlier servers, which drew a fraction of that amount. This energy consumption reflects the growing complexity and computational demands of AI applications, which require more powerful, energy-intensive hardware to process and analyze vast amounts of data.
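The annual energy implied by that 8 to 10 kilowatt draw is easy to work out. In this sketch, the power figure comes from the text, while the electricity price (US$0.10/kWh) and the assumption of continuous full-power operation are placeholders of my own, not figures from the source.

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_kwh(power_kw: float, utilization: float = 1.0) -> float:
    """Energy drawn over a year at the given average utilization."""
    return power_kw * HOURS_PER_YEAR * utilization

def annual_cost_usd(power_kw: float, price_per_kwh: float = 0.10) -> float:
    # price_per_kwh is an assumed placeholder rate, not from the source
    return annual_kwh(power_kw) * price_per_kwh

# At the top of the quoted range, one server running around the clock draws:
print(f"{annual_kwh(10):,.0f} kWh/year")
print(f"~${annual_cost_usd(10):,.0f}/year at the assumed $0.10/kWh")
```

A single 10 kW server running continuously consumes 87,600 kWh a year, which multiplied across thousands of servers is what drives data center operators toward dedicated power sources.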