Despite its name, the infrastructure used by the “cloud” accounts for a greater share of global greenhouse gas emissions than commercial flights. In 2018, for instance, the 5bn YouTube views of the viral song Despacito used roughly the same amount of energy it would take to heat 40,000 US homes for a year.
Large language models such as ChatGPT are some of the most energy-guzzling technologies of all. Research suggests, for instance, that about 700,000 litres of water could have been used to cool the machines that trained GPT-3 at Microsoft’s data facilities.
Additionally, as these companies aim to reduce their reliance on fossil fuels, they may opt to base their datacentres in regions with cheaper electricity, such as the southern US. Those regions are often also drier, so the extra cooling demand risks exacerbating water stress where supplies are already under pressure.
Furthermore, while minerals such as lithium and cobalt are most commonly associated with batteries in the motor sector, they are also crucial for the batteries used in datacentres. The extraction process often involves significant water usage and can lead to pollution, undermining water security. The extraction of these minerals is also often linked to human rights violations and poor labour standards. Pursuing one climate goal, limiting our dependence on fossil fuels, can thus compromise another: ensuring everyone has a safe and accessible water supply.
Moreover, when significant energy resources are diverted to tech-related endeavours, less capacity is left for essential needs such as residential power supply. Recent data from the UK shows that the country’s outdated electricity network is already holding back affordable housing projects.
In other words, policy needs to be designed not to pick sectors or technologies as “winners”, but to pick the willing by providing support that is conditional on companies moving in the right direction. Making disclosure of environmental practices and impacts a condition for government support could ensure greater transparency and accountability.
The point they are trying to make is that fresh water is not a limitless resource, and that increasing usage has knock-on effects, for example on market prices.
The point being made is that resources are allocated to increasing network capacity for hyped tech rather than for current, more pressing needs.
Is there a reason it needs to be fresh water? Is sea water less effective?
corrosion
Oh makes sense.
Not just corrosion, but also to prevent salt precipitation in evaporative cooling systems (the most common type).
Evaporative systems require a constant input of new water; if you’re adding saltwater, the salt concentrates until the circulating water becomes a saturated brine, and once that brine evaporates a bit further the salt precipitates. It happens mostly on the cooling fill (which then needs replacing more often), but the main issue is that some of the precipitate gets carried along by the brine and clogs the pipes.
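To put rough numbers on that, here is a minimal back-of-the-envelope sketch in Python. The salinity and saturation figures are ballpark assumptions, and the rule of thumb that circulating dissolved solids ≈ makeup concentration × cycles of concentration is the standard simplification, not a claim about any particular plant:

    # Back-of-the-envelope: dissolved solids vs. cycles of concentration
    # in an evaporative cooling tower. All numbers are illustrative assumptions.

    FRESHWATER_TDS_G_PER_L = 0.5   # typical municipal water, ~500 mg/L
    SEAWATER_TDS_G_PER_L = 35.0    # typical seawater salinity
    SATURATION_G_PER_L = 300.0     # rough NaCl saturation limit at ambient temp

    def circulating_tds(makeup_tds, cycles_of_concentration):
        """TDS in the circulating loop ~= makeup TDS x cycles of concentration."""
        return makeup_tds * cycles_of_concentration

    for cycles in (2, 4, 6, 8, 10):
        fresh = circulating_tds(FRESHWATER_TDS_G_PER_L, cycles)
        sea = circulating_tds(SEAWATER_TDS_G_PER_L, cycles)
        note = "  <- past saturation, salt starts precipitating" if sea >= SATURATION_G_PER_L else ""
        print(f"{cycles:2d} cycles: freshwater {fresh:5.1f} g/L, seawater {sea:6.1f} g/L{note}")

With fresh water you can run many cycles before dissolved solids become a problem; with seawater makeup you hit saturation within a handful of cycles, which is why it fouls the fill and pipes so quickly.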
A lot of industry does use grey water or untreated water for cooling, as it’s substantially cheaper to filter it and add chemicals to it yourself. What’s even cheaper is to have a cooling tower and reuse your water: at the volumes used at industrial scale, it’s really expensive to just dump water down the drain (which you also get charged for). When I worked as a maintenance engineer, I recall saving something like CA$1m a year, minimum, by changing the fill level in our cooling tower; it would drop to a level that triggered city-water backups to top up the tower to avoid running dry, and that was a single processing line.
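For anyone curious how that kind of saving pencils out, here is a minimal sketch of the standard cooling-tower water balance, with entirely hypothetical figures (not the plant described above): makeup water has to cover evaporation plus blowdown, and blowdown shrinks as you run more cycles of concentration.

    # Sketch of the standard cooling-tower water balance, with made-up numbers.
    #   makeup   = evaporation + blowdown (+ drift, ignored here)
    #   blowdown = evaporation / (cycles_of_concentration - 1)

    EVAPORATION_M3_PER_DAY = 500.0  # hypothetical evaporation loss for one line
    WATER_COST_PER_M3 = 1.5         # hypothetical combined supply + sewer charge, CAD

    def daily_makeup(evaporation, cycles):
        blowdown = evaporation / (cycles - 1)
        return evaporation + blowdown

    for cycles in (2, 4, 6):
        makeup = daily_makeup(EVAPORATION_M3_PER_DAY, cycles)
        annual_cost = makeup * 365 * WATER_COST_PER_M3
        print(f"{cycles} cycles: {makeup:6.1f} m3/day makeup, ~${annual_cost:,.0f}/yr")

Even with these toy numbers, pushing from two to four cycles of concentration cuts makeup water by roughly a third, which at industrial volumes and combined supply-plus-sewer charges adds up fast.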