AI Is Thirsty
«Each chat with a large language model is like dumping a bottle of water on the ground»
«But as it turns out, using ChatGPT consumes a lot of an unexpected resource: water.»
«Why precisely does large language model AI require water? Back in April, a group of researchers pondered this question as they created an estimate of AI’s water consumption. As they note in their paper (which is available free in full), the main use of water comes when tech firms train their AI, and when the firms run inferences (i.e. when you, I or anyone else interacts with the model).»
«Tech firms like Microsoft and Google and Meta do all that training (and inferring) on their huge computational farms. That computation requires a ton of energy, which generates heat. To remove that heat from server farms, the tech firms generally use cooling towers, where water is evaporated to carry the heat out into the outside world. That evaporation? That’s how AI consumes water. And it’s worth noting that it is almost all freshwater.»
«Tech firms do not publish specific stats on how much freshwater they use for different forms of computation. So the academics did some estimates. They calculated how much energy it would take to train one of the well-known large language models (and…»
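The estimation approach the researchers describe boils down to multiplying a workload's energy use by how much water is consumed per unit of energy. Here is a minimal back-of-the-envelope sketch of that idea; the function name and every number below are illustrative assumptions for this post, not figures from the paper.

```python
# Back-of-the-envelope sketch of the energy-times-water-intensity estimate.
# All values are illustrative assumptions, not measurements from the paper.

def water_consumed_liters(energy_kwh, onsite_wue, offsite_wue):
    """Estimate freshwater consumption for a compute workload.

    onsite_wue  -- liters evaporated per kWh by the data center's cooling towers
    offsite_wue -- liters consumed per kWh in generating the electricity itself
    """
    return energy_kwh * (onsite_wue + offsite_wue)

# Suppose (purely for illustration) a training run draws 1,000 MWh,
# with water intensities of a couple of liters per kWh.
energy_kwh = 1_000_000
print(water_consumed_liters(energy_kwh, onsite_wue=1.8, offsite_wue=1.0))
```

With those made-up inputs the run would evaporate a few million liters of freshwater, which is the kind of scale the researchers' estimates point at. The real paper's model is more detailed, accounting for location and time-varying cooling efficiency.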
Extractivism is always thirsty anyway.
Alternative URL: https://scribe.froth.zone/ai-is-thirsty-37f99f24a26e.