GEN-ZiNE


ChatGPT: From Greatly Prized Tool to Guzzling Precious Tanks

From Fall Quarter at UCLA

As a college student, there’s nothing I love more than free answers to assignments I don’t want to complete. Which is every assignment. My junior year, I was introduced to the world of ChatGPT, and of course, I made quick and necessary use of it. As a student, I still often request things such as: Chat, how do I balance this chemical equation? Chat, give me a summary of George Orwell’s 1984. Chat, give me hidden discounts for Brandy Melville. (No hidden discounts exist, in case you were wondering.)

Having developed a sort of “Plan B” mindset about ChatGPT, I was comforted by the thought that I could always turn to it if needed. Boom. Stress alleviated.

More recently, I was talking to a friend from high school who had just taken a seminar on ChatGPT. I found it remarkable that it had gone from being pilot-tested to warranting intensive seminars so quickly. My friend explained to me that, according to his professor, ChatGPT uses about a bottle’s worth of water every time we ask it something. My friend encouraged me to stop using the bot altogether.

Credit: The Washington Post

I was utterly baffled, just for different reasons. My friend was on the morally correct side of this conversation, and the choice seemed obvious: I should just stop using it. But I had become so used to having ChatGPT there that giving it up felt like chopping off an extra hand, a third hand that is immensely helpful. That’s true of so much of our tech. On top of that, when I did opt out of using it, I would just see ChatGPT open on many of the student laptops around me. So, what was the point?

As a Gen-Z college student, I like to think I care about the environment. I want our planet to stay the beauty it is now for many, many more years to come. I appreciate Earth’s colorful, thriving coral and sparkling lake days as much as the next person. But getting people, myself and my fellow college students included, to stop using ChatGPT completely is like trying to tear an iPad away from a toddler while they’re eating.

ChatGPT requires so much water because of the cooling processes for its data centers. Essentially, the prompts we ask require analyzing patterns across a large trove of already-existing human-made content. All of that computing takes a lot of electricity and generates a lot of heat. To keep servers cool, data centers pump water through evaporative cooling systems, where the water absorbs the heat and is lost as vapor, so it has to be continually replaced.
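For the curious, here is a rough back-of-envelope sketch of how a “bottle of water per question” estimate gets put together: energy per prompt, multiplied by the water used per unit of energy, both on site for cooling and off site to generate the electricity. Every number below is a placeholder I picked for illustration, not a measured or official figure.

```python
# Back-of-envelope sketch: water attributable to a single ChatGPT prompt.
# Every constant here is an illustrative assumption, not a measured or official figure.

ENERGY_PER_PROMPT_KWH = 0.003     # assumed: ~3 watt-hours of electricity per prompt
ONSITE_COOLING_L_PER_KWH = 1.8    # assumed: liters evaporated by on-site cooling per kWh
OFFSITE_WATER_L_PER_KWH = 3.0     # assumed: water footprint of generating that electricity

def water_per_prompt(energy_kwh: float, onsite: float, offsite: float) -> float:
    """Rough liters of water attributable to one prompt."""
    return energy_kwh * (onsite + offsite)

if __name__ == "__main__":
    liters = water_per_prompt(ENERGY_PER_PROMPT_KWH,
                              ONSITE_COOLING_L_PER_KWH,
                              OFFSITE_WATER_L_PER_KWH)
    bottle_ml = 500
    print(f"~{liters * 1000:.0f} mL per prompt")
    print(f"~{bottle_ml / (liters * 1000):.0f} prompts per {bottle_ml} mL bottle")
```

Depending on the numbers you plug in, whether you count the power plant’s water, and where the data center sits, the answer swings from a bottle per handful of prompts to a bottle per prompt, which is exactly why location matters so much.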

Credit: OECD.AI Policy Observatory

The thing is, water usage depends on location. A data center in Wisconsin, where the climate is cooler and more humid, will use less water for cooling than one in Los Angeles, where temperatures are higher and water resources are already strained. In fact, according to a 2022 report by the Uptime Institute, data centers in hot, dry regions can consume up to five times more water than those in cooler climates. So, why aren’t all data centers in Wisconsin? Data centers need to be close to the people they serve to reduce latency and improve performance. A server in the Midwest isn’t ideal if you're trying to stream a 4K video in L.A. without running into issues with loading time. Plus, real estate costs, tax incentives, fiber connectivity, energy availability, and even natural disaster risks—like earthquakes in California or hurricanes in Florida—all play a role in site selection. This is why big data center hubs exist in places like Northern Virginia, Phoenix, and Dallas, even though water use varies dramatically in each location.

Thus, the problem becomes improving the existing cooling technologies, which are mostly evaporative, so they use less water, even as more and more people use ChatGPT and other AI models daily. More users means more computing power, which generates more heat, which needs more cooling. But this cycle doesn’t necessarily require more water.

Credit: Pew Research Center

In contrast to evaporative cooling, a promising new method is emerging right now: liquid cooling.

Liquid cooling systems use significantly less water than evaporative cooling systems because liquid cooling operates in a closed loop. These systems reuse the same water continuously, while evaporative cooling constantly loses water to evaporation and has to be topped up. In fact, liquid cooling can use up to 90% less water than evaporative cooling.
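To make that difference concrete, here is a small sketch comparing the fresh water a data hall might draw over a year under each approach. The per-kilowatt-hour rates are assumptions I picked to roughly match the “up to 90% less” claim, not vendor or industry figures.

```python
# Sketch: cumulative fresh-water draw of evaporative vs. closed-loop liquid cooling.
# The per-kWh rates below are illustrative assumptions, not measured industry figures.

EVAPORATIVE_L_PER_KWH = 1.8    # assumed: water lost to evaporation per kWh of IT load
CLOSED_LOOP_L_PER_KWH = 0.15   # assumed: make-up water for a closed loop (top-ups, maintenance)

def yearly_water_m3(liters_per_kwh: float, it_load_kw: float) -> float:
    """Cubic meters of fresh water drawn by a data hall running all year at a constant load."""
    hours_per_year = 24 * 365
    return liters_per_kwh * it_load_kw * hours_per_year / 1000

if __name__ == "__main__":
    load_kw = 500  # assumed: a modest 500 kW data hall
    evap = yearly_water_m3(EVAPORATIVE_L_PER_KWH, load_kw)
    loop = yearly_water_m3(CLOSED_LOOP_L_PER_KWH, load_kw)
    savings = 100 * (1 - loop / evap)
    print(f"Evaporative: {evap:,.0f} m^3/year")
    print(f"Closed loop: {loop:,.0f} m^3/year ({savings:.0f}% less)")
```

The loop still needs electricity to move the coolant and reject the heat, so it isn’t free, but the water it touches mostly stays in the pipes instead of drifting away as vapor.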

Large, established companies like NVIDIA, Dell, and Lenovo are betting on liquid cooling, even as their AI platforms demand more and more computing power. That is a promising sign for the effectiveness of the new cooling technology, and for its potential as a genuine alternative to evaporative cooling.

NVIDIA’s Blackwell platform, for example, uses direct-to-chip liquid cooling for its graphics processing units, or GPUs, and is able to operate with enhanced efficiency, meaning less total power and less cooling. NVIDIA doubles down on this development by projecting a whopping 30% reduction in power consumption and a sizable reduction in carbon emissions compared to evaporative cooling. Performance per watt (PPW) is key to all of this: new technologies will need to deliver high PPW to help the data center sector transition to ever more power-hungry AI applications, which is the direction we seem to be moving in.
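If PPW sounds abstract, here is a tiny sketch of the idea: the same amount of AI work, delivered with less power in, means less heat out and therefore less cooling. The workload and power numbers below are made up for illustration; the only figure echoed from above is the projected 30% power reduction.

```python
# Tiny sketch of performance per watt (PPW): useful work delivered per unit of power.
# All numbers are illustrative assumptions; only the 30% power reduction echoes the text above.

def performance_per_watt(work_tflops: float, power_kw: float) -> float:
    """Work per unit of power, here in TFLOPS per kilowatt."""
    return work_tflops / power_kw

if __name__ == "__main__":
    workload_tflops = 1000.0            # assumed: sustained AI workload
    old_power_kw = 50.0                 # assumed: power draw of an evaporatively cooled setup
    new_power_kw = old_power_kw * 0.70  # the projected 30% reduction in power consumption

    print(f"Old: {performance_per_watt(workload_tflops, old_power_kw):.1f} TFLOPS/kW")
    print(f"New: {performance_per_watt(workload_tflops, new_power_kw):.1f} TFLOPS/kW")
    # Same work with less power in means less heat out, and less cooling (and water) needed.
```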

Likewise, Lenovo’s Neptune water-cooling system is designed to power AI more efficiently. Its approach combines direct water cooling with rear-door heat exchangers, which removes heat right at its source rather than transferring it through the air first, as evaporative setups do. Lenovo has projected energy savings of up to 40% and an even greater 95% decrease in cooling power usage, which is just another strong reason for the hype around liquid cooling. Taking its tech one step further than NVIDIA, Lenovo has even deployed Neptune across a vast number of operating servers and in server management settings, advancing its sustainability objectives while retaining computational performance.

But what does this mean for the future of ChatGPT? We first have to acknowledge that society won’t halt forward progress for the sake of the environment. And unfortunately, most advancements have good and bad sides beyond their environmental footprint. Even something as widespread and useful as the Internet (which we can hardly fathom life without) has communities on the dark web using it in immoral and illegal ways. ChatGPT has similarly questionable uses: identity impersonation, fake news, cyberbullying, and so much more. Liquid cooling won’t solve those problems. And regardless of the negative outcomes ChatGPT can facilitate, people will continue to use it.

So I simply propose we take steps to make our use of it more sustainable, so as not to add to its negative impacts. Switching from evaporative cooling to liquid cooling will take years, perhaps decades, on top of the added time needed to develop and test the new technologies. But it’s better to learn about and start investing in this transition now, to minimize how long it takes. There is no start date on the journey to sustainability.