Does Every ChatGPT Query Really Deplete the Earth’s Water? AI’s Environmental Impact in Context

Everyone is talking about AI, online and in person, and for different reasons. Some are excited about the incredible possibilities it offers, from boosting workplace productivity to revolutionising entire industries. Others are casually curious about its capabilities, wondering how far it can go and what it means for their businesses. Then there are the concerns – will AI take away human jobs? How secure is it and will AI introduce new risks into the business? What happens when attacks, phishing and misinformation become even harder to detect?
But recently, I noticed a different kind of concern popping up on social media, one I hadn’t thought about before. These posts weren’t about AI replacing human workers or the risks of bad actors using it. They were about AI’s environmental impact, specifically its use of energy and water.
While I’m personally all about embracing the possibilities of technology, I also believe in practical, meaningful environmental protection. So, when I saw people claiming that every AI query is depleting Earth’s water resources, it caught my attention. Was this really a problem? How bad is it? And is there more to the story?
So, I decided to dig deeper. What I found, as is so often the case, is that the truth isn’t as simple as a one-liner on social media. AI’s environmental impact is a nuanced issue, and if businesses and organisations want to balance technological progress with sustainability, they need a clearer, fact-based understanding of what’s really happening.
This is the post that caught my attention. It makes a bold statement that seeks to attribute incremental environmental damage to each individual ChatGPT query.
“Each time you ask ChatGPT a question, you deplete the Earth’s water resources. Our interactions with AI have a terrible, hidden cost.”
That’s an alarming thought, for sure. I care enough about the environment to be concerned, but I’m savvy enough to be sceptical of a random social media post, so I took the time to digest the whole argument and then dig a bit deeper.
The slide deck in the post goes on to characterise AI as a serious environmental threat, particularly regarding the consumption of water used for cooling data centres. Screenshots of the slides are posted here in full, but in short, the deck claims that a handful of ChatGPT queries consumes roughly 500 millilitres of water, that AI-driven data centres are worsening water scarcity in places like California, India and the Netherlands, and that AI could consume 6.6 billion cubic metres of water by 2027.
That definitely paints a grim picture. So, let’s try to unpack these statements and see if they are accurate or otherwise.
As AI adoption grows, so do discussions about its environmental impact, particularly its water use. While data centres do require water for cooling, the scale and context are often misunderstood, making it important to separate fact from assumption in this debate.
First, it’s absolutely true that AI-powered data centres consume water to cool their servers. The thing is, so do traditional cloud computing centres, internet infrastructure, and even everyday electronics that need cooling.
This really needs some perspective.
Google searches, surfing YouTube or just ‘Netflix and chill’ all require server cooling, which means they also consume water. In fact, given that streaming video for an hour consumes significantly more energy than a ChatGPT session, its water usage would be proportionally higher. And while a ChatGPT query does currently use about 8x the water of a Google search, it should be noted that Google handles about 8x as many searches, so the two services’ total water footprints come out roughly even, at least for now.
Manufacturing a single smartphone requires approximately 12,670 litres of water. Even if we generously assume each ChatGPT query uses a full 500 millilitres, it would take around 25,340 queries to match the water footprint of producing one phone. If you made 5 queries per day, it would take nearly 14 years for your AI usage to equal the water consumed in making the phone you’re using to ask those queries. But of course, you would probably replace that phone four times over the same period, quadrupling the water consumed by your devices.
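For those who like to check the sums, here is a quick back-of-envelope sketch in Python using the figures quoted above. The 500-millilitre-per-query rate is a deliberately pessimistic assumption (the research cited below puts it at 500 millilitres per 5-50 queries), so if anything this overstates AI’s share.

```python
# Back-of-envelope check of the smartphone comparison above.
# Figures are the rough estimates quoted in this article, not precise measurements.

LITRES_PER_PHONE = 12_670   # water to manufacture one smartphone
LITRES_PER_QUERY = 0.5      # pessimistic assumption: ~500 ml per ChatGPT query
QUERIES_PER_DAY = 5         # the modest daily usage assumed above

queries_to_match_phone = LITRES_PER_PHONE / LITRES_PER_QUERY
days_to_match_phone = queries_to_match_phone / QUERIES_PER_DAY
years_to_match_phone = days_to_match_phone / 365

print(f"Queries to match one phone: {queries_to_match_phone:,.0f}")           # ~25,340
print(f"Years at {QUERIES_PER_DAY} queries/day: {years_to_match_phone:.1f}")  # ~13.9
```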
So, while AI does have a water footprint, it’s not actually unique in that regard.
The claim that 5-50 ChatGPT queries use 500 millilitres of water is based on research from the University of California, Riverside. If true, this suggests that millions of AI queries per day do add up.
But context is everything. There is little New Zealanders love more than their coffee. How much water does your morning coffee take? About 8 ounces? Well, according to the United Nations Food and Agriculture Organisation, it’s more like 140 litres. That’s right: from cultivating the beans to brewing the beverage, it takes about 140 litres of water to put that coffee in your hand, equal to 280 ChatGPT queries.
If you think that’s bad, check out the data on hamburgers. According to the Water Footprint Network, a single hamburger requires 2,498 litres of water to produce (from growing feed crops, raising the cattle and processing the meat to assembling the final product), equivalent to roughly 5,000 AI queries.
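Running the same rough sums over the coffee and hamburger figures, again assuming a pessimistic 500 millilitres per query, gives a sense of scale:

```python
# Back-of-envelope comparison of everyday water footprints with AI queries.
# Figures are the rough estimates quoted above, not precise measurements.

LITRES_PER_QUERY = 0.5        # pessimistic assumption: ~500 ml per ChatGPT query
LITRES_PER_COFFEE = 140       # UN FAO estimate, bean to cup
LITRES_PER_HAMBURGER = 2_498  # Water Footprint Network estimate

print(f"One coffee    ≈ {LITRES_PER_COFFEE / LITRES_PER_QUERY:,.0f} queries")      # ~280
print(f"One hamburger ≈ {LITRES_PER_HAMBURGER / LITRES_PER_QUERY:,.0f} queries")   # ~4,996
```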
Does AI use water? Yes. Is it a catastrophic, hidden disaster? Not quite.
The post implies that AI-driven data centres are worsening water scarcity issues in places like California, India, and the Netherlands.
But it turns out to be stretching things a bit to lay all of that at AI’s doorstep. In fact, many AI data centres are located in regions where water is abundant. Google and Microsoft, for example, strategically build their facilities in cooler, water-rich areas to reduce reliance on scarce water sources.
The largest consumer of freshwater worldwide is, in fact, agriculture, not tech. In researching this, I was gratified to learn that AI-powered smart irrigation systems are actually helping reduce global water waste.
The projection that AI could consume 6.6 billion cubic metres of water by 2027 is significant, but it’s important to remember that these projections are not certainties. Nor do they account for further innovation.
Big Tech is actively working on cooling innovations, including air cooling, liquid cooling and even submersion cooling, to cut down water use. Renewable energy adoption is reducing reliance on traditional cooling infrastructure, and water recycling systems are becoming more common in data centres, allowing water to be filtered and reused.
For organisations interested in both AI and sustainability, the key takeaway is this:
AI is not a perfect technology for sustainability, but its environmental impact must be viewed in context.
If businesses want to mitigate AI’s water footprint while still leveraging its power, they can take the following steps:
Having good internal policies on AI usage will help us use AI efficiently by prioritising high-value tasks and minimising frivolous usage. Also, training staff in the ‘art of prompting’ will make their AI queries more effective and reduce the number of corrections they need to run through the AI system.
Microsoft has committed to working toward 100% renewable-powered AI, as have Google and AWS.
Find ways to use AI within your business to reduce waste, streamline supply chains, and optimise energy consumption, lowering your overall environmental footprint.
The idea that “each ChatGPT query depletes the Earth’s water” is a simplistic take on a complex and nuanced issue.
Yes, AI has an environmental footprint. But so does every digital action, from sending an email to watching your favourite show on Netflix. The difference is that AI is also helping businesses, industries, and even environmental scientists find ways to be more productive and even more sustainable.
Instead of blindly sharing posts that paint AI as the boogeyman, take a step back, look at the data, and think critically, because the real world is rarely as simple as a catchy headline. Let’s instead focus on practical solutions like optimising AI’s efficiency, supporting sustainable AI providers, and using AI in ways that enhance, rather than harm, our planet.
Technology and sustainability are not at odds; they must evolve together.