
Cut the words “please” and “thank you” from your next ChatGPT query and, if you believe some of the talk online, you might think you are helping save the planet.
The idea sounds plausible because AI systems process text incrementally: longer prompts require slightly more computation and therefore use more energy. OpenAI’s chief executive Sam Altman has acknowledged that, at the scale of billions of prompts, this all adds to operating costs.
At the same time, it is a stretch to suggest that treating ChatGPT politely comes at significant environmental cost. The effect of a few extra words is negligible compared with the energy required to operate the underlying data centre infrastructure.
What is more important, perhaps, is the persistence of the idea. It suggests that many people already sense AI is not as immaterial as it appears. That instinct is worth taking seriously.
AI’s hidden environmental footprint
Artificial intelligence depends on large data centres built around high-density computing infrastructure. These facilities draw substantial electricity, require continuous cooling, and are embedded in wider systems of energy supply, water and land use.
The scale of that demand is no longer marginal. Research published in the journal Science estimates that data centres already account for a significant share of global electricity consumption, with demand rising rapidly as AI workloads grow.
The International Energy Agency has warned that electricity demand from data centres could double by the end of the decade under current growth trajectories.
Electricity is only one part of the picture. Data centres also require large volumes of water for cooling, and their construction and operation involve land, materials and long-lived assets. These impacts are experienced locally, even when the services provided are global.
Why every AI query carries an energy cost
One structural difference between AI and most familiar digital services helps explain why this matters.
When a document is opened or a stored video is streamed, the main energy cost has already been incurred: the system is largely retrieving existing data.
By contrast, each time an AI model is queried it must perform a fresh computation to generate a response. In technical terms, each prompt triggers a new “inference” – a full computational pass through the model – and that energy cost is incurred every time.
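To make “a full computational pass” concrete, here is a schematic sketch of how generation works. It is illustrative only – a toy stand-in, not any vendor’s actual serving code – but it shows why compute, and therefore energy, scales with use: in practice the model runs a fresh pass for every token it generates.
```python
import random

# Toy stand-in for a language model's forward pass. A real model would
# score every vocabulary item using billions of arithmetic operations;
# the shape of the loop below is the point, not the "model" itself.
VOCABULARY = ["the", "energy", "cost", "adds", "up", "<eos>"]

def toy_forward_pass(context: list[str]) -> str:
    """One full pass through the model, conditioned on the whole context."""
    return random.choice(VOCABULARY)

def generate(prompt: list[str], max_new_tokens: int = 20) -> list[str]:
    tokens = list(prompt)
    passes = 0
    for _ in range(max_new_tokens):
        next_token = toy_forward_pass(tokens)  # a fresh pass for each token
        passes += 1
        if next_token == "<eos>":
            break
        tokens.append(next_token)
    print(f"One prompt answered; {passes} forward passes performed.")
    return tokens

generate(["please", "explain", "ai", "energy", "use"])
```
Nothing is retrieved from a shelf; the answer is computed into existence, pass by pass.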
This is why AI behaves less like conventional software and more like infrastructure. Use translates directly into energy demand.
As AI use expands, so does this underlying footprint. The environmental question, then, is not how individual prompts are phrased, but how frequently and intensively these systems are used.
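The orders of magnitude are easier to see with a rough calculation. The figures below are assumptions chosen for illustration – per-query energy estimates vary widely by model and data centre – but they show both sides of the argument: politeness is a rounding error within one query, while billions of queries register at grid scale.
```python
# Back-of-envelope sketch. Every constant here is an illustrative
# assumption, not a measured value.
ENERGY_PER_QUERY_WH = 0.3   # assumed average energy per chat query (watt-hours)
TOKENS_PER_QUERY = 750      # assumed average tokens processed per query
POLITE_TOKENS = 4           # roughly what "please" and "thank you" add

# Assume energy scales linearly with tokens processed.
polite_overhead_wh = POLITE_TOKENS * ENERGY_PER_QUERY_WH / TOKENS_PER_QUERY
print(f"Politeness overhead: {polite_overhead_wh:.4f} Wh per query "
      f"({100 * polite_overhead_wh / ENERGY_PER_QUERY_WH:.2f}% of one query)")

# At the scale of billions of prompts, small per-query figures add up.
QUERIES_PER_DAY = 2e9       # assumed global daily query volume
daily_mwh = QUERIES_PER_DAY * ENERGY_PER_QUERY_WH / 1e6
print(f"Fleet-wide: {daily_mwh:,.0f} MWh per day across {QUERIES_PER_DAY:.0e} queries")
```
On these assumed numbers, the polite words account for well under 1% of a single query’s energy, while the aggregate demand is what utilities and grid planners actually have to serve.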
Why the myth matters
Viewed through a systems lens, AI introduces a new metabolic load into regions already under strain from climate change, population growth and competing resource demands.
Such pressures do not simply accumulate. They can drive reorganisation.
In some cases, that reorganisation produces more coherent and resilient arrangements; in others, it amplifies existing vulnerabilities. Which outcome prevails depends largely on whether the pressure is recognised early and incorporated into system design, or allowed to build unchecked.
This matters for climate adaptation and long-term planning. Much adaptation work focuses on land and infrastructure: managing flood risk, protecting water quality, maintaining reliable energy supply and designing resilient settlements.
Energy, water, land and infrastructure are tightly coupled. Changes in one part of the system propagate through the rest.
Yet AI infrastructure is often planned and assessed separately, as if it were merely a digital service rather than a persistent physical presence with ongoing resource demands.
New Zealand offers a clear illustration. Its high share of renewable electricity makes it attractive to data centre operators, but this does not make new demand impact-free.
Large data centres can place significant pressure on local grids, and claims of renewable supply do not always correspond to new generation being added. Electricity used to run servers is electricity not available for other uses, particularly in dry years when hydro generation is constrained.
This is where discussion of AI’s environmental footprint needs to mature. Focusing on small behavioural tweaks, such as how prompts are phrased, distracts from the real structural issues.
The more consequential questions concern how AI infrastructure is integrated into energy planning, how its water use is managed, how its location interacts with land-use priorities, and how its demand competes with other social needs.
None of this implies that AI should be rejected. AI already delivers value across research, health, logistics and many other domains.
But, like any infrastructure, it carries costs as well as benefits. Treating AI as immaterial software obscures those costs. Treating it as part of the physical systems we already manage brings them into view.
The popularity of the “please” myth is therefore less a mistake than a signal. People sense AI has a footprint, even if the language to describe it is still emerging.
Taking that signal seriously opens the door to a more grounded conversation about how AI fits into landscapes, energy systems and societies already navigating the limits of adaptation.

