By Shadow, Jointing.Media, in Yunnan, 2026-02-18
When we use ChatGPT, DeepSeek or Doubao for free, few of us consider that each seamless conversation leaves an environmental bill somewhere on our planet. This bill demands no immediate payment, yet it quietly accumulates in the form of carbon emissions, water consumption and electronic waste, becoming the heaviest "free cost" of the digital age.
I. The Paradox of “Free”: Why More Free Means More Pollution
In traditional economics, free models often stimulate boundless expansion of demand, and the AI industry exemplifies this phenomenon. When the marginal cost of usage approaches zero, users' rational constraints vanish: we might generate ten responses to test a single joke, have AI repeatedly revise inconsequential copy, or open multiple AI applications simultaneously for idle conversation out of curiosity.
Underpinning this behaviour lies a stark physical reality: every seemingly free AI interaction consumes tangible electricity within data centre servers. Research on US data centres estimates that a single simple AI conversation (such as interacting with a large language model) may indirectly consume half a litre of fresh water and generate corresponding carbon emissions (actual figures vary based on the energy mix and water cooling technology at the data centre’s location). When hundreds of millions of users engage in billions of such interactions daily, the cumulative effect becomes astronomical.
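The scale of this cumulative effect can be sketched with simple arithmetic. The half-litre figure comes from the research cited above; the daily interaction volume is a hypothetical round number for illustration, not a measured statistic.

```python
# Back-of-envelope scale of the "free" water bill. The per-conversation
# figure follows the research cited above; the daily volume is an
# assumed illustration, not a measured statistic.
LITRES_PER_CONVERSATION = 0.5          # indirect freshwater use per conversation
CONVERSATIONS_PER_DAY = 1_000_000_000  # assumed: one billion interactions/day

daily_litres = LITRES_PER_CONVERSATION * CONVERSATIONS_PER_DAY
olympic_pools = daily_litres / 2_500_000  # one Olympic pool holds ~2.5 million L

print(f"{daily_litres / 1e6:.0f} million litres/day "
      f"(~{olympic_pools:.0f} Olympic swimming pools)")
```

Under these assumptions, the industry's indirect water draw comes to some 500 million litres per day, roughly 200 Olympic swimming pools.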
This is a textbook rebound effect: efficiency gains in the technology (the falling cost of computing power) should have reduced resource consumption, yet the free model has lowered the usage threshold so far that demand surges instead. Nor would charging for access change the trajectory: as long as the value AI creates exceeds its usage cost, rapid growth in demand remains inevitable.
II. The Cost of Computing Power: The Energy Chain from Chips to Power Grids
The operation of AI relies on formidable computing power, underpinned by a comprehensive energy consumption chain. Take training a large model like GPT-3 as an example: a single training run consumes electricity equivalent to the annual usage of approximately 130 American households. Yet the cumulative effect of routine usage warrants greater attention: while a single inference operation consumes minimal energy, multiplied across billions of daily interactions, inference accounts for the vast majority of AI's total energy consumption.
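A quick sanity check makes both comparisons concrete. The training-run figure of roughly 1,300 MWh and the average US household consumption of about 10 MWh per year are commonly cited estimates consistent with the 130-household comparison above; the per-query energy and daily query volume are assumptions for illustration only.

```python
# Sanity check on the "130 households" comparison and on why inference
# dominates. TRAINING_MWH and HOUSEHOLD_MWH_PER_YEAR are commonly cited
# estimates; the per-query and per-day figures are assumptions.
TRAINING_MWH = 1_300             # assumed GPT-3-class training run
HOUSEHOLD_MWH_PER_YEAR = 10      # assumed average US household

households = TRAINING_MWH / HOUSEHOLD_MWH_PER_YEAR

WH_PER_QUERY = 0.3               # assumed energy per inference query (Wh)
QUERIES_PER_DAY = 1_000_000_000  # assumed daily query volume

inference_mwh_per_day = WH_PER_QUERY * QUERIES_PER_DAY / 1e6
days_to_match_training = TRAINING_MWH / inference_mwh_per_day

print(f"training ≈ {households:.0f} households' annual electricity")
print(f"inference matches one training run every {days_to_match_training:.1f} days")
```

Under these assumptions, routine inference burns through the equivalent of an entire training run every few days, which is why the usage phase dominates the total.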
Every link in this chain compounds environmental burdens:
Chip manufacturing: Producing high-end GPUs constitutes an energy-intensive, high-pollution industry requiring substantial scarce minerals and ultra-pure water.
Data centre operations: Servers operating at peak capacity generate immense thermal energy, necessitating continuous electrical cooling.
Grid strain: The proliferation of AI data centres is altering regional electricity demand curves, compelling grids to maintain greater fossil fuel generation capacity.
Measured by the core metric of Power Usage Effectiveness (PUE), modern data centres can achieve values around 1.2 (for every kilowatt-hour used in computation, a further 0.2 kWh is consumed by cooling and auxiliary facilities). Nevertheless, even as PUE improves, absolute energy consumption is surging dramatically alongside the explosive growth in AI demand.
III. Invisible Emissions: From Carbon Footprint to Water Footprint
The environmental impact of AI extends far beyond carbon emissions. A comprehensive life-cycle assessment framework reveals additional ‘invisible costs’.
Operational carbon emissions represent the most immediately apparent component. Research on large models indicates that the usage phase accounts for approximately 96% of their climate change impact. However, relying solely on figures reported by technology companies can be profoundly misleading: analysis suggests that once practices such as green-certificate purchases are factored in, actual grid-based emissions may be as much as 662% higher than the figures companies report.
Embedded carbon emissions prove even more insidious. During GPT-4 training, the hardware manufacturing phase accounted for 94% of 'human toxicity (cancer)' impacts and 81% of 'freshwater eutrophication' impacts. This means that by the time we use an AI service, the bulk of the environmental cost was already locked in the moment the chips left the factory.
Water footprint has only recently gained prominence as a metric. Data centre cooling consumes substantial freshwater resources, with this impact being particularly pronounced in water-scarce regions. In Spain, regulators have formally incorporated water consumption as a core metric for AI systems. Computational models indicate that AI’s water footprint comprises two components: on-site cooling water usage (Scope 1) and water consumption during electricity generation (Scope 2). Together, these constitute the ‘water cost’ of each AI interaction.
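The Scope 1 / Scope 2 decomposition described above lends itself to a simple per-interaction model. All numeric values below are illustrative assumptions (energy per conversation, on-site water-usage effectiveness, and grid water intensity), chosen only to show how the two scopes combine into a figure of the half-litre order cited earlier.

```python
# Water footprint = Scope 1 (on-site cooling) + Scope 2 (electricity
# generation), per the decomposition in the text. All coefficients are
# illustrative assumptions, not measured values.
def water_footprint_l(energy_kwh: float,
                      onsite_wue_l_per_kwh: float,
                      grid_water_l_per_kwh: float) -> float:
    """Total litres of freshwater attributable to an AI task."""
    scope1 = energy_kwh * onsite_wue_l_per_kwh   # cooling water at the site
    scope2 = energy_kwh * grid_water_l_per_kwh   # water used generating power
    return scope1 + scope2

# Assumed: 0.1 kWh per multi-turn conversation, 1.8 L/kWh on-site WUE,
# 3.1 L/kWh water intensity of the local grid.
per_conversation_litres = water_footprint_l(0.1, 1.8, 3.1)
print(f"~{per_conversation_litres:.2f} L per conversation")
```

With these assumed coefficients the model yields about 0.49 litres per conversation, consistent in magnitude with the half-litre figure discussed earlier, though any real estimate depends heavily on the local grid and cooling technology.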
IV. The Discarded Future: The AI-Accelerated E-Waste Crisis
If energy consumption represents the present cost, electronic waste constitutes the future liability. The iteration speed of AI hardware is astonishingly rapid: the latest GPU may be superseded by a more powerful model within less than two years of its release. While the lead, mercury and cadmium content of advanced logic chips is now strictly controlled through lead-free processes (such as lead-free solder), the primary environmental risks of modern chip manufacturing stem from the production processes themselves, such as perfluorinated compounds and heavy-metal wastewater discharge. And where do discarded servers, chips and supporting equipment ultimately end up?
A study led by the Chinese Academy of Sciences projects that generative AI will cumulatively generate between 1.2 million and 5 million tonnes of electronic waste between 2023 and 2030. This volume equates to the weight of millions of cars. Should this e-waste enter informal recycling channels, toxic substances released during open-air incineration or acid leaching to extract precious metals could severely contaminate soil and groundwater, posing direct health threats to local communities.
More concerning, manufacturing AI chips consumes vast quantities of scarce minerals. The extraction of these resources itself inflicts significant environmental damage, and once soldered onto circuit boards, their recovery rates are extremely low. Implementing circular economy strategies, such as recycling and reuse, could reduce this figure by 16% to 86%. In practice, however, rapid technological iteration and economic considerations often relegate recycling to a secondary option.
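Combining the projected 1.2 to 5 million tonne range with the 16% to 86% reduction bounds gives a rough interval for the residual waste stream. This is simple interval arithmetic over the figures quoted above, not a new estimate.

```python
# Interval arithmetic over the figures in the text: projected cumulative
# e-waste of 1.2-5 million tonnes, and circular-economy reductions of
# 16%-86%. Pairing extremes gives the best- and worst-case residuals.
LOW_T, HIGH_T = 1.2e6, 5.0e6       # projected cumulative e-waste (tonnes)
BEST_CUT, WORST_CUT = 0.86, 0.16   # reduction bounds from circular strategies

best_case = LOW_T * (1 - BEST_CUT)     # smallest total, deepest reduction
worst_case = HIGH_T * (1 - WORST_CUT)  # largest total, shallowest reduction

print(f"residual e-waste: {best_case / 1e6:.2f}-"
      f"{worst_case / 1e6:.2f} million tonnes")
```

Even in the best case, roughly 170,000 tonnes would remain; in the worst case, over 4 million tonnes.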
V. How to Settle the Accounts: The Evolution of Quantitative Methods
Faced with such complex environmental impacts, scientists are developing increasingly sophisticated calculation methods. The current mainstream framework is life-cycle assessment, which tracks the full environmental cost of AI systems from cradle to grave.
Calculating the environmental impact of a single AI interaction requires four steps:
Define boundaries: Should only the training phase be included, or should long-term inference services be incorporated? Should hardware manufacturing be considered?
Data collection: This includes electricity consumption for AI tasks (kWh), data centre PUE values, local grid carbon emission factors and water intensity, alongside hardware manufacturing emissions data.
Tool selection: Open-source tools like Carbontracker or CodeCarbon can automatically monitor energy consumption and estimate carbon emissions; more advanced research employs AI systems such as Chat-LCA to compress assessments traditionally taking weeks into hours.
Breakdown and aggregation: Calculate operational carbon, embedded carbon, water footprint, and electronic waste separately, synthesising these into a multidimensional environmental ledger.
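The four steps above can be sketched as a single per-task ledger. The structure follows the breakdown in the text (operational carbon, embedded carbon, water footprint, e-waste); every coefficient value in the example call is a hypothetical placeholder to be replaced with measured data for a real deployment.

```python
# A minimal per-task environmental ledger following the four steps above.
# All coefficient values in the example are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class EnvLedger:
    operational_co2_kg: float  # grid carbon from electricity consumed
    embedded_co2_kg: float     # amortised hardware-manufacturing carbon
    water_l: float             # Scope 1 + Scope 2 water use
    ewaste_g: float            # amortised end-of-life hardware mass

def assess(task_kwh: float, pue: float, grid_kgco2_per_kwh: float,
           water_l_per_kwh: float, embedded_co2_kg: float,
           ewaste_g: float) -> EnvLedger:
    facility_kwh = task_kwh * pue          # step 2: apply the PUE overhead
    return EnvLedger(                      # step 4: break down and aggregate
        operational_co2_kg=facility_kwh * grid_kgco2_per_kwh,
        embedded_co2_kg=embedded_co2_kg,
        water_l=facility_kwh * water_l_per_kwh,
        ewaste_g=ewaste_g,
    )

# Hypothetical inputs: a 0.1 kWh task in a PUE-1.2 facility on a grid
# emitting 0.4 kgCO2/kWh with a water intensity of 4 L/kWh.
ledger = assess(task_kwh=0.1, pue=1.2, grid_kgco2_per_kwh=0.4,
                water_l_per_kwh=4.0, embedded_co2_kg=0.005, ewaste_g=0.02)
print(ledger)
```

Keeping the four dimensions separate, rather than collapsing them into a single score, is what makes the result a multidimensional ledger in the sense described above.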
Through these methods, researchers can derive concrete figures such as ‘a single AI query consumes half a litre of water,’ rendering abstract costs tangible.
VI. Conclusion: Confronting the Costs Behind Free Services
The environmental impact of free AI is not an anti-technology argument, but a reality we must acknowledge. As we enjoy technological conveniences, we simultaneously bear responsibility as ‘environmental accomplices’.
This does not mean abandoning AI development—quite the contrary, confronting the issue is the first step towards resolution. The way forward lies in: promoting greater use of renewable energy in data centres, enhancing the energy efficiency of chips, extending hardware lifespans, and establishing mandatory environmental disclosure systems. As users, consciously reducing unnecessary AI usage is itself a contribution.
After all, nothing is truly free. Behind every click, an environmental bill is being generated, and that bill is signed by us all. Recognising this reality is not merely about restraining our individual clicks, but also about holding the tech giants accountable: Are they using green electricity? Have they optimised their algorithms? Are they recycling electronic waste? Only when individual awareness aligns with systemic transformation can this bill truly be settled.
Note: This article is based on research into AI industry energy consumption models, life-cycle assessment methodologies, and environmental economics, aiming to stimulate public discourse on the environmental costs of digital technologies.
Translated by Youdao and DeepL
Edited by Jas



