The rapid proliferation of artificial intelligence across industries has sparked urgent conversations about its environmental footprint. As organisations race to deploy AI solutions, the energy demands of training and running these systems have grown exponentially, raising critical questions about sustainability in the age of intelligent machines.
According to research published by the International Energy Agency, data centres worldwide consumed approximately 460 terawatt-hours of electricity in 2022, representing nearly 2% of global electricity demand. With AI workloads driving much of the growth in data centre capacity, this figure is projected to double by 2026, potentially reaching 1,000 TWh annually—equivalent to Japan’s entire electricity consumption.
The Carbon Cost of Computation
Training large language models requires extraordinary computational resources. A landmark study by researchers at the University of Massachusetts Amherst, published in 2019, found that training a single large AI model can emit more than 284 tonnes of carbon dioxide equivalent, roughly five times the lifetime emissions of an average car, including its manufacture.
More recent analysis paints an even starker picture. Research from Nature Machine Intelligence indicates that the computational requirements for AI training have been doubling every 3.4 months since 2012, far outpacing Moore’s Law. This exponential growth in compute demand translates directly into increased energy consumption and associated carbon emissions.
The environmental impact extends beyond electricity. AI hardware relies on rare earth elements and precious metals, the extraction of which carries significant environmental costs. A McKinsey analysis notes that semiconductor manufacturing requires substantial quantities of water—a single fabrication facility can consume 10 million gallons daily—along with chemicals whose production and disposal present additional environmental challenges.
Data Centre Infrastructure: The Hidden Footprint
The physical infrastructure supporting AI operations represents a substantial and often overlooked component of its environmental impact. Modern data centres require sophisticated cooling systems to prevent hardware failure, and in less efficient facilities this cooling overhead can rival the energy drawn by the computational equipment itself.
Research from Gartner indicates that hyperscale data centres—the facilities most commonly used for AI training—have grown from 259 facilities globally in 2015 to more than 700 in 2023. Each of these facilities requires between 20 and 50 megawatts of power capacity, with the largest consuming over 100 megawatts.
Water consumption presents another critical concern. A study from the University of California, Riverside, published in 2023, estimated that training GPT-3 alone consumed approximately 700,000 litres of fresh water for cooling. The researchers noted that a conversation of 20 to 50 questions with ChatGPT requires roughly 500 millilitres of water—equivalent to a standard water bottle.
Geographic location significantly influences environmental impact. Data centres in regions powered predominantly by renewable energy produce far fewer emissions than those relying on fossil fuel-generated electricity. According to Google’s 2023 Environmental Report, the carbon intensity of electricity varies by a factor of 50 across different grid regions, making location decisions crucial for sustainable AI operations.
The Inference Challenge
While training receives substantial attention in environmental discussions, inference—the process of running trained models to generate outputs—may ultimately prove more consequential. Once deployed, AI systems serve millions or billions of requests, and the cumulative energy consumption of inference often exceeds that of initial training.
Analysis from Goldman Sachs suggests that a single ChatGPT query consumes approximately 2.9 watt-hours of electricity, compared to 0.3 watt-hours for a standard Google search—nearly ten times the energy requirement. With hundreds of millions of queries processed daily across various AI platforms, the aggregate consumption is substantial.
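The scale of that aggregate consumption follows from simple arithmetic. The sketch below uses the per-query figures reported above; the daily query volume is an illustrative assumption, not a reported number:

```python
# Back-of-the-envelope energy estimate for AI inference at scale.
# Per-query figures are the reported values cited above; the daily
# query volume is an illustrative assumption.

WH_PER_AI_QUERY = 2.9     # watt-hours per ChatGPT-style query (reported)
WH_PER_SEARCH = 0.3       # watt-hours per standard web search (reported)
QUERIES_PER_DAY = 200e6   # assumed daily query volume (hypothetical)

daily_kwh = WH_PER_AI_QUERY * QUERIES_PER_DAY / 1000
annual_gwh = daily_kwh * 365 / 1e6

print(f"Energy ratio: {WH_PER_AI_QUERY / WH_PER_SEARCH:.1f}x per query")
print(f"Daily consumption: {daily_kwh:,.0f} kWh")
print(f"Annual consumption: ~{annual_gwh:.0f} GWh")
```

Even under this modest assumed volume, the annual total runs to hundreds of gigawatt-hours, which is why per-query efficiency matters at scale.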
The inference challenge grows more acute as AI capabilities expand. Multimodal models processing images, video, and audio require significantly more computational resources than text-only systems. Bloomberg Intelligence projects that AI-related electricity consumption could account for 3% of global electricity demand by 2030 if current growth trajectories continue.
Hardware Lifecycle and E-Waste
The environmental impact of AI extends throughout the hardware lifecycle. Graphics processing units (GPUs) and tensor processing units (TPUs) used for AI workloads have relatively short operational lifespans, typically three to five years before replacement becomes necessary due to performance requirements or efficiency improvements.
The United Nations Global E-Waste Monitor 2024 reports that electronic waste reached 62 million tonnes globally in 2022, with only 22.3% formally collected and recycled. AI hardware contributes to this growing waste stream, and the specialised nature of AI accelerators makes recycling particularly challenging.
Manufacturing new AI hardware also carries significant environmental costs. Semiconductor fabrication is among the most energy-intensive and water-intensive industrial processes. A Taiwan Semiconductor Manufacturing Company report indicates that producing a single 300mm wafer requires approximately 2,200 gallons of ultra-pure water and substantial quantities of specialised chemicals.
Sustainable AI Practices: A Framework for Responsible Development
Organisations committed to sustainable AI development are adopting several strategies to reduce environmental impact. Model efficiency has emerged as a critical focus area, with researchers developing techniques to achieve comparable performance with significantly reduced computational requirements.
Research from Meta AI demonstrates that careful model architecture design can reduce training costs substantially. Their LLaMA models achieved performance comparable to much larger systems while requiring significantly less compute. Similarly, techniques such as knowledge distillation, pruning, and quantisation enable organisations to deploy smaller, more efficient models without sacrificing capability.
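Of these techniques, post-training quantisation is the simplest to illustrate: weights stored as 32-bit floats are mapped to 8-bit integers, cutting memory per parameter fourfold and often reducing energy per inference. A minimal NumPy sketch of symmetric quantisation (production systems use framework-level tooling and per-channel scales):

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric post-training quantisation of float32 weights to int8."""
    scale = np.abs(weights).max() / 127.0   # map the largest weight to +/-127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximation of the original weights."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(0, 0.02, size=(256, 256)).astype(np.float32)  # toy weight matrix

q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

print(f"Memory: {w.nbytes} bytes -> {q.nbytes} bytes")  # 4x reduction
print(f"Max reconstruction error: {np.abs(w - w_hat).max():.6f}")
```

The reconstruction error is bounded by half the quantisation step, which is why int8 inference typically preserves accuracy while quartering memory traffic.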
Transfer learning and fine-tuning represent additional efficiency strategies. Rather than training models from scratch, organisations can adapt pre-trained foundation models for specific tasks, reducing computational requirements by orders of magnitude. A Stanford University study found that fine-tuning approaches can achieve comparable performance to full training while reducing compute requirements by 99% or more.
Renewable Energy and Carbon-Aware Computing
Powering AI infrastructure with renewable energy represents the most direct path to reducing carbon emissions. Major cloud providers have made substantial commitments in this area. Microsoft has pledged to be carbon negative by 2030, while Amazon Web Services aims to power operations with 100% renewable energy by 2025.
Carbon-aware computing—scheduling workloads to align with periods of high renewable energy availability—offers additional emissions reduction potential. Research from World Resources Institute suggests that intelligent workload scheduling could reduce carbon emissions from AI training by 30% or more without affecting performance.
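At its simplest, carbon-aware scheduling is a windowed search over a grid-intensity forecast: given hourly gCO2/kWh values, start the job in the contiguous window with the lowest total intensity. A sketch with made-up forecast numbers (real systems would pull live grid data from a carbon-intensity API):

```python
def best_start_hour(intensity_forecast, job_hours):
    """Return the start index minimising total grid carbon intensity
    (gCO2/kWh) over a contiguous job window."""
    windows = range(len(intensity_forecast) - job_hours + 1)
    return min(windows,
               key=lambda s: sum(intensity_forecast[s:s + job_hours]))

# Hypothetical 24-hour forecast: high overnight (fossil-heavy),
# low mid-day (solar-heavy). Values are illustrative, not real grid data.
forecast = [420, 410, 400, 390, 380, 350, 300, 250,
            180, 120,  90,  80,  85, 100, 150, 220,
            300, 360, 400, 430, 450, 460, 455, 440]

start = best_start_hour(forecast, job_hours=4)
print(f"Best 4-hour window starts at hour {start}")  # hour 10 in this forecast
```

Deferring a deferrable training job from the overnight peak to the mid-day trough in this toy forecast cuts the average intensity of the window by well over half, which is the mechanism behind the reductions the research describes.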
On-site renewable energy generation is gaining traction among organisations with significant AI infrastructure. Google’s approach includes direct power purchase agreements with renewable energy projects, ensuring additionality—the principle that corporate purchases drive new renewable capacity rather than merely reallocating existing supply.
AI as an Environmental Solution
While AI’s environmental footprint warrants serious attention, the technology also offers substantial potential for addressing environmental challenges. Applications range from optimising energy systems to accelerating climate research and enabling more sustainable industrial processes.
In energy systems, AI enables more efficient grid management and renewable energy integration. DeepMind’s collaboration with Google demonstrated that AI-driven cooling optimisation could reduce data centre cooling energy consumption by 40%. Applied at scale across industrial facilities, similar optimisation could yield substantial emissions reductions.
Climate modelling represents another high-impact application. NVIDIA’s Earth-2 initiative uses AI to accelerate climate simulations by orders of magnitude, enabling more accurate predictions of extreme weather events and long-term climate patterns. This improved understanding can inform adaptation strategies and policy decisions.
Agricultural applications of AI offer potential for significant emissions reductions. The UN Food and Agriculture Organisation notes that precision agriculture techniques enabled by AI can reduce fertiliser use by 20% or more while maintaining crop yields. Given that nitrogen fertiliser production accounts for approximately 2% of global energy consumption, this represents meaningful progress.
Regulatory Developments and Industry Standards
Regulatory frameworks addressing AI’s environmental impact are beginning to emerge. The European Union’s AI Act, which entered into force in 2024, includes provisions requiring transparency about the environmental footprint of high-risk AI systems. Similar requirements may follow in other jurisdictions as policymakers recognise the connection between AI governance and environmental sustainability.
Industry-led initiatives are also emerging. The Partnership on AI has established working groups focused on environmental sustainability, developing guidelines for measuring and reporting AI-related emissions. The ML CO2 Impact tool, developed by researchers at Mila, provides standardised methods for estimating the carbon footprint of machine learning experiments.
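The estimate such tools produce reduces, at its core, to a simple product: energy drawn by the hardware, scaled by the facility's power usage effectiveness (PUE), multiplied by the local grid's carbon intensity. A hedged sketch of that calculation (the hardware and grid numbers below are illustrative placeholders, not outputs of the ML CO2 Impact tool):

```python
def training_emissions_kg(gpu_power_w, num_gpus, hours, pue, grid_gco2_per_kwh):
    """Estimate training emissions: hardware energy x PUE x grid carbon intensity."""
    energy_kwh = gpu_power_w * num_gpus * hours / 1000
    facility_kwh = energy_kwh * pue                  # include cooling/overhead
    return facility_kwh * grid_gco2_per_kwh / 1000   # grams -> kilograms

# Illustrative run: 64 accelerators at 300 W for two weeks (336 h),
# PUE 1.2, on a 400 gCO2/kWh grid. All numbers are assumptions.
kg = training_emissions_kg(gpu_power_w=300, num_gpus=64, hours=336,
                           pue=1.2, grid_gco2_per_kwh=400)
print(f"Estimated emissions: {kg:,.0f} kg CO2e")
```

The same formula makes the location effect discussed earlier concrete: swapping the assumed 400 gCO2/kWh grid for a low-carbon one near 50 gCO2/kWh cuts the estimate roughly eightfold with no change to the workload.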
Strategic Implications for Organisations
For organisations deploying AI at scale, environmental considerations are becoming integral to strategic planning. The reputational risks associated with unsustainable AI practices are growing, as are regulatory compliance requirements. Proactive approaches to AI sustainability can differentiate organisations while reducing long-term operational costs.
Effective sustainability strategies encompass multiple dimensions. Organisations should conduct comprehensive assessments of their AI footprint, including training, inference, and hardware lifecycle impacts. Establishing clear metrics and targets enables progress tracking and accountability.
Procurement decisions increasingly incorporate environmental criteria. When selecting cloud providers or hardware vendors, organisations should evaluate their sustainability commitments and performance. The Green Algorithm framework provides a structured approach for assessing the environmental impact of AI systems, enabling more informed vendor selection.
Investment in efficiency research yields both environmental and economic benefits. Model optimisation techniques that reduce computational requirements also reduce infrastructure costs, creating alignment between sustainability objectives and business performance.
The Path Forward
The relationship between AI and environmental sustainability is complex and evolving. While AI systems consume substantial resources, they also offer powerful tools for addressing environmental challenges. Navigating this tension requires thoughtful approaches that maximise AI’s positive contributions while minimising its footprint.
Organisations leading in sustainable AI development are demonstrating that environmental responsibility and technological innovation can advance together. By adopting efficient development practices, powering operations with renewable energy, and applying AI to environmental challenges, these organisations are charting a path toward sustainable intelligent systems.
As AI capabilities continue to expand and deployment accelerates, the importance of sustainable practices will only grow. Organisations that embed environmental considerations into their AI strategies today will be better positioned to navigate regulatory requirements, meet stakeholder expectations, and contribute to a sustainable technological future.