
Artificial Intelligence

We are living through a period of rapid AI evolution, in which more and more people use AI assistants like ChatGPT, DeepSeek, Copilot, and Grok for everything from research tasks to written pieces. But have you ever wondered how many resources go into answering a single query?

Facts and numbers

"A request made through ChatGPT consumes 10 times the electricity of a Google Search," reported the International Energy Agency. (UNEP, n.d.)

"While global data is sparse, the agency estimates that in the tech hub of Ireland, the rise of AI could see data centres account for nearly 35 per cent of the country's energy use by 2026." (UNEP, n.d.)

"15 ChatGPT queries equate to the CO₂ emissions of watching one hour of video streaming" (Smartly.ai, 2024)

"A ChatGPT query emits about 4.32 grams of CO₂" (Smartly.ai, 2024)

"139 queries are roughly equivalent to the emissions from one load of laundry washed and dried on a clothesline" (Smartly.ai, 2024)

"The energy consumed by the world's data centres account for 2.5% to 3.7% of global greenhouse gas emissions, more than the aviation industry. It is estimated that training a large AI model can result in the emission of approximately 300 tons of CO2." (PlanBeEco, 2024)

"For frequent flyers, 92,593 queries would match the carbon footprint of a round-trip flight from San Francisco to Seattle" (Smartly.ai, 2024)
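The equivalences quoted above can be sanity-checked with a quick calculation from the single per-query figure (4.32 g CO₂ per query, per Smartly.ai). This is a rough sketch; the streaming, laundry, and flight figures are derived from the quoted query counts, not independent measurements:

```python
# Rough sanity check of the quoted equivalences, using the
# Smartly.ai figure of 4.32 g CO2 per ChatGPT query.
GRAMS_PER_QUERY = 4.32

def co2_grams(queries: int) -> float:
    """CO2 emitted (in grams) for a given number of queries."""
    return queries * GRAMS_PER_QUERY

# 15 queries ~ one hour of video streaming
streaming_hour = co2_grams(15)          # 64.8 g

# 139 queries ~ one load of laundry, line-dried
laundry_load = co2_grams(139)           # ~600 g

# 92,593 queries ~ a round-trip San Francisco-Seattle flight
flight_kg = co2_grams(92_593) / 1000    # ~400 kg

print(f"1 hour of streaming    ~ {streaming_hour:.1f} g CO2")
print(f"1 load of laundry      ~ {laundry_load:.1f} g CO2")
print(f"SF-Seattle round trip  ~ {flight_kg:.0f} kg CO2")
```

That the flight figure works out to almost exactly 400 kg suggests the query counts were derived from the per-query number rather than measured separately.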

Environmental costs of AI: 
How is AI harming the environment?

This news article from the Massachusetts Institute of Technology (MIT) explores powerful generative AI models (the large AI models we use to generate text, images, videos, and other content), focusing on how resource-intensive they are, what direct and indirect impacts they have, and what trade-offs or considerations are needed for responsible development.

Image by Geoffrey Moffett

We summarised some key points from the article:

  1. Electricity use – Training and running generative AI models consumes huge amounts of electricity; data center demand is rising sharply and could more than double by 2026.

  2. Water use – Cooling data centers requires large volumes of water, stressing supplies in some regions.

  3. Hardware impacts – GPUs and chips needed for AI have environmental costs from mining, manufacturing, and shipping.

  4. Short lifespans – Models are frequently replaced, so the energy invested in older versions is quickly “wasted.”

  5. Grid strain – Training causes sudden power spikes, often met with fossil fuel backup like diesel generators.

  6. Wider footprint – Impacts go beyond electricity: water, materials, supply chains, and deployment must all be counted.

  7. Trade-offs unclear – Benefits vs. environmental costs aren’t well measured yet; development is outpacing assessment.

AI's water usage

Check out BBC's More or Less podcast episode

  • About 80 to 90% of the water used is in the power stations themselves, not in cooling the servers

  • GPT-3 uses about 500 millilitres of water for 30 queries

  • GPT-4, on the other hand, is about 30 times more powerful

    • However, it is closed-source, so it is harder to research

    • We should assume it is more consumptive

    • Its creator (OpenAI) states that it uses about 1/15th of a teaspoon of water for an average query with a response of about 20 words

      • This is likely only counting the water used by the server farms themselves

  • French AI company Mistral states that it uses about 45 millilitres of water per 200-to-300-word response

    • Ten responses are equivalent to a small bottle of water
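Putting these figures on a common per-query footing takes only a back-of-the-envelope conversion (a sketch assuming a US teaspoon of roughly 4.93 mL; all other numbers are the ones quoted above):

```python
# Convert the quoted water figures to millilitres per query.
TSP_ML = 4.93  # assumed volume of a US teaspoon, in mL

gpt3_per_query = 500 / 30     # ~16.7 mL (500 mL per 30 queries)
openai_claim = TSP_ML / 15    # ~0.33 mL (OpenAI's 1/15th-teaspoon figure)
mistral_per_response = 45     # mL per 200-300 word response (Mistral)

print(f"GPT-3 estimate:      {gpt3_per_query:.1f} mL/query")
print(f"OpenAI's own figure: {openai_claim:.2f} mL/query")
print(f"Mistral:             {mistral_per_response} mL/response "
      f"(10 responses ~ {10 * mistral_per_response} mL, a small bottle)")
```

OpenAI's figure is roughly 50 times lower than the GPT-3 estimate, which is consistent with the point above that it likely counts only the server farms' water, not the power stations'.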

Carbon footprint of human research vs. AI generation: Could AI use less than humans?

A Nature article published in February 2024, "The carbon emissions of writing and illustrating are lower for AI than for humans", compares the CO₂ equivalent emitted by human work and AI-generated work in text writing and image creation. It found that AI systems (ChatGPT, BLOOM, DALL-E 2, Midjourney) produce 130 to 1,500 times less CO₂e per page than human writing, and 310 to 2,900 times less CO₂e per image than human-created art. (It should be noted that these calculations exclude broader social impacts such as job displacement, legal challenges, and rebound effects, and that AI cannot replace every human activity.)


AI produces 130-1,500 times less CO₂e per page than human writing

AI produces 310-2,900 times less CO₂e per image than human-created art

AI use on the CGCH platform

As a climate community, CfCA needs to be aware of the impact of incorporating AI into the Cross Government Climate Hub (CGCH). We asked Circle, the platform that hosts the CGCH, about its incorporation of AI models. Circle uses AI models provided by OpenAI to power certain features on the platform. OpenAI's infrastructure is hosted on Microsoft Azure, which has been carbon neutral since 2012 and is committed to becoming carbon negative by 2030. This means that the underlying data centres powering the AI capabilities are operated with sustainability at the forefront, utilising renewable energy sources and offsetting emissions.

 

While specific energy usage metrics for individual AI interactions aren't publicly available, OpenAI has expressed an ongoing commitment to improving the efficiency and environmental responsibility of its models. Additionally, Microsoft publishes annual sustainability reports detailing their carbon impact and progress toward their climate goals, which may offer further insight into the broader infrastructure we rely on.

Responsible use of AI

There are ways to responsibly use AI in our everyday lives and ways to make AI greener and reduce its carbon footprint.

Greener use of AI*

  1. Efficient Algorithms & Models: Develop optimised architectures and processors to cut energy use during training and operation, potentially reducing emissions by a factor of 100 to 1,000.

  2. Greener Data Centres: Power data centres with renewable energy sources such as solar or wind instead of fossil fuels. 

  3. Advanced Cooling: Use more efficient cooling methods, including experimental systems like Microsoft's underwater data centres, to lower energy demand.

  4. Sustainable Scheduling: Run energy-intensive tasks when renewable energy supply is high, reducing reliance on carbon-heavy electricity. 

  5. Model Optimization: Compress large models and use only essential data for training to reduce computational load. 

  6. Recycling & Reuse: Promote recycling and reuse of AI hardware to limit e-waste and resource extraction. 

  7. AI for Conservation: Apply AI to environmental challenges such as monitoring climate change, managing natural resources, and protecting biodiversity.  

*Sources include Smartly.AI and PlanBeEco
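Point 4 above (sustainable scheduling) can be sketched in a few lines: delay an energy-intensive job until grid carbon intensity drops below a threshold. This is a minimal illustration, not a production scheduler; `get_carbon_intensity` is a hypothetical stand-in for a real data source such as a grid operator's API.

```python
# Minimal sketch of carbon-aware scheduling: poll grid carbon
# intensity and run the job when the grid is relatively green.
import time

def get_carbon_intensity() -> float:
    """Hypothetical stand-in: grid carbon intensity in gCO2/kWh."""
    return 120.0  # placeholder value for illustration

def run_when_green(job, threshold=150.0, poll_seconds=600, max_polls=144):
    """Run `job` once carbon intensity falls below `threshold`.

    Gives up waiting after `max_polls` checks and runs the job anyway,
    so the work is delayed but never dropped.
    """
    for _ in range(max_polls):
        if get_carbon_intensity() < threshold:
            return job()
        time.sleep(poll_seconds)  # wait for a greener grid, then re-check
    return job()  # deadline reached: run regardless

result = run_when_green(lambda: "training started")
print(result)
```

In practice the polling would be replaced by a forecast lookup, so long-running training jobs can be placed in the greenest window of the day rather than started opportunistically.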
