Strengthening sustainable artificial intelligence (AI)

Artificial intelligence (AI) is rapidly transforming industries, from healthcare diagnostics to retail supply chains. Organizations pursuing generative AI transformation expect significant benefits: increased productivity, higher revenue, improved customer experience, and reduced costs. However, this rapid growth comes with a significant environmental burden that can no longer be ignored. The infrastructure powering the AI revolution consumes vast amounts of electricity and water while producing substantial carbon emissions, creating an ecological footprint that threatens the very progress AI is supposed to bring.

The Rising Costs of AI: The Efficiency Paradox

The rapid adoption of AI is causing the energy, emissions, and water costs of large-scale AI to skyrocket. Left unchecked, AI's environmental footprint could undermine corporate sustainability goals and exceed planetary limits. By 2030, data centers powering AI are projected to consume 612 terawatt-hours (TWh) of energy per year, equivalent to Canada's total electricity consumption in 2022. Carbon emissions from AI could account for 3.4% of total global emissions, an 11-fold increase within a decade. Cooling these data centers could consume 3.02 billion cubic meters of freshwater, more than the total annual freshwater withdrawals of countries such as Norway or Sweden.

The paradox is that while AI has enormous potential to reduce the carbon footprint of most companies, only 14% of companies are currently using AI to reduce emissions. Traditional efficiency metrics such as Power Usage Effectiveness (PUE) do not provide a holistic picture because they do not assess how well AI models convert electricity, money, carbon, and water into real intelligence and impact.

Introducing the Sustainable AI Quotient (SAIQ)

The Sustainable AI Quotient (SAIQ) was developed to bridge this gap. SAIQ is a multidimensional measure of how efficiently AI systems convert money, electricity, water, and carbon into real performance. It is calculated as the weighted sum of four per-token ratios: dollars per token, megawatt-hours per token, tons of CO2 equivalent per token, and cubic meters of water per token. The lower the SAIQ, the more efficient and responsible the AI system. This metric allows companies to balance financial viability, energy resilience, and environmental impact based on their unique organizational priorities.
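The weighted-sum definition above can be sketched in a few lines of Python. The weights and all resource figures below are illustrative assumptions, not values from the report:

```python
def saiq(dollars, mwh, tco2e, m3_water, tokens,
         weights=(0.25, 0.25, 0.25, 0.25)):
    """Weighted sum of per-token resource ratios; lower is better.

    Equal weights are an assumption here; the report lets each
    organization set weights to match its own priorities.
    """
    w_cost, w_energy, w_carbon, w_water = weights
    return (w_cost * (dollars / tokens)
            + w_energy * (mwh / tokens)
            + w_carbon * (tco2e / tokens)
            + w_water * (m3_water / tokens))

# Example: a hypothetical workload that served 1 billion tokens.
score = saiq(dollars=50_000, mwh=120, tco2e=40, m3_water=300,
             tokens=1_000_000_000)
print(f"SAIQ: {score:.2e}")
```

Because every term is a resource-per-token ratio, a lower score directly means more intelligence delivered per dollar, megawatt-hour, ton of CO2e, and cubic meter of water.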

Four Key Imperatives for Sustainable AI

To help companies maximize the return on their AI investments not only financially, but also in terms of energy, water, and environmental impact, a practical framework with four imperatives has been developed:

  • 1. Deploy Smarter Silicon: Run AI workloads on more energy-efficient hardware and models. This includes technologies such as Compute-in-Memory (CIM) and Processing-in-Memory (PIM), which reduce data movement and power consumption; Mythic's CIM technology, for example, cuts power consumption for AI inference in edge devices by up to 20 times. Neuromorphic systems, which mimic the human brain, consume energy only when active, yielding significant savings. It is also recommended to use lightweight AI models and lower-precision formats (e.g., 8-bit floating point, FP8) and to deploy AI at the edge of the network (edge AI), which reduces cloud usage, improves performance, and lowers carbon emissions.
  • 2. Decarbonize Data Centers: Hyperscale data centers consume millions of gallons of water per day and are often powered by fossil fuels. Strategies include dynamic scaling and intelligent load balancing to match power consumption to AI workloads, as well as data center monetization and GPU sharing to reduce the carbon footprint. It is also key to optimize data center location to take advantage of regions with cleaner energy or natural cooling, such as the Nordic countries, and to integrate low-carbon energy options such as small modular reactors (SMRs), which provide a stable energy supply without dependence on fossil fuels. Finally, water-friendly cooling innovations, such as direct liquid cooling of chips and heat recovery systems, can minimize water and energy consumption.
  • 3. Use AI Thoughtfully: AI is often adopted based on hype rather than strategic benefit, leading to unnecessary inefficiencies. It is important to right-size AI models and favor task-specific models over large language models (LLMs). Moving from flat-rate AI pricing to usage- or efficiency-based pricing models can also encourage optimization of AI resource consumption. In addition, AI should be applied to decarbonization within industries, for example by optimizing heating, ventilation, and air conditioning (HVAC) systems or by enabling predictive logistics.
  • 4. Embed AI Governance-as-Code: The rapid adoption of AI has outpaced governance frameworks, leading to inefficient implementations with a high carbon footprint. Companies should integrate AI sustainability into their operational processes and standardize real-time monitoring of AI energy consumption. Automating AI management policies can help enforce sustainability principles and manage environmental risks. Companies also have the opportunity to help define AI standards and to actively shape AI sustainability norms in collaboration with ecosystem partners.
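The lower-precision formats recommended in imperative 1 can be illustrated with back-of-envelope arithmetic: an FP8 weight occupies one byte versus four for FP32, shrinking a model's memory footprint (and the energy spent moving weights) roughly fourfold. The parameter count below is an illustrative assumption:

```python
# Bytes occupied by one model parameter in each numeric format.
BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "fp8": 1}

def model_size_gb(params, fmt):
    """Approximate weight storage in gigabytes for a given format."""
    return params * BYTES_PER_PARAM[fmt] / 1e9

params = 7_000_000_000  # a hypothetical 7-billion-parameter model
for fmt in ("fp32", "fp16", "fp8"):
    print(f"{fmt}: {model_size_gb(params, fmt):.0f} GB")
# fp32: 28 GB, fp16: 14 GB, fp8: 7 GB
```

Real quantization also has to manage accuracy loss, but the storage and data-movement arithmetic is why FP8 features in the "smarter silicon" imperative.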
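The dynamic scaling idea in imperative 2 often takes the form of carbon-aware scheduling: deferring flexible AI batch jobs to the hours when grid carbon intensity is lowest. A minimal sketch, with an entirely made-up intensity forecast:

```python
def pick_greenest_hour(forecast):
    """Return the hour with the lowest forecast gCO2/kWh."""
    return min(forecast, key=forecast.get)

# Hypothetical 24-hour carbon-intensity forecast (gCO2 per kWh),
# with cleaner power in the early morning hours 3-6.
forecast = {hour: 450 - 200 * (3 <= hour <= 6) for hour in range(24)}
print(pick_greenest_hour(forecast))  # → 3, the first low-carbon hour
```

In production the forecast would come from a grid-data provider, and the scheduler would also respect job deadlines, but the core decision is this one comparison.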
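Imperative 4, governance-as-code, means sustainability rules live in automated checks rather than documents. A minimal sketch of a deployment gate; the manifest fields and the energy threshold are illustrative assumptions, not part of any real framework:

```python
def check_deployment(manifest, max_kwh_per_1k_tokens=0.5):
    """Return (allowed, reason) for an AI deployment manifest.

    Rejects workloads that omit or exceed their declared
    energy budget. Threshold is an illustrative assumption.
    """
    usage = manifest.get("kwh_per_1k_tokens")
    if usage is None:
        return False, "missing energy declaration"
    if usage > max_kwh_per_1k_tokens:
        return False, f"energy budget exceeded: {usage} kWh/1k tokens"
    return True, "ok"

ok, reason = check_deployment(
    {"model": "summarizer-small", "kwh_per_1k_tokens": 0.2})
print(ok, reason)  # → True ok
```

Wiring such a check into a CI/CD pipeline makes the sustainability policy self-enforcing: a model that cannot declare and meet its energy budget never ships.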

The Path to a Sustainable AI Future

The future of AI depends on the ability to balance exponential growth with finite resources. Organizations that successfully manage this balance will transform AI from an energy-intensive ambition into a catalyst for sustainable value creation. This starts with building a robust measurement foundation, embedding efficiency into governance systems, and institutionalizing feedback and accountability mechanisms. Through continuous innovation in hardware, software, and policy, AI can be scaled sustainably, ensuring that the most intelligent systems of the future run on renewable energy, transparent standards, and clear guidelines.




Glossary of key terms

  • Artificial Intelligence (AI): A broad field of computer science focused on creating intelligent machines that can perform tasks that would normally require human intelligence.
  • Sustainable AI Quotient (SAIQ): A new metric designed by Accenture that measures how efficiently AI systems convert money, electricity, water, and carbon into real-world performance. A lower SAIQ means a more efficient and responsible AI system.
  • Tokens: A standardized unit of AI performance used in SAIQ to measure AI efficiency relative to input resources.
  • Power Usage Effectiveness (PUE): The traditional data center efficiency metric, defined as the ratio of total energy consumed by the data center to the energy used by the IT equipment itself. It measures how efficiently a data center uses energy but does not take AI outcomes into account.
  • Smarter Silicon: It refers to energy-efficient hardware and architectures for AI, such as AI-optimized processors, in-memory computing techniques (CIM/PIM), and neuromorphic computing that reduce power consumption.
  • Compute-in-Memory (CIM) / Processing-in-Memory (PIM): Technologies that process data directly where it is stored, minimizing costly data movement between memory and processors and significantly reducing energy consumption.
  • Neuromorphic systems: Hardware architectures that mimic the neural structures and functions of the human brain for more efficient information processing, often using spiking neural networks (SNNs).
  • Edge AI: Deploying AI applications directly on devices or local servers (at the edge of the network) instead of relying on cloud data centers. Reduces latency, cloud consumption, and carbon emissions.
  • Decarbonizing data centers: The process of reducing or eliminating carbon emissions from data center operations, often through dynamic scaling, intelligent load balancing, location optimization, integration of low-carbon energy options, and cooling innovations.
  • Dynamic scaling and intelligent load balancing: Adapting AI power consumption to workloads and shifting AI processing to times when electricity is cheapest and cleanest to reduce energy consumption and emissions.
  • Water-friendly cooling innovations: Advanced cooling technologies for data centers, such as direct liquid cooling of chips, evaporative cooling, and heat recovery systems, that minimize water consumption.
  • AI governance-as-code: Embedding sustainability principles into automated management systems to ensure AI models are deployed in line with environmental goals and regulatory requirements.
  • Carbon intensity: The amount of carbon dioxide produced per unit of electrical energy.
  • Large Language Models (LLMs): Large, general-purpose AI models often used for natural language processing, but which can be energy-intensive. The paper recommends choosing AI models tailored to specific tasks.
  • Retrieval-Augmented Generation (RAG): A technique that combines large language models with search systems so that the models access data only when needed, reducing inference costs.
  • Small Modular Reactors (SMRs): Small modular nuclear reactors that represent a reliable low-carbon option for powering AI data centers.
  • 24/7 Carbon Free Coalition: An initiative that brings together buyers to match every MWh consumed with local carbon-free electricity every hour of every day.
  • KPIs (Key Performance Indicators): Key performance indicators that traditionally track business goals, but should also include AI sustainability metrics.
