Training the Grid
Wise use of AI could be the key to our sustainability and survival.
AI is the technology of the day, and I have commented positively on its properties and potential many times in this column. Building AI into new products and services has become a marketing prerequisite. AI applications in the cloud offer easy access and fast answers to complex computing challenges, delivering great value for commercial organizations and novel services for consumers. And in devices such as PCs, smartphones and wearables, AI is the critical ingredient to enable the kinds of intuitive, human-like interactions people want to have with their tech today.
The AI in our smartphone cameras knows what our photographs should look like and adjusts the settings accordingly – a process that would take an expert several minutes in Lightroom now happens inside our phones in milliseconds, before we even see the picture. We even have bicycles that can warn the rider of a puncture: a tiny inertial sensor with integrated machine learning monitors the bike's handling, delivering the safety of a tire-pressure monitoring system (TPMS) without the cost of dedicated pressure sensors.
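A minimal sketch of how that kind of on-sensor inference might work, using scikit-learn with invented features and values purely for illustration (real products will use their own features, models and thresholds):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented training data: two features extracted from the inertial sensor,
# e.g., vibration variance and steering-wobble amplitude, labeled
# normal (0) or soft tire (1). All values are illustrative placeholders.
rng = np.random.default_rng(1)
normal = rng.normal([0.2, 0.1], 0.05, size=(500, 2))
soft = rng.normal([0.6, 0.4], 0.10, size=(500, 2))
X = np.vstack([normal, soft])
y = np.array([0] * 500 + [1] * 500)

clf = LogisticRegression().fit(X, y)

# A new ride segment shows elevated vibration and wobble: warn the rider.
if clf.predict([[0.55, 0.35]])[0] == 1:
    print("Warning: possible puncture or low tire pressure")
```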
But AI can be an energy-intensive way to do computing. Neural networks exploit both parallelism and depth to solve complex challenges faster than traditional sequential computers. And although trained models can be highly optimized and will process any given set of inputs only once, the training phase demands vast datasets that require large memory resources and involves processing samples many times over to fine-tune the model parameters. The emergence of edge AI offers a response, predicated on creating lightweight inference engines that use minimal layers and parameters to mimic the output of large, complex models. These can be tailored to the limited power budgets of devices such as smartphones, smartwatches and automotive systems.
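That mimicry is essentially knowledge distillation. Here is a minimal, hedged sketch in PyTorch, with toy layer sizes and random stand-in data of my own choosing; real edge deployments would typically also quantize and prune the student model:

```python
import torch
import torch.nn as nn

# Toy dimensions chosen purely for illustration.
teacher = nn.Sequential(              # large, accurate model (already trained)
    nn.Linear(64, 512), nn.ReLU(),
    nn.Linear(512, 512), nn.ReLU(),
    nn.Linear(512, 10),
)
student = nn.Sequential(              # lightweight edge model: fewer layers/params
    nn.Linear(64, 32), nn.ReLU(),
    nn.Linear(32, 10),
)

opt = torch.optim.Adam(student.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(1000):              # distillation: match the teacher's outputs
    x = torch.randn(128, 64)          # stand-in for real sensor or image data
    with torch.no_grad():
        target = teacher(x)           # teacher's outputs as soft targets
    opt.zero_grad()
    loss = loss_fn(student(x), target)
    loss.backward()
    opt.step()

n_t = sum(p.numel() for p in teacher.parameters())
n_s = sum(p.numel() for p in student.parameters())
print(f"teacher params: {n_t}, student params: {n_s}")  # ~301k vs ~2.4k
```

Only the small student ships to the device, so every inference draws a fraction of the energy the full model would need.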
Despite these concerns about power consumption and other aspects of the technology, AI is essential to our future survival. Nokia says digitalization is essential for decarbonizing industries and accelerating sustainability. Its presentation at the COP28 summit in Dubai, No Green Without Digital, cited AI and other technologies, such as digital twins, as important enablers in the drive to decarbonize.
As we seek sustainability through initiatives like electrification, powered by energy from renewable sources, we need these technologies to help us harvest enough energy to meet our needs and stabilize supply. We know that sources like wind, solar and hydro are weather-dependent and difficult to predict, so grid stability becomes harder to maintain just as we in the developed world have come to expect access to the power we need, when we need it. At the same time, the transition to decentralized, distributed grids brings opportunities to improve the delivery of electricity in rural areas and in the many regions of the world that could never afford to install traditional power infrastructure. Here, stability holds the key to improving healthcare, education and economic development. With AI to help us model these systems, we can establish the reliability we all need to support our electrified lives, as well as improve performance and efficiency.
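As a hedged illustration of what "modeling the system" can mean in practice, here is a sketch that learns to predict solar output from weather features, so an operator can schedule storage or backup ahead of a shortfall. The data is synthetic and the feature choices are mine, not drawn from any real grid:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in data: hour of day, cloud cover fraction, temperature.
n = 2000
hour = rng.uniform(0, 24, n)
cloud = rng.uniform(0, 1, n)
temp = rng.uniform(-5, 35, n)

# Toy ground truth: output peaks at midday and drops with cloud cover.
solar = np.maximum(0, np.sin((hour - 6) / 12 * np.pi)) * (1 - 0.8 * cloud)
solar += rng.normal(0, 0.05, n)                    # measurement noise

X = np.column_stack([hour, cloud, temp])
model = GradientBoostingRegressor().fit(X[:1500], solar[:1500])

# Day-ahead forecast for tomorrow at noon, 30% cloud, 20 °C:
print(model.predict([[12.0, 0.3, 20.0]]))  # close to 0.76, the toy ground truth
```

The same pattern, with real weather forecasts and grid telemetry, is what lets operators balance intermittent renewables against demand.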
If AI is to help us, we need to ensure that using the technology delivers a net benefit: the energy saved must exceed the energy needed to train and operate the models that run the grid. And training will not be a one-off cost; data-intensive retraining will be ongoing as new capacity is added and as demands and usage patterns evolve.
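A back-of-the-envelope sketch of that accounting. Every number below is an invented placeholder, not a measurement; the point is only the shape of the calculation:

```python
# Hypothetical energy accounting for an AI grid controller (all figures
# are illustrative placeholders, not real measurements).
E_train_MWh     = 500.0     # energy for one (re)training cycle
retrains_per_yr = 4         # retraining as capacity and demand evolve
E_infer_MWh_yr  = 200.0     # running inference year-round
E_saved_MWh_yr  = 12_000.0  # grid losses avoided through better balancing

E_cost_MWh_yr = E_train_MWh * retrains_per_yr + E_infer_MWh_yr
net_MWh_yr = E_saved_MWh_yr - E_cost_MWh_yr
print(f"annual AI energy cost: {E_cost_MWh_yr} MWh")  # 2200.0 MWh
print(f"net annual benefit:    {net_MWh_yr} MWh")     # 9800.0 MWh
```

Only if that last line stays positive, year after year, does the AI earn its place on the grid.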
Improvements in computing efficiency could work in our favor. Koomey’s law observes that computing efficiency, expressed as the number of compute operations per joule of energy, doubles every 1.57 years. Less well known than Moore’s law, it’s a trend that compounds to roughly an 83-fold (often quoted as 100-fold) increase in energy efficiency over a decade. Although this sounds impressive, our digital lifestyles and the ongoing digital transformation of our businesses demand ever more operations, so overall energy demand continues to rise (a classic rebound effect). This could help explain why our CO2 emissions are still increasing despite the huge engineering commitment to keep developing greener, energy-saving technologies such as inverter-based motor drives, LED lighting, photovoltaic cells, wind turbines and wide-bandgap semiconductors, to name a few.
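The arithmetic behind that decade figure, for the curious:

```python
# Koomey's law: computing efficiency doubles every 1.57 years.
doubling_period_years = 1.57
decade_factor = 2 ** (10 / doubling_period_years)
print(f"{decade_factor:.0f}x per decade")  # ~83x, commonly rounded up to 100x
```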
The International Energy Agency (IEA) reports that the increase in emissions is slowing, and attributes this to growing clean energy deployment. Indeed, the IEA’s figures show that, without the shortfall in hydro generation caused by unfavorable weather conditions, global energy-related CO2 emissions would have fallen in 2023.
Have we turned a corner? Juniper Research says our smart grids are becoming more effective, especially with technological improvements such as battery energy storage systems (BESS). Its latest report quantifies the potential benefits, suggesting we could save $290 billion in energy costs worldwide by 2029.
But let’s not congratulate ourselves just yet. The rapid growth in data centers, driven by exploding demand for their services, is causing wider concerns about sustainability. Energy demand is one aspect; consumption of other resources, notably the water used to cool those hard-working servers, is also increasing rapidly. That usage is coming under scrutiny, particularly in areas affected by drought. In response, big tech companies have committed to community projects to improve local water supplies, and some are adopting alternative cooling technologies.
We can anticipate an effective technological solution. It’s what engineers do. On the other hand, we may need to consider changing our always-on, energy-hungry lifestyles to ensure reliable, sustainable access to the services we all need.
Alun Morgan is technology ambassador at Ventec International Group (venteclaminates.com); alun.morgan@ventec-europe.com. His column runs monthly.