How does AI improve grid performance? No one fully understands and that’s limiting its use – Utility Dive

Posted: November 17, 2019 at 2:33 pm

Just as power system operators are mastering data analytics to optimize hardware efficiencies, they are discovering that more complex artificial intelligence tools can do far more, and learning how to choose which tool to use.

With deployment of advanced metering infrastructure (AMI) and smart sensor-equipped hardware, system operators are capturing unprecedented levels of data. Cloud computing and massive computational capabilities are allowing data analytics to make these investments pay off for customers. But it may take machine learning (ML) and artificial intelligence (AI) to address new power grid complexities.

AI is a branch of computer science that could make power system management fully autonomous in real time, researchers and private sector providers of power system services told Utility Dive. ML is a subset of AI in which human-supervised data analytics are passed through preset or learned rules about the system, teaching the AI to recognize normal and abnormal operating conditions.

"[D]ata management falls into 'crawl, walk, and run' categories, and most utilities are crawling in their use of data right now. AI for data management would be 'running.'"

Kevin Walsh

Transmission and Distribution Principal, OSIsoft

"Knowing when to use data analytics and when to use machine learning and AI are the fundamental questions utilities are asking," GE Digital VP for Data and Analytics Matt Schnugg told Utility Dive. Continuing to use an approach "that has been good enough for years" has merit, but new tools and capabilities may justify "turning to data scientists and cloud computing" and there are "parameters" for knowing how to choose between them.

The sheer volume of data is beginning to exceed human capabilities, but system operators often don't have the technology to deploy demonstrated AI and ML solutions for power flow management, researchers told Utility Dive. The mathematics underlying those solutions is not yet fully understood, they acknowledged. The next big question may be whether system operators will risk ML and AI for results humans cannot yet provide or understand.

The value of putting power system data to work is increasingly evident. It has saved system operators time and customers money. And it is providing predictive infrastructure maintenance, which can reduce the growing frequency and duration of service interruptions and help avoid major unintended cascading blackouts.

"Utilities have billions of dollars invested in hard assets and they use data to manage those assets," Kevin Walsh, transmission and distribution principal for data management specialist OSIsoft, told Utility Dive. "But data management falls into 'crawl, walk and run' categories, and most utilities are crawling in their use of data right now. AI for data management would be 'running.'"

Data management providers like OSIsoft help utilities assimilate data "and make it available across the enterprise to enable intelligent decisions based on what is actually happening in the field," Walsh said. "For 75% of what utilities are doing, outside of maybe forecasting or managing capacity, AI is at the infancy stage and there is no real use case."

"AI and machine learning shops are too often looking for problems to solve rather than addressing the very specific problems that utilities face and showing how machine learning might be the right solution."

Joshua Wong

CEO, Opus One Solutions

A data stream anomaly spotted by OSIsoft's PI System data analytics allowed the Alectra, Texas, municipal utility to defer a $3 million transformer replacement with a $100,000 transformer repair, he noted. Duke Energy and Sempra Energy use PI analytics for predictive maintenance. Data analytics "does most of what utilities need" without the costs and complexities of AI and ML, he said.

"AI and machine learning are the buzz with investors and the general public, but utilities' key concern is what any analytics will bring to their operations," Opus One Solutions CEO Joshua Wong agreed.

Opus One uses advanced physics and mathematical formulas that underlie the distribution system to build a "digital twin" of a utility's system, Wong told Utility Dive. Largely through analytics software, the twin is used to model and inform utility system operations, planning and market and business model design.

The most common use of ML is for five-minutes-ahead to day-ahead algorithms that do load and generation forecasting, he said. Utilities' legacy software technologies cannot run "the very large and interdependent data sets needed for the entire grid's power flow," but forecasting requires only learning forward "from a single point in history without a lot of dependent phenomena."
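For illustration only, the short sketch below shows the flavor of that kind of short-horizon forecasting: a generic regression model trained on lagged load readings from synthetic data. It is a hypothetical example, not Opus One's software or any utility's production algorithm.

```python
# Hypothetical short-horizon load forecast on synthetic data; not Opus One's method.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Synthetic hourly load: a daily cycle plus noise, standing in for a year of AMI history.
hours = np.arange(24 * 365)
load = 100 + 30 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 5, hours.size)

# Features: the previous 24 hourly readings; target: the next hour's load.
X = np.stack([load[i:i + 24] for i in range(load.size - 24)])
y = load[24:]

# Train on all but the last day, then forecast the final hour from its preceding 24 readings.
model = GradientBoostingRegressor().fit(X[:-24], y[:-24])
print("next-hour forecast:", round(model.predict(X[-1:])[0], 1), "vs actual", round(y[-1], 1))
```

The point of the sketch is Wong's distinction: a forecast like this learns forward from recent history alone, without modeling the interdependent power flows of the whole grid.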

Better algorithms can be built with a combination of AI tools and data analytics that correlate real data and learning, Wong said. "Machine learning's greatest impact will be in making correlations that teach the algorithm to understand how the voltage here affects the voltage there." Those correlations will enable "optimizing grid operations like dispatching battery storage or managing electric vehicle charging," said Wong.

Utility pilots and research simulations are beginning to show automation can already optimize some of those operations.

In the first pilot for what could eventually be an autonomous grid, Helia Technologies and Colorado electric cooperative Holy Cross Energy (HCE) are testing ML's battery dispatch capabilities.

Four houses on the HCE system are equipped with multiple distributed resources, including batteries. A Helia controller is predicting the batteries' actual charge-discharge potential instead of the vendors' rated capabilities, Helia CEO Francisco Marocz told Utility Dive. That allows the houses to support optimal system power flow for HCE.
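The following is a loose, hypothetical sketch of that idea: estimating the power a battery can actually deliver from telemetry such as state of charge, cell temperature and cycling history rather than trusting the nameplate rating. The data and model are invented for illustration and do not represent the Helia controller.

```python
# Hypothetical sketch: estimate a battery's usable discharge power from telemetry
# instead of its vendor rating. Synthetic data for illustration only; not Helia's model.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
n = 500

state_of_charge = rng.uniform(0.1, 1.0, n)   # fraction of full charge
cell_temp_c = rng.uniform(-5, 40, n)         # cell temperature
cycles = rng.uniform(0, 3000, n)             # lifetime charge-discharge cycles

# "Observed" deliverable power (kW): degrades with cold cells and battery age.
rated_kw = 5.0
deliverable_kw = (rated_kw * state_of_charge
                  * (1 - 0.004 * np.clip(10 - cell_temp_c, 0, None))
                  * (1 - 0.00005 * cycles)
                  + rng.normal(0, 0.1, n))

X = np.column_stack([state_of_charge, cell_temp_c, cycles])
model = Ridge().fit(X, deliverable_kw)

# Predict what one battery can actually deliver right now, versus the 5 kW nameplate.
print("estimated deliverable kW:", model.predict([[0.8, 2.0, 1200]])[0])
```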

A similar pilot testing ML's ability to optimize battery dispatch proved successful for Colorado's United Power, National Rural Electric Cooperative Association Analytics Research Program Manager David Pinney told Utility Dive. "The machine learning algorithm was able to forecast the co-op's optimal dispatch of the various energy storage applications three days ahead."

Duke Energy is "beginning to use machine learning capabilities, especially in the areas of data analysis and predictive analytics," Duke spokesperson Jeff Brooks told Utility Dive in an email. There is still human supervision in Duke's remote switches and reclosers that enable reconfiguring and rerouting in response to outages, but some of it is done with "scripted algorithms and processes."

"Breakthroughs in computational and data processing capabilities make it possible for algorithms to learn through interactions with the grid environment."

Qiuhua Huang

Research Engineer, Pacific Northwest National Laboratory

Over the next three to five years, Duke data scientists plan to "develop and deploy" AI and ML that will more fully automate analytics, outage management and power flow, Brooks said.

Transmission system operators have been slower to move toward automation, but DOE-funded national lab research is now focused on ML algorithms that train neural networks to process system data, researchers told Utility Dive.

Neural networks are sets of algorithms designed to recognize and order patterns in analyzed data, and ML can train them to assist transmission system operators facing sudden large voltage fluctuations, Pacific Northwest National Laboratory (PNNL) research engineer Qiuhua Huang told Utility Dive. In early-stage simulations, algorithms responded in milliseconds to prevent voltage instability and cascading outages.

"Breakthroughs in computational and data processing capabilities make it possible for algorithms to learn through interactions with the grid environment," he said. "ML observes historic and real time data and learns to produce good outcomes in a way that seems beyond human intuition."

ML is being used to train a different type of neural network to respond to a transmission line failure caused by a demand spike, weather event or cyberattack, Argonne National Laboratory research scientist Kibaek Kim told Utility Dive. In simulations, it has responded 12 times faster than human operators do today and adjusted the voltages "automatically from the trained model."

The goal is to take system operators "out of the loop" and leave "the decisions from end to end" to AI, National Renewable Energy Laboratory (NREL) research scientist Yingchen Zhang told Utility Dive. It will take time and trials because, unlike an easily discarded ML-selected Netflix movie choice, "the wrong power system decision could cause a blackout across the system and that is not acceptable."

The need for trials to validate reliability is a major reason ML and AI have seen little deployment, Kim said. Another is that, as Wong noted, utilities and system operators do not yet have the hardware and software to use them, although that is being resolved with new cost-effective access to cloud computing, he said.

More significantly, utilities are reluctant because researchers "do not fully understand the underlying mathematics" of the neural networks and "why they work so well," Kim said. "The understanding will come, but I don't know when."

Until then, an algorithm could encounter something unknown and respond incorrectly, he said.

It is clear ML works the way it was designed "because input provided to the algorithm is learned and performs as intended," PNNL's Huang said. "But arguably we don't know exactly how a neural network selects from the inputs and comes to the final decision because there is so much complicated processing to reach that decision."

Research is now directed at the question, he added. The likely explanation is that neural networks are using "a totally different way of interpreting these equations with some higher-level logic."

While the question is being answered, power system operators must decide how to proceed.

Deciding whether ML and AI are needed to address operations long addressed with data analytics depends on two factors, GE Digital's Schnugg said. One is who the system operator is, and the other is what problem the operator wants to address.

ML algorithms "tend to have the most impact when modeling events that have occurred many times, rather than Black Swan events without a data pattern."

Matt Schnugg

VP for Data and Analytics, GE Digital

Guidelines for making the decision "are not canonical," but "there are parameters for when it is best to use AI and machine learning," he said. "First, you have to have a tremendous amount of data cleaned and ready for the algorithm to be trained and built. That is just table stakes. Access to the cloud is usually the most cost-effective way to have that."

Second, ML algorithms "tend to have the most impact when modeling events that have occurred many times, rather than Black Swan events without a data pattern," he said. GE's new Storm Readiness application is built on the history of repeated outages from storms. "Storm readiness is the output of the model. The more outages there are to study, the more accurate the model can be."

Third, modeling must pass the 'yeah, no, duh test' by solving a real problem, Schnugg said. "ML is not needed to predict the sun will rise tomorrow, but if a decision about something very data-rich that occurs repeatedly could lead to appreciably better performance, it is worthy of using AI and ML to build a predictive model."

There are two definitions for "better performance," he added. "It can be a more accurate prediction, or it can be saving time while achieving the same accuracy. An ML-based predictive model that automates a process or a series of decisions that would take a human much longer adds tremendous value."

In GE's new Storm Readiness product, ML algorithms build and train a neural network to learn the system's weather and performance data history. It can then predict 72 hours in advance where the storm will hit the system and what resources will be needed to address its impacts.
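A rough, hypothetical sketch of that kind of outage model follows: a generic regressor trained on invented storm features (wind, rain, vegetation exposure) and synthetic outage counts, then asked to predict outages for a forecast storm. It illustrates the "more outages to study, more accurate the model" idea and is not GE's Storm Readiness implementation.

```python
# Illustrative sketch of predicting storm outages from outage history and weather.
# Synthetic data and invented features; not GE's Storm Readiness model.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
n_storms = 400

wind_gust_mph = rng.uniform(20, 90, n_storms)
rain_in = rng.uniform(0, 6, n_storms)
tree_density = rng.uniform(0, 1, n_storms)   # crude proxy for vegetation exposure

# "Historical" outage counts, rising sharply with wind and vegetation exposure.
outages = np.maximum(
    0,
    0.05 * (wind_gust_mph - 30) ** 2 * tree_density + 3 * rain_in
    + rng.normal(0, 5, n_storms),
).round()

X = np.column_stack([wind_gust_mph, rain_in, tree_density])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, outages)

# Forecast for an incoming storm: 65 mph gusts, 2 inches of rain, heavy tree cover.
print("expected outages:", model.predict([[65, 2.0, 0.9]])[0])
```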

In contrast, its new Network Connectivity product relies entirely on traditional data analytics to manage transmission and distribution system assets. The objective is to optimize the utility's business activities, from hardware maintenance to truck rolls.

The GE Effective Inertia application is a hybrid tool that combines real time transmission system data analytics and an ML-based load and generation forecasting algorithm. It anticipates fluctuations in system inertia 24 hours in advance from momentary supply-demand imbalances caused by rising levels of variable renewables, and informs cost-effective reserve procurements to stabilize the fluctuations.
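For a sense of the quantity being forecast, the short sketch below computes an MVA-weighted system inertia constant from a made-up list of synchronous units that a forecast says will be online, and flags a hypothetical low-inertia threshold. The fleet data and threshold are invented; GE's application works from real-time transmission data and its own forecasting algorithm.

```python
# Hypothetical sketch: aggregate system inertia from whichever synchronous units a
# forecast says will be online, to flag low-inertia hours. Made-up fleet data and
# threshold; not GE's Effective Inertia application.
units = [
    # (name, inertia constant H in seconds, rating in MVA, forecast online?)
    ("coal_1", 4.0, 600, True),
    ("gas_cc_1", 5.0, 400, True),
    ("gas_ct_1", 3.5, 150, False),   # displaced by forecast wind output
]

online = [(h, s) for _, h, s, up in units if up]
total_mva = sum(s for _, s in online)
system_h = sum(h * s for h, s in online) / total_mva   # MVA-weighted inertia

print(f"forecast system inertia: {system_h:.2f} s on {total_mva} MVA online")
if system_h < 4.0:   # illustrative threshold for procuring extra reserves
    print("low-inertia hour: schedule fast-frequency reserves")
```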

"The cloud has democratized access to data, and now it is the quality of the data and the quality of the question being asked that are most important," Schnugg said. "ML and AI are only part of the value. The biggest value is helping the utility solve its problem."
