New methods aim to detect subtle faults before they lead to catastrophic failure
A single crack in a bridge is often a subtle change, but may eventually result in the bridge collapsing if it goes undetected. Such phenomena can happen in a variety of systems—from single power line outages leading to blackouts to individual cases of a disease leading to an epidemic.
A team of researchers, led by Professor Venugopal Varadachari Veeravalli and Statistics Assistant Professor George Fellouris, wants to create algorithms that automatically and efficiently detect these small, yet significant, changes. They are working in collaboration with George Moustakides, a professor at the University of Patras in Greece and Rutgers University.
“We want to be able to detect a subtle change in the system that could later affect the entire operation of the system,” said Veeravalli, who is also affiliated with the Coordinated Science Laboratory. “For example, power systems are designed to be resilient to a few line outages, and so it is possible for a small number of line outages in the transmission network to go undetected since they do not affect the loads or the generators in a significant way. But such outages, if left undetected, could eventually lead to catastrophic failures and blackouts. So you want to be able to detect these outages as soon as possible, and take any necessary action to fix them.”
The team, which received more than $1.1 million from the National Science Foundation, will also investigate how the algorithms can detect the spread of disease that leads to epidemics.
“In the beginning, a disease may be prevalent only in particular countries or hospitals before it spreads out,” Fellouris said. “If you can detect it very early, and take appropriate actions in time instead of waiting for it to spread out, this can make a big difference.”
The power grid and epidemiology may seem like two very different fields, but the team is focusing on how to broadly analyze sources of information, whether it be measurements of the electric grid or hospital database information.
“Every system is modeled differently, but the core of our work is to detect, in real time, variations from the ‘business as usual’ model that describes each system,” Fellouris said. “We’re building algorithms that can adapt to various systems in order to quickly and accurately detect changes, while controlling the number of false alarms below a tolerable level.”
False alarms, like a smoke detector going off unnecessarily during dinner preparation, create unwarranted panic. The researchers aim to find the optimal tradeoff between few false alarms and quick detection.
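As a rough illustration of this tradeoff (a minimal sketch of the classical CUSUM procedure from the quickest-change-detection literature, not the team's new algorithms), the code below monitors a single Gaussian data stream, assuming known pre- and post-change means, and raises an alarm only when the accumulated evidence of a change crosses a threshold:

```python
def cusum(stream, mu0, mu1, sigma, threshold):
    """One-sided CUSUM: alarm when accumulated evidence for a shift
    in mean from mu0 ("business as usual") to mu1 exceeds threshold."""
    stat = 0.0
    for t, x in enumerate(stream):
        # Log-likelihood ratio of one Gaussian sample under mu1 vs. mu0
        llr = (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma ** 2
        stat = max(0.0, stat + llr)  # resets to 0 while no change is evident
        if stat > threshold:
            return t  # index of the sample that triggered the alarm
    return None  # no alarm on this stream

# 200 "business as usual" samples, then a subtle shift in the mean
import random
random.seed(1)
data = [random.gauss(0.0, 1.0) for _ in range(200)]
data += [random.gauss(0.5, 1.0) for _ in range(200)]
print(cusum(data, mu0=0.0, mu1=0.5, sigma=1.0, threshold=8.0))
```

Raising the threshold makes false alarms rarer but delays detection of a real change; tuning that threshold is exactly the tradeoff described above.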
Although there is a large body of existing work in this field—known as quickest change detection—going back to the 1920s, there has been renewed interest in the field because of modern applications in areas such as cyber resiliency and healthcare.
“Applications drive the theoretical problems we’re trying to solve,” Veeravalli said. “And we need new algorithms for the new applications.”
Indeed, modern sensor systems have grown enormously, requiring more advanced algorithms to sift through the data to determine exactly when and where significant changes occur.
“Modern systems generate a large amount of data streams that can be monitored in real time, and these subtle changes may be detectable by only a very small portion of these streams,” Fellouris said. “Our challenge is to construct scalable algorithms that are also good from a statistical point of view.”
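One simple way to extend the single-stream idea to this setting, when a change may appear in only a few of many streams, is to run one detection statistic per stream and alarm as soon as the largest crosses the threshold. The sketch below does this with per-stream CUSUM statistics; it is only an illustration under the same Gaussian assumptions, not the scalable algorithms the team is developing:

```python
def multistream_cusum(streams, mu0, mu1, sigma, threshold):
    """Monitor several streams in parallel with one CUSUM statistic each;
    alarm when the largest statistic crosses the threshold, and report
    which stream appears to have changed."""
    stats = [0.0] * len(streams)
    for t, samples in enumerate(zip(*streams)):  # one sample per stream per step
        for i, x in enumerate(samples):
            # Log-likelihood ratio of this sample under mu1 vs. mu0
            llr = (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma ** 2
            stats[i] = max(0.0, stats[i] + llr)
        peak = max(range(len(stats)), key=stats.__getitem__)
        if stats[peak] > threshold:
            return t, peak  # alarm time and the suspected stream
    return None  # no alarm raised

# Three streams; only the middle one shifts away from the baseline mean
streams = [[0.0] * 50, [1.0] * 50, [0.0] * 50]
print(multistream_cusum(streams, mu0=0.0, mu1=1.0, sigma=1.0, threshold=8.0))
```

Because each stream keeps only one running number, the per-step cost grows linearly in the number of streams, which is the kind of scalability concern Fellouris raises.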