Complex Systems: Better Understanding the Risks
Monitoring the reliability and overall health of large, complex engineering systems—such as ships or nuclear power plants—can be tricky. These systems are designed to be resilient, and failures are rare. Their scale and complexity mean they can't be tested the way a smaller component, such as a light bulb, can.
A paper by Lewis, with Groth as co-author, has now been published as the cover story in the latest issue of Algorithms. It is Lewis's first publication.
“Complex engineering systems are large-scale systems that have human, hardware, and software components all interacting with each other,” Lewis said. “It can be very hard to determine how, when a single component fails, this impacts the system at large. We are using Bayesian modeling methods that take advantage of conditional probabilities and causal relationships to identify how those individual components—hardware, human, software—factor into the overall health of the system that we’re studying.”
Bayesian statistical methods start with a priori assumptions about a system, expressed as a probability distribution; these assumptions can then be updated as new information comes in, allowing for an evolving assessment of the system's behavior. Combining these methods with graph theory allows the integration of information from multiple sources.
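The updating step described above can be sketched with a simple conjugate model. This is a minimal illustration, not the paper's actual method: it assumes a Beta prior on a single component's failure probability and updates it as pass/fail observations arrive. All numbers are invented for the example.

```python
# Minimal sketch of Bayesian updating: a Beta prior on a component's
# failure probability is revised as new inspection data comes in.
# The prior and the observation counts here are illustrative only.

def update_beta(alpha, beta, failures, successes):
    """Conjugate Beta-Bernoulli update: add observed counts to the prior."""
    return alpha + failures, beta + successes

# Weak prior belief that failures are rare: mean = 1 / (1 + 9) = 0.1.
alpha, beta = 1.0, 9.0

# New information arrives: 2 failures in 50 inspections.
alpha, beta = update_beta(alpha, beta, failures=2, successes=48)

posterior_mean = alpha / (alpha + beta)  # updated failure probability
print(round(posterior_mean, 3))  # 0.05
```

The same update can be applied repeatedly as each new batch of data arrives, which is what makes the assessment "evolving" rather than fixed at design time.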
“The beauty of this approach is that we can take information from a wide range of sources, and not just an individual relationship,” Lewis said. “For example, if I were to just conduct light bulb sampling, I would need to test the same light bulb over and over again to make any assessments. But if I understand the relationships between temperature, pressure, or how these things interact, I can start to pull in information from outside and use it to help determine whether or not the system fails. It’s those causal relations we can now consider: what happens when we have high temperature, high pressure, or both.”
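The causal reasoning Lewis describes can be made concrete with a toy two-cause model. This is a hypothetical sketch, not the model from the paper: failure is assumed to depend on temperature and pressure through a conditional probability table, with the two causes treated as independent, and every probability invented for illustration.

```python
# Toy causal model: P(failure) depends on whether temperature and
# pressure are high. All probabilities are made up for illustration.

p_high_temp = 0.2
p_high_pressure = 0.3  # temperature and pressure assumed independent here

# Conditional probability table: P(failure | temperature, pressure)
p_fail = {
    (True, True): 0.40,   # high temperature AND high pressure
    (True, False): 0.10,  # high temperature only
    (False, True): 0.08,  # high pressure only
    (False, False): 0.01, # normal operation
}

# Marginal failure probability, by enumerating the joint distribution.
total = 0.0
for temp in (True, False):
    for pres in (True, False):
        p_t = p_high_temp if temp else 1 - p_high_temp
        p_p = p_high_pressure if pres else 1 - p_high_pressure
        total += p_t * p_p * p_fail[(temp, pres)]

print(round(total, 4))  # 0.0628
```

Because the causes enter through a conditional table rather than direct testing of the failing component, observations of temperature and pressure alone already shift the failure estimate, which is the point of Lewis's light-bulb contrast.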
Bayesian methods are particularly well-suited to the data-saturated environment in which most technologies now operate, Lewis said. From sensors to maintenance reports, multiple information streams are available—and can be fed into a Bayesian model.
An important feature of such models is their ability to select, from an abundance of information, that which is most useful for studying a particular system, Groth said.
“We have a lot of information but not all of it is relevant to certain types of problems,” she said. “What we’re trying to do is be greedy about how we get information and make sure we are getting the right information, but not so much information that we’re overwhelmed by it.”
“This is really a first step, but it’s an exciting first step,” Groth said. “We’re dealing with a high volume and variety of information sources and using them to conceptualize what is happening in a complex system. This is something that’s never been done in our field.”
Published April 28, 2020