Cybersecurity for Autonomous Systems

1/6/21 Pratt School of Engineering

Miroslav Pajic works on 'assured autonomy' for systems with high-level autonomy and low human control and oversight


From the Duke Research Blog

Over the past few decades, we have adopted computers into virtually every aspect of our lives, but in doing so, we've made ourselves vulnerable to malicious interference or hacking. I had the opportunity to talk about this with Miroslav Pajic, the Dickinson Family Associate Professor in Duke's electrical and computer engineering department. He has worked on cybersecurity in self-driving cars, medical devices, and even U.S. Air Force hardware.

Pajic primarily works on "assured autonomy": systems that do most things by themselves, with "high-level autonomy and low human control and oversight." "You want to build systems with strong performance and safety guarantees every time, in all conditions," Pajic said. Assured autonomy means guaranteeing safety and security even in "contested environments," where malicious interference can be expected. The stakes of this work are incredibly high. The danger of attacks on military equipment goes without saying, but attacks on civilian systems can be just as dangerous. "Imagine," he told me, "that you have a smart city coordinating traffic and that… all of (the traffic controls), at the same time, start doing weird things. There can be a significant impact if all cars stop, but imagine if all of them start speeding up."

Since Pajic works with Ph.D. students and postdocs, I wanted to ask him how COVID-19 has affected his work. As if on cue, his Wi-Fi cut out and he dropped from our Zoom call. "This is a perfect example of how fun it is to work remotely," he said when he returned. "Imagine that you're debugging a fleet of drones… and that happens."

In all seriousness, though, there are simulators built for working on cybersecurity and assured autonomy. CARLA, for one, is an open-source simulator for autonomous driving research developed with support from Intel. Even outside of a pandemic, these simulators are used extensively in the field. They deliver accurate, inexpensive results without any real-world risk, before researchers graduate to physical tests.
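To give a flavor of what working in such a simulator looks like, here is a minimal sketch using CARLA's Python API. It assumes the `carla` package is installed and a CARLA server is running locally on the default port 2000; the scenario itself (spawning a vehicle and letting the autopilot drive) is illustrative rather than anything described in the article.

```python
# Minimal CARLA sketch: connect to a local simulator, spawn a vehicle,
# and let the built-in autopilot drive it for a short while.
import random
import time

import carla

# Connect to the simulator and fetch the active world.
client = carla.Client("localhost", 2000)
client.set_timeout(10.0)
world = client.get_world()

# Pick a vehicle blueprint and a spawn point, then spawn the vehicle.
blueprint = random.choice(world.get_blueprint_library().filter("vehicle.*"))
spawn_point = random.choice(world.get_map().get_spawn_points())
vehicle = world.spawn_actor(blueprint, spawn_point)

try:
    # A real experiment would attach sensors here and inject faults or
    # attacks to see how the control stack responds; we simply observe.
    vehicle.set_autopilot(True)
    time.sleep(30.0)
finally:
    # Clean up the actor so repeated runs don't clutter the simulation.
    vehicle.destroy()
```

Because a run like this costs nothing but compute time, researchers can crash the virtual car thousands of times while probing a design, which is exactly the point of failing in simulation first.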

“If you’re going to fail,” Pajic says, “you want to fail quickly.”