Protecting Military Personnel from Potential Augmented Reality Pitfalls

11/8 Pratt School of Engineering

Maria Gorlatova has won a Defense Advanced Research Projects Agency (DARPA) Young Faculty Award to protect people from a blind spot of augmented reality research


Apple’s Vision Pro may not look quite as sleek as Tony Stark’s designer eyewear in Iron Man, but the impressive augmented reality (AR) headset will hit the shelves in spring 2024. These kinds of futuristic heads-up displays are quickly becoming reality—there’s even an AR contact lens being pursued by Mojo Vision, which claims it will produce a fully functioning prototype by the year’s end.

As is the case with many developing technologies, AR will also play a key role in protecting military personnel in dangerous situations. The Department of Defense is already testing prototypes of a platform called the Integrated Visual Augmentation System, or IVAS for short, that promises to give soldiers and squads a fuller understanding of their operational environment.

For example, IVAS uses low-light and thermal sensors to improve target identification, integrates with ground and air platform sensors so soldiers can see outside of vehicles before dismounting into a hazardous situation, and provides 3D mapping and navigation capabilities augmented with data from unmanned aerial vehicles.

Such systems could also give non-specialists simple instructions outside of combat—walking them through fixing common engine problems, for example—or guide them in providing lifesaving first aid in the heat of battle.

For all of the technical leaps this technology has made in recent years, however, there are very real concerns about how to ensure AR protects users rather than distracts them or, even worse, exposes them to malicious attacks.

“This is an understudied aspect of AR because of how difficult it is to gauge AR’s effectiveness to ensure that it’s saving lives,” said Maria Gorlatova, the Nortel Networks Assistant Professor of Electrical and Computer Engineering at Duke. “How does one objectively measure whether a tactical display is helping users or not?”

It’s a difficult task, but one that Gorlatova has been keen to pursue. Now, funded by a new Defense Advanced Research Projects Agency (DARPA) Young Faculty Award, she will spend the next three years designing methods and technologies for assessing whether specific elements of AR experiences are actually protecting users by helping them achieve their goals.

The objective of the DARPA Young Faculty Award (YFA) program is to identify and engage rising stars in junior research positions, emphasizing those without prior DARPA funding, and expose them to Department of Defense (DoD) needs and DARPA’s program development process.

The YFA program provides funding, mentoring, and industry and DoD contacts to awardees early in their careers so they may develop their research ideas in the context of national security needs. The long-term goal of the YFA program is to cultivate the next generation of academic scientists, engineers and mathematicians, who will focus a significant portion of their career on DoD and national security issues.


Gorlatova’s project will examine the process of AR experience design, testing and monitoring to bring forward two key approaches for protecting users of AR applications beyond the realm of a vehicle’s tactical displays. First, machine learning algorithms can compare fields of view with and without AR overlays to determine whether the graphics are blocking important details in the surroundings. Second, she plans to use eye-tracking software to monitor where and when a user’s attention is focused, to help determine whether the graphics are helpful or distracting.

As an initial test case, Gorlatova will develop these tools to examine the attention patterns of those using AR guides to complete a relatively simple and specific task—completing a sudoku puzzle. By studying changes in attention patterns and physical eye movement data as elements of the AR guide change, she hopes to create a set of metrics that is generalizable across many different critical tasks to ensure the platforms are protecting their users in high-stakes situations.

“There’s a distinct difference between the eye patterns of people using graphic overlays or trying to ignore them as well as between experts and novices at a specific task,” Gorlatova said. “Measuring usefulness like this requires a lot of data and is difficult to do, but I’m optimistic that we can get this right.”

Human-Machine Research at Duke

Close collaboration with industry and government partners enables Duke to discover new ways to secure systems, predict vulnerabilities and deploy countermeasures.
