Students Dance Their Way Out of “AI Bias”
Class teaches students to consider emotions, facial expressions, cultural differences, cultural similarities and interactions when designing new technologies
Martin Brooke is no ordinary engineering professor at Duke University. He teaches computer scientists, engineers, and technology nerds how to dance.
Brooke co-teaches Performance and Technology, an interactive course in which students create performance projects and discuss the theoretical and historical implications of technology in performance. In a unique partnership with Thomas DeFrantz, a professor of African and African American Studies and Dance, students will design a technology based on "heart," for example, in order to understand how human expression is embedded in technology. Two weeks later, they'll interact with motion-sensing robotic trees that give hugs and 3D-printed hearts that detect colors and match people, sort of like a robotic Tinder.
Brooke loves that the class is fun and interactive, but, more importantly, he loves that it teaches students to consider people's emotions, facial expressions, cultural differences, cultural similarities and interactions when designing new technologies.
A human interface is a computerized program or device that takes input from humans, like an image of a face, and produces an output, like unlocking a phone. For these devices to understand humans, the programmer must first understand how humans express themselves. That means scientists, programmers, and engineers need grounding in a particular school of learning: the humanities. "There are very, very few scientists who do human interface research," Brooke said.
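The face-unlock example above can be sketched in a few lines of code. This is a deliberately simplified, hypothetical illustration: the "faces" are stand-in feature vectors rather than real images, and the matching rule (Euclidean distance under a threshold) is a toy version of what real recognition systems do.

```python
# Minimal sketch of a human-interface loop: human input (a face capture)
# goes in, a device action (unlock or stay locked) comes out.
# All names, data, and the threshold here are illustrative assumptions.

def euclidean(a, b):
    """Straight-line distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def unlock(enrolled_face, captured_face, threshold=0.5):
    """Unlock only when the new capture is close to the enrolled face."""
    return euclidean(enrolled_face, captured_face) < threshold

owner = [0.1, 0.9, 0.3]           # features enrolled by the phone's owner
same_person = [0.12, 0.88, 0.31]  # a fresh capture of the same face
stranger = [0.9, 0.1, 0.7]        # someone else entirely

print(unlock(owner, same_person))  # True: a close match unlocks the phone
print(unlock(owner, stranger))     # False: a distant one does not
```

Everything interesting about real human-interface work is hidden inside the step this sketch skips: turning a raw photo of a human into those feature numbers, which is exactly where an engineer's assumptions about people creep in.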
“There are very, very few scientists who do human interface research.”
Professor of Electrical and Computer Engineering Martin Brooke
The students designed a robotic “Tinder” that changes colors when it detects a match.
Brooke also stressed the importance of understanding human expressions and interactions in order to limit computer bias, which occurs when a programmer's prejudiced assumptions about others are transferred into the computer products they design. For example, several recent studies have shown that facial recognition software misidentifies Black individuals at disproportionately high rates when searching for suspects in criminal cases.
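One common way researchers surface the kind of bias described above is to compare a system's error rate across demographic groups. The sketch below uses fabricated toy data (not results from any real study) to show the arithmetic: same recognizer, very different accuracy depending on the group.

```python
# Toy illustration of a per-group bias audit. The (predicted, actual)
# pairs and the group split are invented for illustration only.

def error_rate(records):
    """Fraction of records where the prediction disagreed with the truth."""
    errors = sum(1 for predicted, actual in records if predicted != actual)
    return errors / len(records)

# Each pair is (system said "match", person truly was a match).
group_a = [(True, True), (False, False), (True, True), (False, False)]
group_b = [(True, False), (False, False), (True, True), (True, False)]

print(error_rate(group_a))  # 0.0: no mistakes on this group
print(error_rate(group_b))  # 0.5: half the identifications are wrong
```

A system that looks accurate on average can still be badly wrong for one group, which is why audits break the numbers down this way rather than reporting a single overall score.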
“It turns out one of the biggest problems with technology today is human interface,” Brooke said. “Microsoft found out that they had a motion sensitive Artificial Intelligence that tended to say women, [more often than men], were angry.” Brooke said he didn’t consider the importance of incorporating the arts and humanities into engineering before coming to Duke. He suggested that it can be uncomfortable for some scientists to think and express themselves artistically. “[When] technologists [take Performance and Technology], for example, they are terrified of the performance aspects of it. We have some video of a guy saying, ‘I didn’t realize I was going to have to perform.’ Yeah, that’s what we were actually quite worried about, but in the end, he’s there in the video, doing slow motion running on stage, fully involved, actually performing, and really enjoying it.”
Duke has a strong initiative to promote the inclusion of arts and humanities in science, technology, engineering, and mathematics. Before the end of the semester, Brooke plans to bring Bass Connections, a research program focused on public outreach and cross-disciplinary work, to his Performance and Technology class to demonstrate bias through a program he calls “AI Bias in the Age of a Technical Elite.”
“You give it someone’s name and it will come up with a movie title, their role, and a synopsis of the movie,” Brooke said. “When I put in my name, which is an English name, it said that the movie I would be in is about a little boy who lives in the English countryside who turns into a monster and terrorizes the town.” The program shows that even something as simple as a name can carry stigma.
Brooke hopes the class leaves students thinking about technology and human interface. “Hopefully that’s a real benefit to them when they get out actually designing products,” he said.