Machine Learning Protecting Human Privacy

Duke Engineers are pioneering techniques to scan a person’s face while ignoring certain sensitive features that someone might not want their device to access

Have you ever wondered just how much information your phone learns about your identity when you use facial recognition to unlock it? In this era of rapid technological growth, futuristic advances often arrive paired with invasions of privacy, leaving us to grapple with how much functionality we are willing to trade for our privacy. Researchers at Duke University are using machine learning to address this dilemma.

Machine learning is a burgeoning area of study that has been applied in many fields, from self-driving vehicles to personal assistant devices like Siri. It is the science of creating mathematical algorithms that allow computers to "think" and "act" like humans without being explicitly programmed for each task. Through machine learning, computers use data models to perform tasks based on patterns and learned "reasoning," rather than a specific set of instructions.

In a collaboration within the Rhodes Information Initiative at Duke, Galen Reeves, assistant professor of electrical and computer engineering and statistical science, and Guillermo Sapiro, the James B. Duke Distinguished Professor of Electrical and Computer Engineering, tackled the problem of user privacy in facial recognition software. With their team, including PhD students Martin Bertran and Natalie Martinez, and Professor Miguel Rodrigues from University College London, they developed a way for a device to scan a person's face without recognizing certain sensitive features that someone wouldn't necessarily want their device to access, such as gender, ethnicity or emotion.

The group leveraged deep learning and information theory to solve the challenge of hiding unwanted information, say a person's gender, while still allowing a phone to recognize the individual trying to access it. To balance this trade-off, they created both a "secret" variable (S), representing the information to hide, and a "utility" variable (U), representing the information to preserve. Together, these variables give rise to the observed data.

In the case of gender and facial recognition, these variables are of course highly dependent on one another. However, a certain mathematical transformation can preserve U while discarding S; the remaining, obfuscated data is then fed to the device. The team also set up an optimization function to measure just how much one could learn about S from observing the final data. This work was published in the Proceedings of the International Conference on Machine Learning (ICML '19).
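The idea of a transformation that keeps U while discarding S can be illustrated with a deliberately simplified sketch. This is our own toy example, not the authors' model: here U and S are independent and live along separate, known feature directions, whereas real facial data entangles them and the paper learns the transformation with deep networks. The linear-predictor check at the end plays the role of the paper's optimization that measures how much an observer could still learn about S.

```python
import numpy as np

# Toy setup (an illustrative assumption, not the published method): each
# observation mixes a "utility" signal U with a "secret" signal S plus noise.
rng = np.random.default_rng(0)
n, d = 2000, 8

u = rng.standard_normal(n)             # utility variable U (e.g. identity)
s = rng.standard_normal(n)             # secret variable S (e.g. gender)

dir_u = np.zeros(d); dir_u[0] = 1.0    # feature direction carrying U
dir_s = np.zeros(d); dir_s[1] = 1.0    # feature direction carrying S
x = np.outer(u, dir_u) + np.outer(s, dir_s) + 0.1 * rng.standard_normal((n, d))

# "Sanitizing" transformation: project the data onto the orthogonal
# complement of the S-carrying direction, zeroing out that component.
P = np.eye(d) - np.outer(dir_s, dir_s)
x_clean = x @ P

def r2(features, target):
    """Fraction of target variance explained by the best linear predictor."""
    w, *_ = np.linalg.lstsq(features, target, rcond=None)
    return 1.0 - np.mean((features @ w - target) ** 2) / np.var(target)

util_r2 = r2(x_clean, u)   # stays close to 1: U survives the transformation
leak_r2 = r2(x_clean, s)   # drops close to 0: little about S can be recovered
```

In this idealized case the projection removes S exactly; the hard part the Duke team addresses is doing something analogous when, as with gender and facial identity, the secret and utility signals are deeply intertwined rather than sitting in separate directions.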

Image: Three rows of five famous faces, each becoming progressively more obscured.

The motivation behind this research comes from a belief that privacy is a fundamental human right that is increasingly threatened in the digital age. According to Bertran and Martinez, this important application of machine learning can "help us preserve some measure of privacy but still allow us to participate and benefit from 'digital life.'" Instead of looking at worst-case scenarios in which the person attempting to violate the privacy of others has access to infinite information and computing resources, they chose a unique approach, prioritizing both the privacy and utility of an individual user. In the future, the team wishes to learn more about data sharing between users. This would aid in developing "a more precise measure of what the user expects as a utility to cater our algorithms to preserve this with greater fidelity."

Although this work was applied to hiding gender and emotion information, it provides a baseline for minimizing many kinds of sensitive information leakage. When asked what inspires their work, Bertran and Martinez replied, "We believe that privacy is a basic human right and that privacy in the digital age is rapidly eroding." Their research with machine learning techniques has taken significant steps toward easing these privacy concerns.

Chloe Derocher is a first-year undergraduate student planning to major in biomedical engineering or physics.