Xianyi Cheng: Going Beyond Human Dexterity

9/24/24 Pratt School of Engineering

Cheng’s work focuses on dexterous manipulation in robotics, emphasizing the need for robotic systems that can handle diverse, complicated manual tasks

Xianyi Cheng will join Duke Engineering as an assistant professor in the Thomas Lord Mechanical Engineering and Materials Science Department beginning December 1, 2024. Her expertise in robotics complements the growing efforts the university is investing in the field.

Cheng’s research focuses on robotic manipulation, with a particular emphasis on enhancing robot dexterity, or the ability of robots to perform complex, physical tasks that require precise and adaptable motions.

For Cheng, there’s a significant gap in the current state of robotics on this front.

While artificial intelligence has made real advances in areas like art, language and engineering, robots still struggle with physical tasks that humans find simple, like cleaning a house, assembling complex objects in a factory or working on a construction site.

Most robots today also lack the dexterity required to perform daily tasks autonomously. Cheng’s goal is to develop autonomous robotic systems that can handle these kinds of tasks with the same level of finesse as humans.

“We don’t see a lot of physical robots that can help people do very tedious daily tasks,” Cheng explained. “My ultimate research goal is to achieve this autonomy in robotic manipulation to enable our robots to be able to have dexterity that can do these repetitive tasks for us.”

Using videos as examples, Cheng demonstrates how humans perform extremely intricate hand movements that require fast, reactive control. Simple tasks like arranging objects or picking up groceries involve complex motor skills that are easy to overlook, and robots can’t currently replicate them.

Cheng believes we’re missing the point when we limit dexterity to hands. Think of a person operating a bulky excavator to perform a precise movement—their actions show that the dexterity comes from the person actually guiding the tool, not just from the way the tool has been designed. For Cheng, dexterity isn’t just about having human-like hands but also about the forces driving those robotic actions. She challenges traditional approaches in robotics that aim to mimic the human hand with advanced mechanical designs.

“What I’m really trying to do is rethink and broaden the concept of robot dexterity,” Cheng explained. “Past research has been focused on purely developing dexterous robotic hands to mimic human hands or to enable these robots to move. But in my thesis work, I’ve demonstrated that, for example, even the simplest robot can have dexterity if guided by intelligent algorithms.”

One of Cheng’s most significant contributions is the development of planning algorithms that allow robots to autonomously generate dexterous skills. These algorithms enable robots to manipulate objects using contact interactions, which dictate how the robot touches and moves objects in its environment. She’s developed the first planning framework capable of solving a diverse set of dexterous manipulation tasks.

Cheng’s future goals hit on several key areas. One is whole-body dexterity: enabling robots to use their entire body (e.g., elbows, chest or even hips) for manipulation, much as humans do in many tasks. Another is collaborative robotics, in which multiple robots work together on a single task, enhancing their collective capabilities.

She also describes her vision of dexterous manipulation skills that generalize across tasks and transfer across different robot embodiments.

At Duke, Cheng is interested in collaborating with the medical school on projects like prosthetic hands and assistive robots, which can help people with disabilities. During a performance for blind children, Cheng was inspired to develop technologies that could assist individuals with everyday tasks, like feeding, dressing or fetching, using robotic manipulation.

The university’s collaborative spirit also drew Cheng to MEMS. “I found that engineers at Duke showed a lot of autonomy in their research,” she shared. “And it showed so much deep thinking that I found it really impressive and felt hopeful about future collaborations with them all.”
