Through pioneering neuromorphic computing research, Yiran Chen is developing brain-inspired hardware neurons that could lead to faster, smarter and more energy‑efficient AI.
Building Computers That Think Like Brains
When Yiran Chen imagines the next generation of computing, he doesn’t just see faster processors—he envisions machines that learn, adapt and reason the way people do.
Chen, the John Cocke Distinguished Professor of Electrical and Computer Engineering (ECE), recently received funding from the U.S. Department of Energy as part of a five-year, nearly $11 million multi-institution collaboration to help make that vision real. The federal investment at Duke is intended to push the boundaries of neuromorphic computing research, a field that could redefine how computers process information and consume energy.
The goal: to create and test hardware “neurons” that could one day power computers capable of mimicking biological brains.
“It’s a very ambitious goal,” Chen said. “We want to see if we can design hardware that behaves like biological neurons. If we can, the next question is: Can it scale?”
Lessons from Neuroscience
Even today’s most advanced artificial intelligence (AI) systems run on a fundamentally inefficient architecture: data must constantly shuttle between separate memory and processing units, often across networks or into the cloud. All that back-and-forth wastes time and energy.
Neuromorphic computing takes a different approach. Instead of imitating the brain through software, it seeks to build hardware that mirrors how the brain naturally works, where countless neurons work in parallel, each processing and storing information in the same location and communicating through intricate webs of synaptic connections.
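The co-location of storage and processing described above can be illustrated with a classic textbook abstraction, the leaky integrate-and-fire neuron. This is a minimal sketch for intuition only, not the project's actual hardware design; all parameter names and values here are illustrative.

```python
def lif_neuron(inputs, leak=0.9, threshold=1.0):
    """Simulate one leaky integrate-and-fire neuron over a series of inputs.

    inputs: per-timestep input currents; returns a list of 0/1 spikes.
    """
    v = 0.0          # membrane potential: state stored *with* the neuron
    spikes = []
    for current in inputs:
        v = leak * v + current   # integrate the input, with gradual leak
        if v >= threshold:       # fire once the threshold is crossed
            spikes.append(1)
            v = 0.0              # reset after a spike
        else:
            spikes.append(0)
    return spikes

# A steady sub-threshold drive accumulates until the neuron fires.
print(lif_neuron([0.4] * 10))   # -> [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
```

Note that the neuron's "memory" (its membrane potential) and its "computation" (integration and thresholding) live in the same place, which is the property neuromorphic hardware reproduces in circuitry.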
For nearly two decades, Chen has been a leader in this field. His lab helped pioneer the memristor, a kind of electrical switch that “remembers” its state even when powered off. The memristor’s ability to store and process information simultaneously makes it the foundation of most neuromorphic hardware efforts today.
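A toy model can show why that "remembering" property matters. The sketch below is illustrative only (it ignores real device physics): the device's conductance acts as a stored synaptic weight that is adjusted by voltage pulses and persists between operations, so a single element can both hold a value and apply it during computation.

```python
class Memristor:
    """Toy memristor: conductance is the stored state (a synaptic weight)."""

    def __init__(self, g=0.5, g_min=0.0, g_max=1.0, step=0.1):
        self.g = g                                  # stored conductance
        self.g_min, self.g_max, self.step = g_min, g_max, step

    def pulse(self, polarity):
        """A positive pulse raises conductance; a negative pulse lowers it."""
        self.g = min(self.g_max, max(self.g_min, self.g + polarity * self.step))

    def read(self, voltage):
        """Ohm's law: current = conductance * voltage.
        Reading applies the stored state in place -- no separate memory fetch."""
        return self.g * voltage

m = Memristor()
for _ in range(3):
    m.pulse(+1)                  # "train" the device upward
print(round(m.read(1.0), 2))     # the tuned conductance persists: 0.8
```

In a real crossbar of such devices, many reads happen in parallel, which is what makes the architecture a natural substrate for neural networks.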
Engineering the Brain in Silicon
In the new DOE award, three research teams will explore different technologies for constructing the “neuronal primitives”—the basic units of a brain‑inspired computer.
At Duke, Chen’s group is re-engineering conventional smartphone chips to behave like biological neurons. Collaborators at the University of Delaware and George Washington University are integrating emerging materials called ferroelectric field‑effect transistors and memristors to give these circuits the ability to adapt and learn.
Working with domestic semiconductor foundries, the team will prototype these three types of hardware neurons on real silicon wafers. Over the coming years, the researchers aim to progress from single-neuron experiments to large-scale networks that can exchange signals and demonstrate simple learning.
“It’s a long journey from emulating one neuron to the brain’s 86 billion neurons,” Chen said. “But if we can get even tens or hundreds of hardware neurons to communicate and cooperate, that would mark a huge success and a foundation we can scale.”
Why It Matters
The potential impact of neuromorphic computing research extends far beyond energy savings. If successful, the technology could give scientists powerful yet low-power computing tools and provide entirely new ways for machines to learn and reason.
By closing the gap between memory and computation, neuromorphic circuits could tackle complex problems that defy traditional digital systems—from autonomous robotics to real-time scientific analysis in remote environments.
“It’s not just about mimicking the brain,” Chen said. “It’s about understanding how the brain achieves such efficiency and applying that understanding to build computers that can keep up with the data and intelligence demands of the future.”
Similar efforts funded by the federal government nearly two decades ago faltered when the technology couldn’t match the ambition. Chen believes the field is ready now, thanks to advances in materials, modeling tools and collaboration.
“This is a critical time,” Chen said. “With better devices, knowledge and motivation, we can redefine what computing looks like for the next generation.”