Welcome Your New Lab Mate: Artificial Intelligence

10/24/25 Pratt School of Engineering

Duke engineers built an AI optical microscope that analyzes 2D materials as precisely as human experts.

Haozhe “Harry” Wang and Jingyun “Jolene” Yang lean over a microscope in a lab.

Haozhe “Harry” Wang’s electrical and computer engineering lab at Duke welcomed an unusual new lab member this fall: artificial intelligence.

Using publicly available AI foundation models such as OpenAI’s ChatGPT and Meta’s Segment Anything Model (SAM), Wang’s team built ATOMIC (short for Autonomous Technology for Optical Microscopy & Intelligent Characterization)—an AI microscope platform that can analyze materials as accurately as a trained graduate student in a fraction of the time.

“The system we’ve built doesn’t just follow instructions, it understands them,” Wang said. “ATOMIC can assess a sample, make decisions on its own and produce results as well as a human expert.”

Published on October 2 in the journal ACS Nano, the findings point to a new era of autonomous research, where AI systems work alongside humans to design experiments, run instruments and interpret data.


We still need humans to interpret what the AI finds and decide what it means. But once you have a partner that can complete weeks of analysis in mere seconds, the possibilities for new discoveries are exponential.

Haozhe “Harry” Wang, Assistant Professor of Electrical and Computer Engineering

How ATOMIC Works

Wang’s group studies two‑dimensional (2D) materials, crystals only one or a few atoms thick that are promising candidates for next-generation semiconductors, sensors and quantum devices. Their exceptional electrical properties and flexibility make them ideal for electronics, but fabrication defects can compromise these advantages. Determining how the layers stack and whether they contain microscopic defects requires laborious work and years of training.

“To characterize these materials, you usually need someone who understands every nuance of the microscope images,” Wang said. “It takes graduate students months to years of high-level science classes and experience to get to that point.”

ATOMIC sorts and analyzes microscope images on its own.

To speed up the process, Wang’s team linked an off‑the‑shelf optical microscope to ChatGPT, allowing the model to handle basic operations like moving the sample, focusing the image and adjusting light levels. Layered on top was SAM, an open‑source vision model designed to identify discrete objects, which for materials samples means picking out defect‑containing regions and pristine areas.
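The article does not include the team’s code, but the region-finding step SAM performs can be illustrated with a toy stand-in: grouping foreground pixels of a binary mask into connected regions, much as a segmentation model proposes distinct areas in a microscope image. Everything here (the function name, the mask layout) is an illustrative assumption, not the paper’s implementation.

```python
from collections import deque

def segment_regions(mask):
    """Toy stand-in for SAM's region proposals: group foreground
    pixels (value 1) into 4-connected regions via breadth-first
    search. Returns a list of pixel-coordinate sets, one per region."""
    rows, cols = len(mask), len(mask[0])
    seen = set()
    regions = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] == 1 and (r, c) not in seen:
                region, queue = set(), deque([(r, c)])
                seen.add((r, c))
                while queue:
                    y, x = queue.popleft()
                    region.add((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] == 1 and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            queue.append((ny, nx))
                regions.append(region)
    return regions

# A hypothetical 4x5 mask containing two separate flakes
mask = [
    [1, 1, 0, 0, 0],
    [1, 0, 0, 0, 1],
    [0, 0, 0, 1, 1],
    [0, 0, 0, 0, 1],
]
print(len(segment_regions(mask)))  # 2 connected regions
```

A real segmentation model works on raw pixel intensities rather than a pre-made binary mask, but the output (a set of candidate regions to analyze) plays the same role in the pipeline.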

Together, the two AIs formed a powerful tool in the lab, a kind of virtual lab mate that could see, analyze and act on its own.

Turning general-purpose AI into a reliable scientific partner, however, required significant customization from the Wang lab. SAM could recognize regions within the microscopic images, yet it struggled with overlapping layers, a common issue in materials research. To overcome that, they added a topological correction algorithm to refine those regions, isolating single-layer areas from multilayer stacks.
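The idea behind the correction step can be sketched with sets of pixel coordinates: pixels claimed by exactly one layer mask are treated as monolayer, while pixels where masks overlap are flagged as multilayer stacks. This is a minimal sketch of the concept under those assumptions, not the lab’s actual topological algorithm.

```python
from collections import Counter

def isolate_monolayer(layer_masks):
    """Toy version of a topological correction: each mask is a set of
    (row, col) pixel coordinates proposed by segmentation. Pixels
    covered by one mask count as monolayer; pixels covered by two or
    more overlapping masks count as multilayer stacks."""
    counts = Counter(px for mask in layer_masks for px in mask)
    monolayer = {px for px, n in counts.items() if n == 1}
    multilayer = {px for px, n in counts.items() if n >= 2}
    return monolayer, multilayer

# Two hypothetical flake masks that overlap at one pixel
flake_a = {(0, 0), (0, 1), (1, 1)}
flake_b = {(1, 1), (1, 2)}
mono, multi = isolate_monolayer([flake_a, flake_b])
print(sorted(multi))  # [(1, 1)]
```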

Finally, the team asked the system to sort the isolated regions by their optical characteristics, which ChatGPT could do autonomously.
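As a rough illustration of this sorting step, one simple optical characteristic is a region’s mean pixel intensity. The ranking criterion and names below are assumptions for the sketch; the paper does not specify which optical features ChatGPT used.

```python
def sort_regions_by_intensity(image, regions):
    """Rank segmented regions by mean pixel intensity, a simple
    illustrative proxy for sorting regions by optical characteristics.
    `image` is a 2D list of brightness values; each region is a set
    of (row, col) coordinates."""
    def mean_intensity(region):
        return sum(image[r][c] for r, c in region) / len(region)
    return sorted(regions, key=mean_intensity)

# Hypothetical 2x3 brightness map with one dim and one bright region
image = [
    [10, 12, 200],
    [11, 90, 210],
]
dim = {(0, 0), (0, 1), (1, 0)}     # mean ~11
bright = {(0, 2), (1, 2)}          # mean 205
ranked = sort_regions_by_intensity(image, [bright, dim])
print(ranked[0] == dim)  # dimmest region sorts first -> True
```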


ATOMIC can see on a pixel-by-pixel level, making it a great tool for our lab.

Jingyun “Jolene” Yang, PhD Student in Duke ECE

The results were remarkable: Across a range of 2D materials, the AI microscope matched or outperformed human analysis, identifying layer regions and subtle defects with up to 99.4 percent accuracy. The system maintained this performance even with images captured under imperfect conditions, such as overexposure, poor focus or low light, and in some cases spotted imperfections invisible to the human eye.

“The model could detect grain boundaries at scales that humans can’t easily see,” said Jingyun “Jolene” Yang, a PhD student in Wang’s lab and first author on the paper. “It’s not magic, however. When we zoom in, ATOMIC can see on a pixel-by-pixel level, making it a great tool for our lab.”

By locating and categorizing microscopic defects, the system helps Wang’s group determine the number of layers in a 2D material and pinpoint pristine regions suitable for follow‑up studies. Those high‑quality areas can then be used for other research in Wang’s lab, such as soft robotics and next-generation electronics.

Even more impressive, the system required no specialized training data. Traditional deep‑learning approaches need thousands of labeled images. Wang’s “zero‑shot” method leveraged the pre‑existing intelligence of foundation models, trained on broad swaths of human knowledge, to adapt instantly.

What It Means for Researchers

For Wang, the excitement isn’t just about speed. It’s also about teaching his students to use the technologies at their disposal to become modern-day researchers.

“In the last year, AI has advanced a lot and Dr. Wang said if we do not embrace this era and make use of these AI tools, they may replace us,” Yang said. “We tested the ATOMIC system on many samples and different conditions, and it’s quite robust.”

Wang sees potential applications across disciplines, from chemistry to biology, where tedious optical analysis often slows progress. Simplifying those workflows could open advanced research to students, industry engineers or anyone with curiosity and a microscope.

At the same time, Wang stresses the importance of keeping humans in the loop. Foundation models can behave unpredictably, sometimes generating different results for identical prompts. His group tested thousands of repetitions to assess robustness and found that while minor variations occur, overall accuracy remains high.

“The goal isn’t to replace expertise; it’s to amplify it,” Wang said. “We still need humans to interpret what the AI finds and decide what it means. But once you have a partner that can complete weeks of analysis in mere seconds, the possibilities for new discoveries are exponential.”

The work is led by Duke in collaboration with Massachusetts Institute of Technology, National University of Singapore, University of California, Berkeley, Australian National University and Google DeepMind. Wang and Zhichu Ren are corresponding authors, and Yang and Ruoyan Avery Yin are the co-first authors. Wang and Yang acknowledge the GCP Credit Award from the Google PaliGemma Academic Program.

CITATION: Yang, J., Yin, R. A., Jiang, C., Hu, Y., Zhu, X., Hu, X., Kumar, S., Holmes, S. K., Wang, X., Zhai, X., Rong, K., Zhu, Y., Zhang, T., Yin, Z., Cao, Y., Tang, H., Franklin, A. D., Kong, J., Gong, N. Z., Ren, Z., & Wang, H. (2025). Zero‑shot autonomous microscopy for scalable and intelligent characterization of 2D materials. ACS Nano, 19(40), 35493–35502. https://doi.org/10.1021/acsnano.5c09057
