Mapping the Invisible

4/1/24 DukEngineer Magazine

Duke researchers are gearing up to take advantage of the new Nancy Grace Roman Space Telescope to explore past and present phenomena of our universe

The assembly wheel, designed to reflect light extremely efficiently.

It’s a rite of passage for Pratt undergraduates: physics. Known for its challenging weekly quizzes and grueling labs, the two-semester Introductory Physics sequence (Physics 151L + 152L) unsurprisingly ends in a collective sigh of relief after the final exam. Students leave confident that they’ve mastered the material.
In reality, they’ve covered less than 5% of the subject. That’s because classes like Introductory Physics or Optics and Modern Physics (Physics 264L) cover the physics of normal matter. According to NASA, however, normal matter makes up less than 5% of the universe by mass; the other 95% consists of two dark components, dark energy and dark matter.

More about dark energy and dark matter is unknown than known. Dark energy appears to be the repulsive energy of empty space, but theoreticians have a hard time matching observations of it to their predictions. Dark matter behaves more like normal matter, but while normal matter interacts electromagnetically with photons and can absorb, reflect, or emit light, dark matter simply doesn’t, making it difficult to locate. In fact, scientists only theorized the existence of dark matter because they observed physical anomalies, like spiral galaxies rotating faster than thought possible given their calculable masses, or photons bending around, seemingly, nothing.

Michael Troxel

One Duke faculty member hoping to uncover the mysteries of the dark universe is Michael Troxel, a professor in the Department of Physics and a member of the Fitzpatrick Institute for Photonics (FIP), an interdisciplinary center aiming to advance photonics and optical sciences. Troxel’s research focuses on a technique called “gravitational lensing” to map the locations and sizes of dark matter masses. Gravitational lensing revolves around the idea that the path of light can be bent by the gravitational pull of an object of large enough mass. Thus, the size and location of massive objects like galaxies and dark matter can be determined by observing how the light path bends around them. The location of dark matter can then be isolated by identifying the locations where no visible mass is present.
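In the simplest textbook case, the bending that gravitational lensing measures follows the point-mass deflection formula from general relativity, α = 4GM / (c²b), where b is the closest distance the light ray passes to the mass. The sketch below is illustrative only (it is not the project's code) and reproduces the classic result for light grazing the Sun.

```python
# Point-mass light-deflection sketch: alpha = 4GM / (c^2 * b),
# where b is the impact parameter (closest approach of the ray).
# All constants are textbook values; nothing here is project data.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg

def deflection_angle(mass_kg: float, impact_m: float) -> float:
    """Bending angle (radians) of a light ray passing a point mass."""
    return 4 * G * mass_kg / (C**2 * impact_m)

# Light grazing the Sun (impact parameter = solar radius, ~6.96e8 m):
alpha = deflection_angle(M_SUN, 6.96e8)
arcsec = alpha * (180 / 3.141592653589793) * 3600
# Classic general-relativity result: roughly 1.75 arcseconds.
```

Larger masses, such as galaxy clusters (and the dark matter within them), bend light by correspondingly larger angles, which is what makes the effect measurable in survey images.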

Daniel Scolnic

Another Duke Physics professor and FIP member shining light on dark matter is Daniel Scolnic, whose research hinges on the concept of “cosmic expansion,” using the light generated by exploding stars — supernovae — to chart the expansion of the universe driven by dark energy. The light emitted from one supernova is roughly equal to that of half a billion stars, so the location of a supernova can be pinpointed by identifying areas where the light captured is considerably greater than it was previously. By measuring the apparent brightness, scientists can determine the distances of supernovae from Earth and use them to understand how dark energy is causing the universe’s expansion to accelerate.
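The brightness-to-distance step can be sketched with the standard distance-modulus relation for “standard candle” supernovae, whose intrinsic luminosity is nearly constant. The apparent magnitude below is illustrative, not a measurement from any survey.

```python
import math

# Standard-candle distance sketch. Type Ia supernovae have a nearly
# constant absolute magnitude M (about -19.3), so an observed apparent
# magnitude m gives the distance via the distance modulus:
#   m - M = 5 * log10(d_parsec / 10)
# The value of m below is invented for illustration.

M_ABS = -19.3  # typical Type Ia absolute magnitude (assumption)

def distance_parsecs(m_apparent: float, m_absolute: float = M_ABS) -> float:
    """Distance (in parsecs) implied by the distance modulus m - M."""
    return 10 ** ((m_apparent - m_absolute + 5) / 5)

d = distance_parsecs(24.0)   # a faint supernova with apparent magnitude 24
d_mpc = d / 1e6              # roughly 4.6 billion parsecs away
```

Comparing such distances against the supernovae’s redshifts is what reveals how the expansion rate has changed over cosmic time.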

In 2016, to more deeply understand dark energy, dark matter, and other celestial mysteries, NASA approved the development of the Nancy Grace Roman Space Telescope. Tuan Vo-Dinh, a professor of biomedical engineering and chemistry who directs the FIP, says that Roman’s mission is unique.

Tuan Vo-Dinh of Duke University

“These two intrepid scientists and members of the Fitzpatrick Institute for Photonics are preparing to embark on a journey with the future NASA Nancy Grace Roman Space Telescope as their explorer vessel, charting the celestial map to unveil the secrets of planetary systems and illuminate the enigmatic realms of dark energy, exoplanets, and infrared astrophysics.”

Tuan Vo-Dinh, Director of FIP and Professor of Biomedical Engineering and Chemistry

The Roman Telescope is considered a successor to the famous Hubble Space Telescope, while sharing some characteristics with the more modern James Webb Space Telescope (JWST). Its main similarity with Hubble is its sensitivity — the faintest sources of light it can detect. While this may seem like a step backward given the exceptional sensitivity of the JWST, Roman’s purpose is to survey large swaths of the night sky, not home in on one celestial system.

“The data that Roman will take in its survey mission would take Hubble or the James Webb Space Telescope 1000 years to gather,” said Troxel.

For both Troxel and Scolnic, capturing a wide field of view is critically important. In gravitational lensing, for instance, it’s crucial to capture as much light as possible; the bending of one path of light does not mean that dark matter is present; rather, it’s the bending of many paths of light in a single area that indicates the presence of dark matter. In addition, for the cosmic expansion method to be effective, finding as many supernovae as possible is a priority. Supernovae, however, are few and far between. “We estimate supernovae at the rate of one per galaxy per century,” said Troxel.

Thankfully, Roman’s imaging cameras can capture a field of view 100 times larger than Hubble’s, allowing the efficient mapping of dark matter.

Roman's Focal Point System, responsible for taking photos of the universe.

Troxel and Scolnic have been working with NASA on their respective research for years. Even before coming to Duke, both worked as part of Roman’s Science Investigation Team (SIT), which performed initial research to determine the space telescope’s design and the instruments on board. When the SIT contract ended after the telescope entered “Phase C” (NASA jargon for telescope construction), both professors were chosen to implement their respective research in the Project Investigation Team, which is building the infrastructure to make the gathered data usable, supported by $12.5 million in grants over five years.

While both methods are reliable ways to measure the components of the universe, they rely heavily on observing the behavior of light. Because atmospheric turbulence distorts light, ground-based observatories cannot gather this data. However, research on the ground may soon become more viable with the help of Duke Data+, a summer research program that aims to efficiently synthesize big data to tackle interdisciplinary challenges. Notably, one recent project (called “Finding Space Junk”) involved students beginning development of a machine-learning algorithm to eliminate extraneous sources of light in support of the Roman Space Telescope.

“The goal of the ‘Finding Space Junk’ project was identifying transient artifacts in images of deep space,” said Kevin Liang, who earned a PhD in electrical and computer engineering at Duke and mentored the Data+ project before joining Meta’s Fundamental AI Research team as a research scientist.


“Things like cosmic rays, satellites, or asteroids are not what we might normally consider ‘junk,’ but they can result in bright streaks or specks in the long-exposure images of the night sky typically used by cosmologists in understanding the universe, which can throw off measurements or calculations of studies on this data. In the past, these artifacts were largely identified manually by volunteers, but the process can be slow and noisy. With the Dark Energy Survey ramping up its image production, it was unclear whether this manual annotation effort would scale sufficiently, potentially posing a significant blocker to the Dark Energy Survey as a whole.”

The Data+ project attempted to tackle the problem with a machine learning approach, exploring how well such methods would be able to help cosmologists find artifacts faster, in an automated fashion.
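The team’s actual approach used machine learning, but the underlying task can be illustrated with a much simpler stand-in: flagging pixels that are anomalously bright relative to an image’s background statistics. Everything below — the synthetic image, the streak, the threshold — is invented for illustration and is not the project’s code.

```python
import numpy as np

# Simplified stand-in for automated artifact detection: flag pixels far
# brighter than the background, the kind of "streaks or specks" that
# cosmic rays and satellites leave in long exposures. The real Data+
# project used machine learning; this sigma-threshold sketch only
# illustrates the problem setup on synthetic data.

rng = np.random.default_rng(0)
image = rng.normal(loc=100.0, scale=5.0, size=(64, 64))  # synthetic sky
image[20, 10:40] += 200.0                                # fake satellite streak

def flag_artifacts(img: np.ndarray, n_sigma: float = 5.0) -> np.ndarray:
    """Boolean mask of pixels more than n_sigma above the median background."""
    background = np.median(img)
    noise = np.std(img)
    return img > background + n_sigma * noise

mask = flag_artifacts(image)
# The 30-pixel streak is flagged; the ordinary sky pixels are not.
```

A learned model replaces the fixed threshold with features inferred from labeled examples, which is what lets it generalize to fainter or odder-shaped artifacts than a simple cut can catch.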

Roman's antennae and satellite dish

Deep learning methods for computer vision often focus on RGB, real-world, object-centric imagery, while space imagery has a dramatically different appearance. “The images are larger, the blackness of space dominates, and the specks and streaks we sought to detect were much thinner or smaller than the typical computer vision dataset targets,” said Liang. “It wasn’t immediately clear as we were starting how well such methods might generalize to this setting.”

Because the imagery hadn’t really been used as a machine learning dataset in the past, the Data+ team members (Pavani Jairam, Rebecca Bell, Jiayue Xu) had to invest a significant amount of engineering effort organizing and formatting the data and loading it into the model for training and inference. Collaboration with cosmology experts — Duke Professor of Physics Christopher Walters and physics postdoc and astronomer Bruno Sanchez, in addition to Troxel and Scolnic — made it possible for the team to explore human-in-the-loop active learning approaches as well, identifying a number of errors in the previous human volunteer annotations and providing corrections. “The resulting models showed high promise, with strong enough recall and precision to be considered for packaging as a tool for the Dark Energy Survey,” said Liang.

The telescope is set to launch in 2027, and rest assured the Duke and space communities will be watching with anticipation to see what Troxel, Scolnic, and the whole Roman team can achieve. “Their pioneering research and interdisciplinary, trail-blazing spirit will reshape our understanding of the universe in the decades to come,” said Vo-Dinh.
