NASA’s James Webb Space Telescope is revealing the universe with breathtaking, unmatched clarity. The observatory’s ultra-sharp infrared vision has pierced through cosmic dust to reveal some of the universe’s earliest structures, along with whirling galaxies and previously hidden stellar nurseries hundreds of millions of light-years away.
Webb will not only look farther into the universe than any telescope before it, but will also capture the most comprehensive view of objects in our own galaxy, including some of the 5,000 planets so far identified in the Milky Way. Astronomers are using the telescope’s light-parsing precision to decode the atmospheres of some of these nearby worlds. The features of an atmosphere could reveal how a planet formed and whether it harbors life.
However, a new MIT study suggests that the tools astronomers typically use to decode these light-based signals may not be accurate enough to interpret the new telescope’s data. To match the precision of Webb’s observations, the researchers argue, opacity models, the tools that describe how light interacts with matter as a function of the matter’s properties, may need significant retuning.
Unless these models are improved, the researchers warn, the inferred properties of planetary atmospheres, such as their temperature, pressure, and elemental makeup, could be off by an order of magnitude.
“There is a scientifically significant difference,” says study co-leader Julien de Wit, “between a compound like water being present at 5 percent versus 25 percent, which current models cannot differentiate.”
According to EAPS graduate student Prajwal Niraula, the model currently being used “to decrypt spectral information is not up to par with the precision and quality of data we have from the James Webb telescope.”
Niraula adds: “We need to up our game and tackle together the opacity problem.”
De Wit, Niraula, and their colleagues have published their research in Nature Astronomy. Their co-authors are Iouli Gordon, a spectroscopy expert at the Harvard-Smithsonian Center for Astrophysics, along with Robert Hargreaves, Clara Sousa-Silva, and Roman Kochanov.
A material’s opacity describes how well it blocks light. Photons of a given wavelength can pass directly through a material, be absorbed, or be reflected back out, depending on whether and how they interact with certain molecules within it. The material’s temperature and pressure also affect these interactions.
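The relationship between opacity and transmitted light can be sketched with the Beer-Lambert law, in which the transmitted fraction falls off exponentially with the material’s absorption cross-section, density, and thickness. The numbers below are purely illustrative, not real molecular data:

```python
import numpy as np

def transmitted_fraction(sigma_cm2, n_per_cm3, path_cm):
    """Fraction of photons passing straight through a uniform slab
    (Beer-Lambert law: transmission = exp(-optical depth))."""
    optical_depth = sigma_cm2 * n_per_cm3 * path_cm
    return np.exp(-optical_depth)

# Illustrative values only: the same slab can be nearly transparent at a
# wavelength where the cross-section is small, and nearly opaque where it
# is a thousand times larger.
thin = transmitted_fraction(1e-24, 1e18, 1e5)   # optical depth 0.1
thick = transmitted_fraction(1e-21, 1e18, 1e5)  # optical depth 100
```

Real opacity models tabulate how the cross-section varies with wavelength, temperature, and pressure for each molecule, which is where the study’s concerns lie.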
An opacity model encodes a set of assumptions about how light and matter interact. Using such a model, astronomers can infer some of a substance’s properties from the spectrum of light it emits. In the case of an extrasolar planet, an opacity model can decode the types and amounts of molecules in the planet’s atmosphere from the planetary light a telescope records.
According to De Wit, the most advanced opacity model, which he compares to a tool for translating a classical language, has done a respectable job of deciphering spectral data obtained by sensors like those on the Hubble Space Telescope.
“So far, this Rosetta Stone has been doing OK,” de Wit adds. “But now that we’re going to the next level with Webb’s precision, our translation process will prevent us from catching important subtleties, such as those making the difference between a planet being habitable or not.”
This is the argument he and his colleagues make in their study, which puts the most widely used opacity model to the test. The team examined what atmospheric properties the model would infer under various assumptions about the limits of our understanding of light-matter interactions, developing eight such “perturbed” models. They then generated “synthetic spectra,” light patterns mimicking those the James Webb telescope will observe, and fed them to each perturbed model as well as the real one.
They found that, given the same light spectra, the different models produced different predictions about a planet’s atmospheric properties. The team concludes that existing opacity models will hit an “accuracy wall” when applied to light spectra from the Webb telescope: they will not be sensitive enough to tell whether a planet’s atmosphere has a temperature of 300 or 600 kelvins, or whether a certain gas makes up 5 percent or 25 percent of an atmosphere.
This distinction, according to Niraula, “matters for us to constrain planetary formation mechanisms and reliably identify biosignatures.”
The group also discovered that every model produced a “good fit” with the data: even when a perturbed model returned a chemical composition the researchers knew to be wrong, the light spectrum it generated from that composition still matched the original spectrum.
They “found that there are enough parameters to tweak, even with a wrong model, to still get a good fit,” explains de Wit, “meaning you wouldn’t know that your model is wrong and what it’s telling you is wrong.”
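This kind of degeneracy can be illustrated with a toy retrieval, not the study’s actual method: a model whose cross-section is mis-scaled can still fit a noisy synthetic spectrum down to the noise level, because the error is silently absorbed into the retrieved abundance. All quantities here (the toy transit-depth formula, the abundance of 0.05, the factor-of-5 perturbation) are made up for illustration:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
wavelength = np.linspace(1.0, 5.0, 200)  # microns, illustrative grid

def depth(wl, abundance, sigma_scale):
    """Toy transit depth: absorption set by abundance * cross-section."""
    return 1.0 - np.exp(-abundance * sigma_scale * (1.0 + 0.5 * np.sin(wl)))

# "Observed" synthetic spectrum from the true model, plus small noise.
observed = depth(wavelength, abundance=0.05, sigma_scale=1.0)
observed += rng.normal(0.0, 1e-4, wavelength.size)

# Perturbed model: its cross-section is wrongly scaled by a factor of 5,
# mimicking an error in the underlying opacity data.
def perturbed_model(wl, abundance):
    return depth(wl, abundance, sigma_scale=5.0)

(fit_abundance,), _ = curve_fit(perturbed_model, wavelength, observed, p0=[0.01])
residual = np.std(observed - perturbed_model(wavelength, fit_abundance))
# The residuals sit at the noise level, so the fit looks fine, yet the
# retrieved abundance is roughly 5x too low: the opacity error and the
# abundance are degenerate, and nothing in the fit flags the mistake.
```

Because only the product of abundance and cross-section enters the toy model, no goodness-of-fit check can separate the two, which is the essence of the “you wouldn’t know that your model is wrong” problem.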
He and his colleagues put forward several suggestions for improving current opacity models, including more theoretical calculations and laboratory measurements to refine the models’ assumptions about how light interacts with various molecules, as well as cross-disciplinary collaboration, particularly between astronomy and spectroscopy.
“There is so much that could be done if we knew perfectly how light and matter interact,” adds Niraula. “We know that well enough around the Earth’s conditions, but as soon as we move to different types of atmospheres, things change, and that’s a lot of data, with increasing quality, that we risk misinterpreting.”