When an ultraviolet experiment on NASA’s Solar Dynamics Observatory (SDO) malfunctioned, solar scientists bemoaned the loss. Without the instrument, scientists were unable to directly observe the Sun in this key wavelength.
But now, a new virtual instrument is saving the day. Using the same processes that allow your phone to translate speech to text, a team of researchers has created a super solar instrument that restores the lost ultraviolet observations. The new instrument opens the door for past and future spacecraft to expand their capabilities.
Launched in 2010, SDO studies the impact of the Sun on both Earth and the near-Earth environment. But in 2014, the Multiple EUV Grating Spectrograph A (MEGS-A) component of the EUV Variability Experiment (EVE) suffered a mechanical failure that rendered it inoperable. Although the remainder of the spacecraft continued to collect data, the wavelengths that covered nearly 60% of the Sun’s radiant energy (irradiance) could no longer be directly observed.
But new technology has rendered the problem obsolete. Using deep learning techniques, researchers have created a virtual instrument built from the first 4 years of overlapping data collected by MEGS-A and a second instrument aboard SDO, the Atmospheric Imaging Assembly (AIA).

“Deep learning is revolutionizing the way we identify patterns and connections,” said Andrés Muñoz-Jaramillo, a researcher at the Southwest Research Institute in Boulder, Colo., and part of the team that helped to develop the new instrument. “You can virtualize a physical instrument if you have the right data.”
“Alexa, Measure the Sun”
Part of the field of supervised machine learning, deep learning harnesses the power of a computer to make important connections between existing data sets that humans might not necessarily recognize on their own. In the past decade, the technique has spread, enabling digital assistants like Amazon’s Alexa to recognize voice commands and self-driving cars to avoid obstacles. Now researchers are applying it to science.
By comparing the first 4 years of data collected in tandem by the two physical instruments, a team of researchers led by Alexandre Szenicer of the University of Oxford essentially taught the computers how MEGS-A and AIA observations worked together. The virtual instrument then could reconstruct the missing extreme ultraviolet observations based on images collected by AIA after 2014.
“If you have a good set of observations—in this case, several years of observations made by the module that broke—then you can use deep learning to virtualize it,” Muñoz-Jaramillo said.
AIA collected images of the Sun at different wavelengths, including the ultraviolet, while MEGS-A measured the Sun's energy output in the extreme ultraviolet. The two sets of observations are connected: the physical processes in the Sun's atmosphere are intimately related to the amount of energy it emits at different wavelengths of light.
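The supervised-learning setup described above can be sketched in miniature. The real team trained a deep convolutional network on AIA images; the toy code below, a hypothetical illustration with synthetic data and a plain linear least-squares fit standing in for the network, shows only the core idea: learn a mapping from image data to spectra during the years both instruments overlapped, then apply it to new images alone. All sizes, variable names, and the linear relation are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

n_train = 200   # "overlap" samples: times when both instruments worked
n_pixels = 64   # flattened AIA image features (toy size)
n_bins = 16     # MEGS-A spectral irradiance bins (toy size)

# Synthetic stand-ins: AIA images X and a hidden relation to spectra Y.
true_map = rng.normal(size=(n_pixels, n_bins))
X_train = rng.normal(size=(n_train, n_pixels))
Y_train = X_train @ true_map + 0.01 * rng.normal(size=(n_train, n_bins))

# "Training": fit the image-to-spectrum mapping on the overlap period.
learned_map, *_ = np.linalg.lstsq(X_train, Y_train, rcond=None)

# "Virtual instrument": after the failure, only AIA images remain;
# the learned mapping reconstructs a MEGS-A-like spectrum from them.
X_new = rng.normal(size=(1, n_pixels))
Y_pred = X_new @ learned_map
print(Y_pred.shape)  # one reconstructed spectrum with n_bins values
```

The key constraint the article notes, that a virtual instrument needs years of joint observations, corresponds here to the training set: without `X_train` and `Y_train` measured together, there is nothing to fit.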
But don’t look for virtual instruments to replace physical spacecraft. They can be built only from existing data.
“It is only possible because we had 4 years of the instrument working well before it malfunctioned,” Muñoz-Jaramillo said.
Already Contributing to Research
The new instrument has already begun contributing to scientific research. One team, whose members took part in the Frontier Development Lab program that produced the super solar instrument, has used it to help forecast solar disturbances in Earth's ionosphere and thermosphere, the top layers of the planet's atmosphere. In that ongoing work, measurements from MEGS-A and its virtual clone probe the extreme ultraviolet spectrum, which has historically been difficult to measure from Earth, helping to improve predictions of space weather effects.
“We are the first team to even attempt [using a virtual instrument],” said Asti Bhatt, a solar scientist at SRI International in California who was not involved in developing the instrument.
Bhatt’s experience with the instrument has been a positive one; she said she would “absolutely” be willing to use another well-developed virtual instrument in the future, particularly in the challenging regions where the Sun’s activity is connected to Earth’s upper atmosphere.
“It has generally been a challenge to make sure we have enough data to understand the physical processes,” Bhatt said. “If we make judicious use of virtual instruments, we can fill that data gap.”
The new results were published in the journal Science Advances.
—Nola Taylor Redd (@NolaTRedd), Freelance Science Journalist
Citation:
Redd, N. T. (2019), Virtual super instrument enhances solar spacecraft, Eos, 100, https://doi.org/10.1029/2019EO136270. Published on 01 November 2019.
Text © 2019. The authors. CC BY-NC-ND 3.0
Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.