NASA scientists are using artificial intelligence to calibrate photographs of the Sun, improving the data available for solar research. NASA’s Solar Dynamics Observatory (SDO) has been providing high-definition images of the Sun since its launch on February 11, 2010, offering an in-depth view of a wide variety of solar phenomena. SDO’s Atmospheric Imaging Assembly (AIA) observes the Sun continuously, generating data at a scale that was never possible before. However, AIA’s sensitivity degrades over time as a result of this constant exposure, so its data must be recalibrated frequently. Now, scientists are using machine learning to perform some of that calibration, enhancing the data available to researchers conducting solar studies.
Challenges in Existing Approach
To keep the instrument virtually calibrated between sounding-rocket flights, the researchers first train a machine learning algorithm on AIA data to recognise and compare solar features, then feed it similar images to see whether it correctly identifies the calibration required. The sounding-rocket calibration approach has clear disadvantages: while AIA observes the Sun constantly, sounding rockets can only be launched a limited number of times. Between calibration flights, there is therefore a gap during which the instrument’s calibration drifts and becomes increasingly inaccurate.
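The drift between flights can be illustrated with a small sketch. The decay rate, daily sampling, and yearly launch cadence below are all illustrative assumptions, not values from the mission; the point is only that a step-wise calibration held fixed between flights accumulates error until the next launch resets it.

```python
import numpy as np

# Hypothetical illustration: the instrument's sensitivity decays smoothly,
# but sounding-rocket flights only measure it at discrete launch times.
days = np.arange(0, 730)                    # two years, sampled daily
true_sensitivity = np.exp(-days / 2000.0)   # assumed smooth decay curve

rocket_days = [0, 365]                      # assumed yearly calibration flights

# Step-wise calibration: hold the last measured sensitivity until the
# next flight updates it.
last_flight = np.searchsorted(rocket_days, days, side="right") - 1
assumed = true_sensitivity[np.array(rocket_days)][last_flight]

# Error is zero on launch days and grows until the next calibration.
error = np.abs(assumed - true_sensitivity)
```

A continuously updated (virtual) calibration would keep `error` near zero everywhere, which is what the machine learning approach aims to provide.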
How Researchers Overcome it
With these issues in mind, scientists investigated how machine learning could be used to calibrate the instrument continuously. Researchers first had to build a machine learning algorithm that could recognise solar structures in AIA data and compare them. They do this by feeding the algorithm images from sounding-rocket calibration flights together with the amount of calibration each one requires. After enough of these examples, they give the algorithm similarly degraded images to see whether it can determine the correct calibration needed. Once trained on enough data, the system learns how much calibration is required for each image. Because AIA observes the Sun in several different wavelengths of light, researchers can also compare particular structures across wavelengths and use those comparisons to reinforce the algorithm’s judgments.
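The construction of a training example can be sketched as follows. This is a minimal illustration, not the mission pipeline: the degradation is modelled as a single uniform dimming factor, and the factor range is an assumption made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_training_pair(image, rng):
    """Dim an image by a random factor and return (dimmed_image, factor).

    The dimming factor plays the role of the instrument degradation the
    model must learn to predict; the undimmed image stands in for a
    calibrated reference, e.g. from a sounding-rocket flight.
    """
    alpha = rng.uniform(0.1, 1.0)  # assumed degradation range
    return alpha * image, alpha

# Toy "solar image": a smooth brightness pattern.
image = np.abs(np.sin(np.linspace(0, np.pi, 64)))[None, :] * np.ones((64, 1))

dimmed, alpha = make_training_pair(image, rng)

# The supervised target is exactly the factor needed to re-calibrate:
recovered = dimmed.mean() / image.mean()
```

In training, the network sees only the dimmed image and must predict `alpha` from the appearance of solar structures alone, without access to the undimmed reference.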
Machine Learning Method
The objective is to develop a novel machine learning method that exploits spatial patterns on the solar surface across several wavelength observations to auto-calibrate for instrument deterioration. The researchers used the SDOML dataset to train two convolutional neural network (CNN) architectures, one handling single-channel input and one handling multichannel input. The dataset was augmented by randomly degrading the images at each training epoch, with the training and test sets covering non-overlapping months. They also built a non-ML baseline model against which to evaluate the CNN models' gains. Using the best-trained models, the researchers reconstructed the AIA multichannel degradation curves from 2010 to 2020 and compared them with the degradation curves derived from sounding-rocket data.
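To ground the comparison, a non-ML baseline can be as simple as tracking a summary statistic of image brightness over time. The sketch below is a simplified stand-in for the paper's baseline, not its exact form: it assumes the Sun's typical brightness is roughly constant, so any drop in an epoch's median intensity relative to a reference epoch is attributed to instrument degradation.

```python
import numpy as np

def baseline_degradation(images_by_epoch, reference):
    """Estimate a degradation factor per epoch as the ratio of that
    epoch's median pixel intensity to a reference epoch's median.

    Assumes the true mean solar brightness is roughly constant, so any
    systematic dimming is attributed to the instrument.
    """
    ref_median = np.median(reference)
    return [np.median(img) / ref_median for img in images_by_epoch]

# Synthetic example: three epochs dimmed by known factors.
reference = np.full((32, 32), 100.0)
true_factors = [1.0, 0.8, 0.5]
epochs = [f * reference for f in true_factors]

estimates = baseline_degradation(epochs, reference)
```

Such a baseline breaks down when real solar variability (flares, the solar cycle) changes the scene brightness, which is one motivation for a CNN that reasons about spatial structure across channels instead.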
Results
The results show that the CNN-based models outperform the non-ML baseline model by a wide margin when calibrating for instrument degradation. The multichannel CNN also beats the single-channel one, implying that cross-channel relationships between the distinct EUV channels are critical for recovering degradation profiles. The CNN-based models reproduce the degradation corrections derived from sounding-rocket cross-calibration measurements to within the experimental measurement uncertainty, demonstrating that they perform as well as existing approaches.
Conclusion
The machine learning approach lays the groundwork for a new CNN-based technique for calibrating extreme-ultraviolet instruments. The researchers foresee adapting it to other imaging or spectral sensors operating at different wavelengths, and scientists can also use the technique to compare particular structures across wavelengths and thereby strengthen their evaluations. Once the model can recognise what a solar flare looks like without degradation, it can determine how degraded AIA’s current images are and how much calibration each image requires. As machine learning progresses, its scientific applications will broaden to an increasing number of missions.
Dataset: SDOML dataset
For further information, refer to the article.