Analyzing AIA flare observations using convolutional neural networks

Research output: Contribution to journal › Article › peer-review



In order to efficiently analyse the vast amount of data generated by solar space missions and ground-based instruments, modern machine learning techniques such as decision trees, support vector machines (SVMs) and neural networks can be very useful. In this paper we present initial results from using a convolutional neural network (CNN) to analyse observations from the Atmospheric Imaging Assembly (AIA) in the 1600 Å wavelength. The data are pre-processed to locate flaring regions where flare ribbons are visible in the observations. The CNN is created and trained to automatically analyse the shape and position of the flare ribbons by identifying which of four classes each image belongs to: two-ribbon flare, compact/circular ribbon flare, limb flare, or quiet Sun, with the final class acting as a control for any data included in the training or test sets where flaring regions are not present. The network can classify flare ribbon observations into any of the four classes with a final accuracy of 94%. Initial results show that most images are correctly classified; the compact flare class is the only one where accuracy drops below 90%, with some of its observations misclassified as limb flares.
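To make the four-class classification step concrete, the toy forward pass below runs a single image patch through a convolution layer, ReLU, global average pooling, and a softmax over the four ribbon classes. This is a minimal illustrative sketch in plain NumPy with randomly initialised weights, not the architecture or trained network described in the paper; the patch size, kernel count, and class labels are assumptions for the example.

```python
import numpy as np

# The four classes described in the paper.
CLASSES = ["two-ribbon", "compact/circular", "limb", "quiet Sun"]

def conv2d(img, kernel):
    """'Valid' 2-D cross-correlation of a single-channel image with one kernel."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def softmax(x):
    e = np.exp(x - x.max())  # subtract max for numerical stability
    return e / e.sum()

def forward(img, kernels, weights, bias):
    """One conv layer -> ReLU -> global average pooling -> dense softmax."""
    feats = np.array([np.maximum(conv2d(img, k), 0).mean() for k in kernels])
    return softmax(weights @ feats + bias)

rng = np.random.default_rng(0)
img = rng.random((32, 32))               # stand-in for a cropped AIA 1600 Å patch
kernels = rng.standard_normal((8, 3, 3)) # 8 random 3x3 filters (untrained)
weights = rng.standard_normal((4, 8))    # dense layer mapping 8 features -> 4 classes
bias = np.zeros(4)

probs = forward(img, kernels, weights, bias)
print("predicted class:", CLASSES[int(np.argmax(probs))])
```

With random weights the prediction is of course meaningless; in practice the kernels and dense weights would be learned from labelled ribbon observations, and a real implementation would stack several convolution/pooling layers in a deep learning framework.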
Original language: English
Article number: 34
Number of pages: 8
Journal: Frontiers in Astronomy and Space Sciences
Publication status: Published - 26 Jun 2020


  • Convolutional neural network
  • Solar flares
  • Flare ribbons
  • Machine learning
  • Classification
  • Helio19


