Extracting Gamma-Ray Information from Images with Convolutional Neural Network Methods on Simulated Cherenkov Telescope Array Data

Bibliographic Details
Published in: Artificial Neural Networks in Pattern Recognition, pp. 243-254
Main Authors: Mangano, Salvatore; Delgado, Carlos; Bernardos, María Isabel; Lallena, Miguel; Rodríguez Vázquez, Juan José
Format: Book Chapter
Language: English
Published: Cham: Springer International Publishing
Series: Lecture Notes in Computer Science

Summary: The Cherenkov Telescope Array (CTA) will be the world's leading ground-based gamma-ray observatory, allowing us to study very high energy phenomena in the Universe. CTA will produce huge data sets, of the order of petabytes, and the challenge is to find data analysis methods that improve on existing ones. Machine learning algorithms, such as deep learning techniques, give encouraging results in this direction. In particular, convolutional neural network methods on images have proven effective in pattern recognition and produce data representations that can achieve satisfactory predictions. We test the use of convolutional neural networks to discriminate signal from background images with high rejection factors and to reconstruct parameters of gamma-ray events. The networks are trained and evaluated on artificial data sets of images. The results show that neural networks trained with simulated data can be useful for extracting gamma-ray information. Such networks would help us make the best use of the large quantities of real data arriving in the coming decades.
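
As a concrete illustration of the approach described in the summary, the sketch below shows a minimal convolutional network for discriminating gamma-ray (signal) from background images, written in PyTorch. This record does not specify the chapter's actual architecture, so the layer sizes, the single-channel 56x56 input shape, and the random toy data here are illustrative assumptions only.

    import torch
    import torch.nn as nn

    class GammaHadronCNN(nn.Module):
        """Minimal CNN for signal/background image classification.

        The layer sizes and the 1x56x56 input shape are illustrative
        assumptions, not the architecture used in the chapter.
        """
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1),  # camera image -> 16 feature maps
                nn.ReLU(),
                nn.MaxPool2d(2),                             # 56x56 -> 28x28
                nn.Conv2d(16, 32, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.MaxPool2d(2),                             # 28x28 -> 14x14
            )
            self.classifier = nn.Sequential(
                nn.Flatten(),
                nn.Linear(32 * 14 * 14, 64),
                nn.ReLU(),
                nn.Linear(64, 1),                            # logit: signal vs background
            )

        def forward(self, x):
            return self.classifier(self.features(x))

    # Toy usage on random stand-ins for simulated camera images:
    # a batch of 8 single-channel 56x56 frames with binary labels.
    model = GammaHadronCNN()
    images = torch.randn(8, 1, 56, 56)
    labels = torch.randint(0, 2, (8, 1)).float()
    loss = nn.BCEWithLogitsLoss()(model(images), labels)
    loss.backward()  # an optimizer step would follow in a real training loop
    print(loss.item())

For the second task the summary mentions, reconstructing parameters of gamma-ray events, the binary classification head would presumably be replaced or complemented by regression outputs (e.g. for energy or arrival direction) trained with a suitable regression loss.
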
Bibliography: CTA website: https://www.cta-observatory.org/.
ISBN: 9783319999777, 331999977X
ISSN: 0302-9743, 1611-3349
DOI: 10.1007/978-3-319-99978-4_19