What exactly did the Transformer learn from our physics data?

Authors

Martin Erdmann, Niklas Langner, Josina Schulte, Dominik Wirtz

Abstract

Transformer networks excel in scientific applications. We explore two scenarios in ultra-high-energy cosmic ray simulations to examine what these network architectures learn. First, we investigate the positional encodings trained on azimuthally symmetric air showers. Second, we visualize the attention values assigned to cosmic particles originating from a galaxy catalog. In both cases, the Transformers learn plausible, physically meaningful features.
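
As an illustrative sketch (not the authors' code), the snippet below shows one generic way learned positional encodings and attention maps can be pulled out of a PyTorch-style self-attention layer for visualization. The model, layer sizes, and variable names (SimpleEncoder, n_tokens, embed_dim, etc.) are hypothetical placeholders, not details from the paper.

```python
# Hedged sketch: inspecting what a Transformer learned, using generic PyTorch
# components. All names and sizes here are illustrative, not from the paper.
import torch
import torch.nn as nn
import matplotlib.pyplot as plt


class SimpleEncoder(nn.Module):
    """Toy encoder: a learnable positional encoding plus one self-attention layer."""

    def __init__(self, n_tokens: int, embed_dim: int, n_heads: int = 4):
        super().__init__()
        # Learned positional encoding, one vector per input position
        self.pos_encoding = nn.Parameter(torch.randn(n_tokens, embed_dim) * 0.02)
        self.attn = nn.MultiheadAttention(embed_dim, n_heads, batch_first=True)

    def forward(self, x):
        x = x + self.pos_encoding  # add learned positions to token features
        out, attn_weights = self.attn(
            x, x, x, need_weights=True, average_attn_weights=True
        )
        return out, attn_weights


n_tokens, embed_dim = 32, 64                   # e.g. 32 particle/detector tokens
model = SimpleEncoder(n_tokens, embed_dim)
tokens = torch.randn(1, n_tokens, embed_dim)   # placeholder input features

with torch.no_grad():
    _, attn = model(tokens)

# Learned positional encodings: positions vs. embedding dimensions
plt.figure(figsize=(6, 3))
plt.imshow(model.pos_encoding.detach(), aspect="auto", cmap="viridis")
plt.xlabel("embedding dimension")
plt.ylabel("position")
plt.title("learned positional encoding")

# Head-averaged attention map: query position vs. key position
plt.figure(figsize=(4, 4))
plt.imshow(attn[0], aspect="auto", cmap="viridis")
plt.xlabel("key token")
plt.ylabel("query token")
plt.title("attention weights")
plt.show()
```

In a setting like the one the abstract describes, structure in such maps (for instance, positional encodings that look alike for azimuthally equivalent positions, or attention concentrated on particular catalog sources) is what would indicate physically meaningful learned features.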
