Topological attention asymmetry in ESM-2 attention implicitly encodes the allosteric hierarchy of the adenosine A2A receptor

This paper is a preprint and has not been certified by peer review.

Authors

Moya-Garcia, A. A.

Abstract

Protein language models (PLMs) learn complex structural and functional dependencies from evolutionary sequence variation alone. While these models lack a temporal axis, it remains an open question whether their static attention maps encode the directional hierarchies characteristic of allosteric communication. We investigate this in the human adenosine A2A receptor using the ESM-2 transformer. We show that attention heads tuned to functional sites exhibit elevated structural asymmetry compared to the background. Using random-site and sequence-shuffle controls, we establish that this asymmetry is not an architectural artefact of the softmax operation, but a learned, sequence-dependent signal. By defining a signed pathway score between the extracellular ligand-binding triad and the intracellular G-protein interface, we identify a robust topological polarity: the model consistently routes information from the dynamic extracellular site to the conserved intracellular interface. This directed bias emerges with network depth and is independent of intrinsic column-sink heads. Rather than simulating a forward-propagating physical wave, the network represents allosteric coupling by utilising the effector site as a structural anchor. We conclude that PLMs implicitly encode the causal hierarchy of allosteric transmission through a query-anchor topology, establishing directed attention as a robust indicator of long-range functional coordination.
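The two quantities the abstract relies on — a per-head attention asymmetry and a signed pathway score between two residue sets — can be sketched in a few lines. The definitions below are illustrative assumptions, not the paper's exact metrics: `head_asymmetry` measures the skew-symmetric fraction of one head's attention map, and `signed_pathway_score` compares the mean attention flowing in each direction between two sets of residue indices (rows index queries, columns index keys, as in standard softmax attention).

```python
import numpy as np

def head_asymmetry(A):
    """Directional asymmetry of one head's attention map A (L x L, nonnegative,
    rows sum to 1). Returns 0 for a symmetric map and approaches 1 as the map
    becomes maximally one-directional. (Illustrative definition.)"""
    return np.linalg.norm(A - A.T) / np.linalg.norm(A + A.T)

def signed_pathway_score(A, site_a, site_b):
    """Mean attention from queries in site_b to keys in site_a, minus the
    reverse direction. A positive value means residues in site_b anchor their
    queries on site_a -- i.e. information is routed site_a -> site_b.
    (Illustrative definition; site_a/site_b are sets of residue indices.)"""
    a_to_b = A[np.ix_(sorted(site_b), sorted(site_a))].mean()  # b queries, a keys
    b_to_a = A[np.ix_(sorted(site_a), sorted(site_b))].mean()  # a queries, b keys
    return a_to_b - b_to_a
```

In this convention, a positive score with `site_a` set to the extracellular ligand-binding triad and `site_b` to the intracellular G-protein interface corresponds to the polarity reported in the abstract: intracellular queries anchoring on extracellular keys. Note that for a nonnegative matrix the asymmetry ratio is guaranteed to lie in [0, 1], since each skew-symmetric entry is bounded by the matching symmetric entry.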
