A shared neural code for gender across faces, bodies, and objects in the human brain

This paper is a preprint and has not been certified by peer review.
Authors

Liu, W.; Lu, X.; Cheng, Y.; Wang, R.; Lu, X.; Yuan, X.; Jiang, Y.

Abstract

Humans can readily infer gender from diverse visual categories, including faces, bodies, and objects, yet it remains unclear whether the brain constructs a shared, category-general representation of gender. To address this gap, we measured neural responses with fMRI while participants viewed female and male stimuli from these three categories. Multivoxel pattern analyses revealed that gender information was encoded throughout distributed occipitotemporal regions for all categories. Critically, cross-category decoding and regression-based representational similarity analyses converged on a region of the right middle temporal gyrus (rMTG) that encoded gender independently of stimulus category. This region also carried category information, indicating mixed selectivity. Comparisons between neural representational dissimilarity matrices and those from a fine-tuned convolutional neural network (CNN) showed that rMTG activity most closely resembled intermediate CNN layers, suggesting that shared gender representations rely on mid-level visual features. Finally, functional connectivity analyses revealed that gender-related interactions among occipitotemporal regions were similar for faces and bodies but not for objects. Together, these findings identify a category-general neural hub for gender perception and illuminate how mid-level visual features and distributed networks support the abstraction of gender across heterogeneous visual inputs.
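The cross-category decoding logic described above (train a classifier on one stimulus category, test it on another; above-chance accuracy implies a shared code) can be illustrated with a minimal sketch on simulated voxel patterns. This is not the authors' pipeline: the shared `gender_axis`, the category offsets, the noise level, and the nearest-centroid classifier are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_voxels = 50

# Assumption: a single voxel-space "gender axis" shared across categories,
# plus a category-specific pattern offset and trial noise.
gender_axis = rng.standard_normal(n_voxels)
face_offset = rng.standard_normal(n_voxels)
body_offset = rng.standard_normal(n_voxels)

def simulate_patterns(n_trials, gender_sign, category_offset):
    # Each trial pattern = shared gender signal + category offset + noise.
    return (gender_sign * gender_axis
            + category_offset
            + 0.5 * rng.standard_normal((n_trials, n_voxels)))

# Train on face patterns, test on body patterns (cross-category decoding).
train_f = simulate_patterns(20, +1, face_offset)  # female faces
train_m = simulate_patterns(20, -1, face_offset)  # male faces
test_f = simulate_patterns(20, +1, body_offset)   # female bodies
test_m = simulate_patterns(20, -1, body_offset)   # male bodies

# Nearest-centroid classifier fit on the training category only.
c_f, c_m = train_f.mean(axis=0), train_m.mean(axis=0)

def predict(x):
    return "F" if np.linalg.norm(x - c_f) < np.linalg.norm(x - c_m) else "M"

preds = [predict(x) for x in np.vstack([test_f, test_m])]
truth = ["F"] * 20 + ["M"] * 20
acc = np.mean([p == t for p, t in zip(preds, truth)])
print(f"cross-category decoding accuracy: {acc:.2f}")
```

Because the simulated gender signal is shared across categories, the face-trained classifier generalizes to bodies and accuracy stays well above the 50% chance level; removing the shared axis would drive it back to chance.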
