Unsupervised multi-animal tracking for quantitative ethology

This paper is a preprint and has not been certified by peer review.


Authors

Li, Y.; Li, X.; Zhang, Q.; Zhang, Y.; Fan, J.; Lu, Z.; Li, Z.; Wu, J.; Dai, Q.

Abstract

Quantitative ethology requires accurate tracking of animal locomotion, especially for population-level analyses involving multiple individuals. However, current methods rely on laborious annotations for supervised training and show limited performance in challenging conditions. Here we present an unsupervised deep-learning method for multi-animal tracking (UDMT) that achieves state-of-the-art performance without requiring human annotations. By synergizing a bidirectional closed-loop tracking strategy, a spatiotemporal transformer network, and three carefully designed modules for localization refinement, bidirectional ID correction, and automatic parameter tuning, UDMT tracks multiple animals accurately under various challenging conditions, such as crowding, occlusion, rapid motion, low contrast, and cross-species experiments. We demonstrate the versatility of UDMT on five kinds of model animals: mice, rats, Drosophila, C. elegans, and Betta splendens. Combined with a head-mounted miniaturized microscope, we illustrate the power of UDMT for neuroethological interrogation, deciphering the correlations between animal locomotion and neural activity. UDMT will facilitate advances in ethology by providing a high-performance, annotation-free, and accessible tool for multi-animal tracking.
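The bidirectional closed-loop strategy mentioned in the abstract pairs a forward tracking pass with a backward pass and flags identities where the two disagree. The sketch below is only a rough illustration of that forward-backward consistency idea, not the paper's implementation: the transformer network, localization refinement, and ID-correction modules are simplified away, detections are assumed to be given, and the function names (`nearest_neighbor_ids`, `track`, `bidirectional_ids`) are hypothetical.

```python
import numpy as np

def nearest_neighbor_ids(detections, prev_positions):
    """Greedily match each detection to the closest unclaimed previous position."""
    ids = -np.ones(len(detections), dtype=int)
    taken = set()
    for i, det in enumerate(detections):
        dists = np.linalg.norm(prev_positions - det, axis=1)
        for j in np.argsort(dists):
            if j not in taken:
                ids[i] = j
                taken.add(j)
                break
    return ids

def track(frames):
    """Track per-frame detection arrays; assumes a fixed number of animals."""
    positions = frames[0].copy()              # first frame defines the identities
    all_ids = [np.arange(len(positions))]
    for dets in frames[1:]:
        ids = nearest_neighbor_ids(dets, positions)
        positions[ids] = dets                 # update each identity's position
        all_ids.append(ids)
    return all_ids

def bidirectional_ids(frames):
    """Run forward and backward passes; flag frames where their IDs disagree."""
    fwd = track(frames)
    bwd = track(frames[::-1])[::-1]
    # Align backward IDs (anchored to the last frame) with forward IDs
    remap = {b: f for b, f in zip(bwd[-1], fwd[-1])}
    bwd_aligned = [np.array([remap[b] for b in ids]) for ids in bwd]
    flags = [not np.array_equal(f, b) for f, b in zip(fwd, bwd_aligned)]
    return fwd, bwd_aligned, flags
```

In a full system, flagged frames would trigger an ID-correction step rather than a simple report; here the disagreement mask is returned so the caller can inspect where the two passes diverge.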
