Unsupervised multi-animal tracking for quantitative ethology
Li, Y.; Li, X.; Zhang, Q.; Zhang, Y.; Fan, J.; Lu, Z.; Li, Z.; Wu, J.; Dai, Q.
Abstract
Quantitative ethology requires accurate tracking of animal locomotion, especially for population-level analyses involving multiple individuals. However, current methods rely on laborious annotations for supervised training and perform poorly under challenging conditions. Here we present an unsupervised deep-learning method for multi-animal tracking (UDMT) that achieves state-of-the-art performance without requiring human annotations. By combining a bidirectional closed-loop tracking strategy, a spatiotemporal transformer network, and three purpose-built modules for localization refinement, bidirectional ID correction, and automatic parameter tuning, UDMT tracks multiple animals accurately under challenging conditions such as crowding, occlusion, rapid motion, low contrast, and cross-species experiments. We demonstrate the versatility of UDMT on five model animals: mice, rats, Drosophila, C. elegans, and Betta splendens. Combined with a head-mounted miniaturized microscope, we illustrate the power of UDMT for neuroethological interrogation, deciphering the correlations between animal locomotion and neural activity. UDMT will facilitate advances in ethology by providing a high-performance, annotation-free, and accessible tool for multi-animal tracking.