
Versatile Multiple Object Tracking in Sparse 2D/3D Videos Via Deformable Image Registration

Overview
Specialty Biology
Date 2024 May 20
PMID 38768230
Abstract

Tracking body parts in behaving animals, extracting fluorescence signals from cells embedded in deforming tissue, and analyzing cell migration patterns during development all require tracking objects with partially correlated motion. As dataset sizes increase, manual tracking of objects becomes prohibitively inefficient and slow, necessitating automated and semi-automated computational tools. Unfortunately, existing methods for multiple object tracking (MOT) are either developed for specific datasets and hence do not generalize well to other datasets, or require large amounts of training data that are not readily available. This is further exacerbated when tracking fluorescent sources in moving and deforming tissues, where the lack of unique features and sparsely populated images create a challenging environment, especially for modern deep learning techniques. By leveraging technology recently developed for spatial transformer networks, we propose ZephIR, an image registration framework for semi-supervised MOT in 2D and 3D videos. ZephIR can generalize to a wide range of biological systems by incorporating adjustable parameters that encode spatial (sparsity, texture, rigidity) and temporal priors of a given data class. We demonstrate the accuracy and versatility of our approach in a variety of applications, including tracking the body parts of a behaving mouse and neurons in the brain of a freely moving C. elegans. We provide an open-source package along with a web-based graphical user interface that allows users to provide small numbers of annotations to interactively improve tracking results.
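The registration idea underlying this approach can be illustrated with a toy example: instead of detecting and linking objects independently in every frame, align each frame to its predecessor and carry the annotated keypoints along with the recovered motion. The sketch below is a heavily simplified, hypothetical stand-in (rigid integer translation in pure NumPy, with invented function names) and does not reflect ZephIR's deformable, prior-regularized model.

```python
import numpy as np

def register_shift(ref, mov, max_shift=5):
    """Estimate the integer (dy, dx) that best aligns mov to ref by
    brute-force search over a small window (rigid translation only;
    a toy stand-in for deformable registration)."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(mov, dy, axis=0), dx, axis=1)
            err = np.mean((ref - shifted) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

def track_keypoints(frames, keypoints):
    """Propagate annotated keypoints through a video by registering
    each frame to the previous one and applying the recovered shift."""
    tracks = [np.asarray(keypoints, dtype=float)]
    for prev, cur in zip(frames[:-1], frames[1:]):
        dy, dx = register_shift(prev, cur)
        # register_shift returns the shift that maps cur back onto prev,
        # so keypoints move by the opposite of (dy, dx)
        tracks.append(tracks[-1] - np.array([dy, dx]))
    return tracks
```

Because motion is estimated for the image as a whole rather than per object, partially correlated motion of many sparse objects is captured jointly, which is the property the abstract highlights for deforming tissue.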
