
An Assistive Computer Vision Tool to Automatically Detect Changes in Fish Behavior in Response to Ambient Odor

Overview
Journal: Sci Rep
Specialty: Science
Date: 2021 Jan 14
PMID: 33441714
Abstract

The analysis of fish behavior in response to odor stimulation is a crucial component of the general study of cross-modal sensory integration in vertebrates. In zebrafish, the centrifugal pathway runs between the olfactory bulb and the neural retina, originating at the terminalis neuron in the olfactory bulb. A change in the ambient odor of a fish's environment elicits a change in visual sensitivity and can trigger mating-like behavior in males due to increased GnRH signaling in the terminalis neuron. Behavioral experiments to study this phenomenon are commonly conducted in a controlled environment where a video of the fish is recorded over time, before and after the application of chemicals to the water. Given the subtlety of the behavioral changes, trained biologists are currently required to annotate such videos as part of a study. Manually analyzing the videos is time-consuming, requires multiple experts to avoid human error and bias, and cannot be easily crowdsourced on the Internet. Machine learning algorithms from computer vision, on the other hand, have proven effective for video annotation tasks because they are fast, accurate, and, if designed properly, can be less biased than humans. In this work, we propose to automate the entire process of analyzing videos of behavioral changes in zebrafish using tools from computer vision, relying on minimal expert supervision. The overall objective is to create a generalized tool that predicts animal behaviors from videos using state-of-the-art deep learning models, with the dual goal of advancing understanding in biology and engineering a more robust and powerful artificial information-processing system for biologists.
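The abstract does not specify the model architecture, so the sketch below is only an illustration of how a per-clip behavior classifier of this general kind might be assembled, assuming a PyTorch/torchvision environment: a pretrained ResNet-18 backbone extracts per-frame features, which are average-pooled over time and mapped to behavior classes. The backbone choice, clip length, and the "pre-odor" versus "post-odor" labels are assumptions made for the example, not details taken from the paper.

# Minimal sketch (not the authors' method): frame-level CNN features with
# temporal average pooling, classifying a short clip as pre- vs. post-odor
# behavior. Backbone, clip length, and class labels are illustrative.
import torch
import torch.nn as nn
from torchvision import models

class ClipBehaviorClassifier(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        backbone.fc = nn.Identity()          # expose 512-d per-frame features
        self.backbone = backbone
        self.head = nn.Linear(512, num_classes)

    def forward(self, clip):                 # clip: (batch, frames, 3, H, W)
        b, t, c, h, w = clip.shape
        feats = self.backbone(clip.view(b * t, c, h, w))   # per-frame features
        feats = feats.view(b, t, -1).mean(dim=1)           # temporal average pool
        return self.head(feats)              # logits over behavior classes

model = ClipBehaviorClassifier()
dummy_clip = torch.randn(1, 16, 3, 224, 224)   # one 16-frame clip
logits = model(dummy_clip)
print(logits.shape)                            # torch.Size([1, 2])

In a pipeline of the kind the authors describe, such a classifier would be trained on clips annotated by expert biologists and then used to score new recordings automatically, reducing the manual annotation burden.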
