A Saccade Based Framework for Real-Time Motion Segmentation Using Event Based Vision Sensors
Overview
Motion segmentation is a critical pre-processing step for autonomous robotic systems to facilitate tracking of moving objects in cluttered environments. Event based sensors are low power analog devices that represent a scene through asynchronous updates of only its dynamic details at high temporal resolution, and hence require significantly less computation. However, motion segmentation using spatio-temporal data is a challenging task due to the asynchrony of the data. Prior approaches to object tracking with neuromorphic sensors perform well only when the sensor is static or when a known model of the object to be tracked is available. To address these limitations, in this paper we develop a technique for generalized motion segmentation based on spatial statistics across time frames. First, inspired by human saccadic eye movements, we induce micromotion on the platform to facilitate the separation of the static and dynamic elements of a scene. Second, we introduce a methodology for partitioning spatio-temporal event groups, which facilitates the computation of scene statistics and the characterization of the objects within it. Experimental results show that our algorithm is able to classify dynamic objects from a moving camera with a maximum accuracy of 92%.
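The abstract does not specify the partitioning or statistics in detail, so the following is only a minimal Python sketch of the general idea: asynchronous events are binned into spatio-temporal groups, and groups whose event counts deviate from the per-frame scene statistics are labeled dynamic. The event layout, cell size, window length, and threshold rule are all assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def segment_dynamic_events(events, sensor_shape=(180, 240),
                           cell=8, window=10e-3, k=2.0):
    """Label events as dynamic (True) or static (False).

    events: structured array with fields x, y (pixels) and t (seconds).
    Events are grouped into cell x cell spatial bins per window-second
    frame; bins whose event count exceeds the per-frame mean by more
    than k standard deviations are marked dynamic. Under platform
    micromotion (saccades), static structure yields a roughly uniform
    event count, so independently moving objects stand out.
    (Hypothetical parameters; not taken from the paper.)
    """
    h, w = sensor_shape
    gy, gx = h // cell, w // cell
    t0 = events["t"].min()
    frame_idx = ((events["t"] - t0) // window).astype(int)
    labels = np.zeros(len(events), dtype=bool)

    for f in np.unique(frame_idx):
        sel = frame_idx == f
        cx = np.clip(events["x"][sel] // cell, 0, gx - 1).astype(int)
        cy = np.clip(events["y"][sel] // cell, 0, gy - 1).astype(int)
        counts = np.zeros((gy, gx))
        np.add.at(counts, (cy, cx), 1)       # events per spatial cell
        active = counts[counts > 0]
        thresh = active.mean() + k * active.std()
        labels[sel] = counts[cy, cx] > thresh
    return labels
```

Here a simple per-frame z-score threshold stands in for the scene statistics described in the abstract; the paper's actual partitioning and classification rules may differ.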