Self-Supervised Optical Flow with Spiking Neural Networks and Event Based Cameras


Bibliographic Details
Published in: 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 5892 - 5899
Main Authors: Chaney, Kenneth; Panagopoulou, Artemis; Lee, Chankyu; Roy, Kaushik; Daniilidis, Kostas
Format: Conference Proceeding
Language: English
Published: IEEE, 27.09.2021

Summary: Optical flow can be leveraged in robotic systems for obstacle detection, where low-latency solutions are critical in highly dynamic settings. While event-based cameras have changed the dominant sensing paradigm by encoding stimuli into spike trains, offering low bandwidth and low latency, events are still processed with traditional convolutional networks on GPUs, thus defeating the promise of efficient, low-capacity, low-power processing that inspired the design of event sensors. In this work, we introduce a shallow spiking neural network for the computation of optical flow, consisting of Leaky Integrate-and-Fire neurons. Optical flow is predicted as the synthesis of motion-orientation-selective channels. Learning is accomplished by Back-propagation Through Time. We present promising results on events recorded in real "in the wild" scenes, showing that the network can use only a small fraction of the energy consumed by CNNs deployed on GPUs.
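The Leaky Integrate-and-Fire (LIF) dynamics mentioned in the summary can be illustrated with a minimal discrete-time sketch. This is not the paper's implementation; the layer sizes, leak factor, threshold, and reset rule below are illustrative assumptions, chosen only to show how binary event inputs are leakily integrated into membrane potentials that emit spikes on threshold crossing.

```python
import numpy as np

def lif_step(v, spikes_in, w, leak=0.9, v_th=1.0):
    """One discrete-time step of a Leaky Integrate-and-Fire layer.

    v         : membrane potentials of the layer, shape (n_out,)
    spikes_in : binary input events for this time step, shape (n_in,)
    w         : input weight matrix, shape (n_out, n_in)
    leak, v_th: illustrative leak factor and firing threshold
    """
    v = leak * v + w @ spikes_in            # leaky integration of input current
    spikes_out = (v >= v_th).astype(float)  # fire where threshold is crossed
    v = v * (1.0 - spikes_out)              # hard reset of neurons that fired
    return v, spikes_out

# Tiny demo: two LIF neurons driven by three event channels over 10 steps.
rng = np.random.default_rng(0)
w = rng.uniform(0.2, 0.6, size=(2, 3))  # random positive weights (hypothetical)
v = np.zeros(2)
for _ in range(10):
    events = (rng.random(3) < 0.5).astype(float)  # random binary event frame
    v, s = lif_step(v, events, w)
```

In the paper's setting, such layers are trained with Back-propagation Through Time, which requires a differentiable surrogate for the hard threshold; the forward dynamics above are unchanged by that choice.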
ISSN:2153-0866
DOI:10.1109/IROS51168.2021.9635975