Drosophila-Vision-Inspired Motion Perception Model and Its Application in Saliency Detection
Published in: IEEE Transactions on Consumer Electronics, Vol. 70, No. 1, pp. 819–830
Main Authors: , , ,
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.02.2024
Summary: Vision in Drosophila has been the subject of extensive behavioral, physiological, and anatomical studies, yet our understanding of its underlying neural computations remains incomplete because computational models lag behind the biology. Drosophila vision has been shown to be considerably more sensitive to object motion, responding at roughly ten times the speed of human vision. Modeling Drosophila vision is therefore desirable for advancing computer vision in consumer electronics, where it may achieve an optimal tradeoff between accuracy and efficiency. This study proposes a Drosophila-vision-inspired motion perception (DVMP) model that integrates successive computational layers, from the superficial retina to the central complex, and can efficiently extract motion saliency in dynamic scenes. Ablation studies and the final evaluation of the DVMP model provide an intuitive paradigm for gaining better insight into the neural mechanisms of Drosophila vision. Moreover, extensive experimental comparisons with both data-independent and learning-based saliency detection methods demonstrate the model's accuracy and speed, implying that it can be readily applied in consumer electronics such as mobile phones and robots.
ISSN: 0098-3063, 1558-4127
DOI: 10.1109/TCE.2024.3355512
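
The summary above describes a pipeline of successive computational layers, from the superficial retina to the central complex, that turns frame-to-frame change into a motion-saliency map. As a rough, hypothetical illustration of how such a bio-inspired pipeline can be organized, the Python sketch below chains three generic fly-vision abstractions: temporal differencing (retina/lamina), ON/OFF half-wave rectification (medulla), and spatial pooling (lobula). The layer assignments, function name, and parameters here are assumptions for illustration only; the record does not specify the DVMP model's actual operators.

```python
# A minimal, generic motion-saliency sketch loosely following the layered
# processing the abstract describes. It is NOT the authors' DVMP model;
# the stage labels below are common fly-vision abstractions, assumed here.
import numpy as np
from scipy.ndimage import gaussian_filter


def motion_saliency(prev_frame: np.ndarray, curr_frame: np.ndarray,
                    sigma: float = 3.0) -> np.ndarray:
    """Return a [0, 1] saliency map from two consecutive grayscale frames."""
    prev = prev_frame.astype(np.float32)
    curr = curr_frame.astype(np.float32)

    # "Retina/lamina" stage: temporal high-pass via frame differencing,
    # which responds to luminance change rather than static structure.
    dt = curr - prev

    # "Medulla" stage: split the change signal into ON (brightening) and
    # OFF (darkening) half-wave-rectified channels.
    on = np.maximum(dt, 0.0)
    off = np.maximum(-dt, 0.0)

    # "Lobula/central" stage: pool both channels spatially so isolated
    # pixel noise is suppressed and coherent moving regions stand out.
    energy = gaussian_filter(on + off, sigma=sigma)

    # Normalize to [0, 1] for use as a saliency map.
    peak = float(energy.max())
    return energy / peak if peak > 0 else energy


# Usage: feed consecutive video frames (H x W, grayscale).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    f0 = rng.random((120, 160)).astype(np.float32)
    f1 = f0.copy()
    f1[40:60, 70:90] += 0.5  # a small patch brightens, i.e., "moves"
    saliency = motion_saliency(f0, f1)
    print(saliency.shape, float(saliency.max()))
```

Because this sketch needs no training data, it sits on the data-independent side of the comparison the abstract draws with learning-based saliency methods, which is also what makes such pipelines attractive for low-power consumer devices.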