Spatial Mapping of Distributed Sensors Biomimicking the Human Vision System

Bibliographic Details
Published in: Electronics (Basel), Vol. 10, No. 12, p. 1443
Main Authors: Dutta, Sandip; Wilson, Martha
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 16.06.2021

Summary: Machine vision has been thoroughly studied in the past, but research thus far has lacked an engineering perspective on human vision. This paper addresses the observed and hypothetical neural behavior of the brain in relation to the visual system. In the human vision system, visual data are collected by photoreceptors in the eye, and these data are then transmitted to the rear of the brain for processing. There are millions of retinal photoreceptors of various types, and their signals must be unscrambled by the brain after they are carried through the optic nerves. This work is a step toward explaining how photoreceptor locations and proximities are resolved by the brain. It is illustrated here that, unlike in digital image sensors, there is no one-to-one sensor-to-processor identifier in the human vision system. Instead, the brain must go through an iterative learning process to identify the spatial locations of the photosensors in the retina. This involves a process called synaptic pruning, which can be simulated by a memristor-like component in a learning circuit model. The simulations and proposed mathematical models in this study provide a technique that can be extrapolated to create spatial distributions of networked sensors without a central observer or location knowledge base. Through the mapping technique, the retinal space with a known configuration generates signals as a scrambled data feed to the logical space in the brain. This scrambled response is then reverse-engineered to map the logical space's connectivity to the retinal space locations.
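The mapping idea summarized above (neighbor relations inferred from co-activation, weak links pruned away, and a spatial layout recovered without any sensor-to-processor identifiers) can be sketched numerically. The following Python snippet is a hypothetical illustration under assumed conditions, not the authors' model or code: it assumes Gaussian-tuned responses to localized stimuli, uses a keep-the-strongest-links rule as a crude stand-in for memristor-like synaptic pruning, and embeds the pruned connectivity with classical multidimensional scaling on graph distances. All parameter values and variable names are invented for the example.

```python
import numpy as np

# Hypothetical sketch (not the paper's model): infer the spatial layout of
# scrambled sensor channels purely from their co-activation statistics.
rng = np.random.default_rng(0)

# "Retinal space": a known 2-D sensor grid (ground truth, hidden from the "brain").
n_side = 8
xs, ys = np.meshgrid(np.arange(n_side), np.arange(n_side))
true_pos = np.column_stack([xs.ravel(), ys.ravel()]).astype(float)
n_sensors = len(true_pos)

# Scramble the channel order: the logical space receives signals with no
# one-to-one sensor-to-processor identifier.
perm = rng.permutation(n_sensors)

# Present many localized stimuli; nearby sensors respond similarly
# (assumed Gaussian tuning plus noise).
n_stimuli = 2000
responses = np.empty((n_stimuli, n_sensors))
for t in range(n_stimuli):
    center = rng.uniform(0, n_side - 1, size=2)
    d = np.linalg.norm(true_pos - center, axis=1)
    responses[t] = np.exp(-d**2 / 2.0) + 0.05 * rng.standard_normal(n_sensors)
scrambled = responses[:, perm]   # what the logical space actually sees

# Co-activation strengths between channels; keep only each channel's strongest
# links as a crude stand-in for memristor-like synaptic pruning.
corr = np.corrcoef(scrambled.T)
k_links = 4
adj = np.zeros((n_sensors, n_sensors), dtype=bool)
order = np.argsort(-corr, axis=1)
for i in range(n_sensors):
    adj[i, order[i, 1:k_links + 1]] = True   # skip self (position 0)
adj = adj | adj.T

# Graph distances on the pruned connectivity (Floyd-Warshall), then classical
# multidimensional scaling to recover a 2-D layout without location labels.
INF = 1e6
dist = np.where(adj, 1.0, INF)
np.fill_diagonal(dist, 0.0)
for k in range(n_sensors):
    dist = np.minimum(dist, dist[:, [k]] + dist[[k], :])

J = np.eye(n_sensors) - np.ones((n_sensors, n_sensors)) / n_sensors
B = -0.5 * J @ (dist**2) @ J
evals, evecs = np.linalg.eigh(B)
coords = evecs[:, -2:] * np.sqrt(np.maximum(evals[-2:], 0.0))

# Check: recovered pairwise distances should track the true (scrambled) ones.
d_true = np.linalg.norm(true_pos[perm][:, None] - true_pos[perm][None, :], axis=-1)
d_rec = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
print("distance agreement:", np.corrcoef(d_true.ravel(), d_rec.ravel())[0, 1])
```

The printed agreement score only checks that the recovered layout reproduces the true pairwise distances up to rotation, reflection, and scale, which is all such a mapping can determine without an external reference frame.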
ISSN: 2079-9292
DOI: 10.3390/electronics10121443