Joint Measurement and Trajectory Recovery in Visible Light Communication
Published in: | IEEE Transactions on Control Systems Technology, Vol. 25, No. 1, pp. 247-261 |
Main Authors: | , , |
Format: | Journal Article |
Language: | English |
Published: | New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.01.2017 |
Summary: | The growing prevalence of light-emitting diodes (LEDs) for lighting has motivated research into their dual use for visible light communication (VLC) and navigation. VLC extracts a bit sequence from a series of photodetector scans. Among these data is an LED ID that ensures reliable data association in navigation and data communication. Recovering the LED data and ID requires accurate prediction of each LED's projected position on the photodetector array, so that the LED ON-OFF status can be extracted efficiently and reliably from each photodetector scan. Estimating the LED projected position is challenging because: 1) clutter and noise corrupt the measurements; 2) the LED status will be OFF in some scans; and 3) the predicted projection location sequence depends on the estimated rover state trajectory, which is uncertain and time varying. This paper presents a method that uses Bayesian multiple hypothesis tracking techniques to simultaneously determine the q most probable data and LED position sequences over a time window of data by maximizing their posterior probabilities. The focus is on VLC data and LED position sequence extraction, which includes rover state estimation. The multiple hypothesis tracking algorithm is illustrated with postprocessed experimental results. |
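As a rough illustration of the windowed hypothesis-selection idea the summary describes, the sketch below keeps the q most probable LED ON-OFF status sequences over a window of scans. This is a minimal beam-search sketch under assumed inputs, not the paper's algorithm: the function name, the per-scan `(lik_on, lik_off)` measurement likelihoods, and the `p_on` prior are all illustrative assumptions.

```python
import heapq
import math

def q_best_sequences(scan_likelihoods, q=3, p_on=0.5):
    """Keep the q most probable ON/OFF hypothesis sequences over a scan window.

    scan_likelihoods: list of (lik_on, lik_off) pairs -- assumed measurement
    likelihoods for each scan under the LED-ON and LED-OFF hypotheses.
    Returns a best-first list of (log_posterior, sequence) tuples.
    """
    # Each hypothesis is (log posterior so far, tuple of ON/OFF decisions).
    hypotheses = [(0.0, ())]
    for lik_on, lik_off in scan_likelihoods:
        expanded = []
        for logp, seq in hypotheses:
            # Branch on the LED status in this scan, weighting by the prior.
            expanded.append((logp + math.log(p_on * lik_on), seq + (1,)))
            expanded.append((logp + math.log((1 - p_on) * lik_off), seq + (0,)))
        # Prune to the q most probable hypotheses (beam of width q).
        hypotheses = heapq.nlargest(q, expanded)
    return sorted(hypotheses, reverse=True)

# Example: three scans whose likelihoods favor ON, OFF, ON respectively.
best = q_best_sequences([(0.9, 0.1), (0.2, 0.8), (0.9, 0.1)], q=2)
print(best[0][1])  # -> (1, 0, 1)
```

Pruning to q hypotheses per scan keeps the search linear in the window length instead of exponential, at the cost of possibly discarding a sequence that would later become the most probable one.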
ISSN: | 1063-6536 (print), 1558-0865 (electronic) |
DOI: | 10.1109/TCST.2016.2554062 |