i-Riter: Machine learning based novel eye tracking and calibration


Bibliographic Details
Published in: 2018 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), pp. 1-5
Main Authors: Ahmad, Muhammad Bilal; Saifullah, Raja; Mujtaba Ahmed; Asif, Muhammad Waqas; Khurshid, Khurram
Format: Conference Proceeding
Language: English
Published: IEEE, 01.05.2018

Summary: This study covers the design and development of iRiter, a system that assists paralyzed people in writing on a screen using only their eye movements. It precisely detects the movement of the eye pupil from reflections of near-infrared (IR) signals emitted by an external illuminator. The IR signals are synchronized with the camera's refresh rate, and the bright/dark reflections are tracked over time. For calibration, a deep multilayer perceptron (DMLP) uses four calibration points, five virtually generated points in the first hidden layer, and sixteen virtually generated points in the second hidden layer. The four calibration points (actual points) are the detected pupil positions. The regions formed by the four actual points, the five MLP-generated points, and the sixteen DMLP-generated points are the pupil areas, and a set of first-order polynomial transformations maps the pupil areas to screen areas. Through the position vector of a pupil area, the pupil coordinates are mapped to give the eye's position on the screen. In total there are sixteen pupil areas with sixteen corresponding geometric transformations and screen areas. Frame data from the camera is processed by a microcontroller, which forwards it to a computer. The openFrameworks toolkit was used to build the graphical user interface and display eye-movement tracks.
DOI:10.1109/I2MTC.2018.8409587
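The core of the calibration described in the summary is a first-order polynomial transformation from a pupil region to a screen region. As a rough illustration of that idea, the sketch below fits one such mapping (an affine transform, sx = a0 + a1*px + a2*py and likewise for sy) by least squares from four calibration correspondences. All coordinate values and function names here are illustrative assumptions, not data or code from the paper:

```python
import numpy as np

# Hypothetical calibration data: pupil-centre coordinates (pixels in the eye
# camera image) observed while the user fixates four known on-screen targets.
# Values are made up for illustration only.
pupil_pts = np.array([[120.0, 80.0],
                      [310.0, 85.0],
                      [125.0, 240.0],
                      [305.0, 235.0]])
screen_pts = np.array([[0.0, 0.0],          # top-left target
                       [1920.0, 0.0],       # top-right target
                       [0.0, 1080.0],       # bottom-left target
                       [1920.0, 1080.0]])   # bottom-right target

def fit_first_order(pupil, screen):
    """Least-squares fit of a first-order (affine) polynomial mapping:
    sx = a0 + a1*px + a2*py,  sy = b0 + b1*px + b2*py."""
    A = np.column_stack([np.ones(len(pupil)), pupil])    # design matrix [1, px, py]
    coeffs, *_ = np.linalg.lstsq(A, screen, rcond=None)  # 3x2 coefficient matrix
    return coeffs

def map_to_screen(coeffs, pupil_xy):
    """Apply the fitted transform to one pupil position."""
    return np.array([1.0, pupil_xy[0], pupil_xy[1]]) @ coeffs

coeffs = fit_first_order(pupil_pts, screen_pts)
gaze = map_to_screen(coeffs, [215.0, 160.0])  # estimated on-screen gaze point
```

In the paper's scheme there would be sixteen such transformations, one per pupil area, with the active transformation selected by which region the detected pupil position falls into; the sketch shows only a single region's fit.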