System for assisted mobility using eye movements based on electrooculography

Bibliographic Details
Published in: IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol. 10, No. 4, pp. 209-218
Main Authors: Barea, R., Boquete, L., Mazo, M., Lopez, E.
Format: Journal Article
Language: English
Published: United States, The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.12.2002

More Information
Summary: Describes an eye-control method based on electrooculography (EOG) used to develop a system for assisted mobility. One of its most important features is its modularity, making it adaptable to the particular needs of each user according to the type and degree of handicap involved. An eye model based on the electrooculographic signal is proposed and its validity is studied. Several human-machine interfaces (HMI) based on EOG are discussed, with the study focusing on guiding and controlling a wheelchair for disabled people, where control is effected by eye movements within the socket. Different techniques and guidance strategies are then presented, with comments on the advantages and disadvantages of each. The system consists of a standard electric wheelchair with an on-board computer, sensors, and a graphical user interface run by the computer. The same eye-control method can also be applied to graphical interfaces, where the eye acts as a computer mouse. The results obtained show that this control technique could be useful in multiple applications, such as mobility and communication aids for handicapped persons.
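For illustration only, a minimal sketch (not the authors' implementation) of the general idea the summary describes: a differential EOG voltage is mapped to an approximate horizontal gaze angle, which is then thresholded into a coarse wheelchair guidance command. The linear gain, dead zone, and command names below are assumptions introduced here; in practice the gain would come from a per-user calibration.

```python
# Hypothetical EOG-to-command mapping; values are illustrative, not from the paper.
UV_PER_DEGREE = 15.0  # assumed calibration gain (microvolts per degree of gaze)


def gaze_angle(eog_uv: float, baseline_uv: float = 0.0,
               gain_uv_per_deg: float = UV_PER_DEGREE) -> float:
    """Estimate horizontal gaze angle in degrees from a differential EOG sample."""
    return (eog_uv - baseline_uv) / gain_uv_per_deg


def wheelchair_command(angle_deg: float, dead_zone_deg: float = 10.0) -> str:
    """Map an estimated gaze angle to a coarse guidance command."""
    if angle_deg > dead_zone_deg:
        return "TURN_RIGHT"
    if angle_deg < -dead_zone_deg:
        return "TURN_LEFT"
    return "FORWARD"


if __name__ == "__main__":
    # Simulated, already-filtered EOG samples in microvolts
    for sample_uv in (5.0, 220.0, -310.0):
        angle = gaze_angle(sample_uv)
        print(f"{sample_uv:7.1f} uV -> {angle:6.1f} deg -> {wheelchair_command(angle)}")
```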
ISSN: 1534-4320, 1558-0210
DOI: 10.1109/TNSRE.2002.806829