A VR-Based Hybrid BCI Using SSVEP and Gesture Input


Bibliographic Details
Published in: Advances in Computational Intelligence, pp. 418-429
Main Authors: Grichnik, Roland; Benda, Mihaly; Volosyak, Ivan
Format: Book Chapter
Language: English
Published: Cham: Springer International Publishing
Series: Lecture Notes in Computer Science

Summary: Brain-Computer Interfaces (BCIs) using Steady-State Visual Evoked Potentials (SSVEPs) have been proven to work with many different display technologies for visual stimulation. The recent advent of consumer-grade Virtual Reality (VR) Head-Mounted Devices (VR-HMDs) has made research in the area of VR-based BCIs more accessible than ever, yet the possibilities of such systems still have to be tested. In this paper, we present a BCI using a well-studied 3-step spelling interface converted into a Virtual Environment (VE). The Oculus Rift CV1 VR-HMD used in this study also provides motion-tracking capability, which was used to implement a novel hybrid BCI utilizing gesture input. The interface consisted of three flickering boxes on a virtual screen in the VE for typing letters. Head-shake gestures were used to intuitively trigger "Delete/Back" commands. A g.tec g.USBamp amplifier was used to record and filter the signal of eight electrodes mounted in an electroencephalography cap. The Minimum Energy Combination (MEC) method was used to classify commands in real time. Eighteen participants successfully performed seven spelling tasks each, reaching an accuracy of 91.11 ± 10.26% (mean ± Standard Deviation, SD) and an Information Transfer Rate of 23.56 ± 7.54 bit/minute (mean ± SD). Questionnaires filled out before and after the experiment show that most participants enjoyed the VR BCI experience and found the gesture input very natural. Future studies could expand the input mechanism by adding more head gestures, e.g. pecking, nodding, or circling, to control intuitively related software tasks.
ISBN: 3030205207, 9783030205201
ISSN: 0302-9743, 1611-3349
DOI: 10.1007/978-3-030-20521-8_35
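The Information Transfer Rate reported in the summary is conventionally computed with the Wolpaw formula, which combines the number of selectable targets, the classification accuracy, and the time per command selection. The sketch below illustrates that formula under stated assumptions: the paper does not give the per-command selection time in this record, so the 3-second value in the example is purely hypothetical, and `itr_bits_per_minute` is an illustrative helper name, not a function from the study.

```python
import math

def itr_bits_per_minute(n_targets: int, accuracy: float,
                        time_per_command_s: float) -> float:
    """Wolpaw Information Transfer Rate in bit/min.

    n_targets: number of selectable commands (three flickering boxes
               in the interface described above),
    accuracy: classification accuracy, in the range (1/n_targets, 1],
    time_per_command_s: average time per command selection
                        (hypothetical value in the example below).
    """
    p, n = accuracy, n_targets
    # Bits carried by one selection: log2(N) for a perfect classifier,
    # reduced by the binary-entropy-style penalty for errors.
    bits = math.log2(n)
    if p < 1.0:
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    # Scale from bits per selection to bits per minute.
    return bits * (60.0 / time_per_command_s)

# Hypothetical example: 3 targets, 91% accuracy, 3 s per command.
print(round(itr_bits_per_minute(3, 0.91, 3.0), 2))
```

With a perfect classifier (accuracy 1.0) the formula reduces to log2(N) bits per selection, so shorter selection times or more targets raise the ITR; the mean ITR of 23.56 bit/min reported above reflects the study's actual timing, which is not reproduced here.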