Audio-cued motor imagery-based brain–computer interface: Navigation through virtual and real environments

Publisher

Elsevier

Abstract

The aim of this work is to provide a navigation paradigm that could be used to control a wheelchair through a brain-computer interface (BCI). In such a case, it is desirable to control the system without a graphical interface so that it will be useful for people without gaze control. Thus, an audio-cued paradigm with several navigation commands is proposed. In order to reduce the probability of misclassification, the BCI operates with only two mental tasks: relaxed state versus imagination of right hand movements; the use of motor imagery for navigation control is not yet widespread among auditory BCIs. Two experiments are described: in the first, users practice the switch from a graphical to an audio-cued interface with a virtual wheelchair; in the second, they change from virtual to real environments. The obtained results support the use of the proposed interface to control a real wheelchair without the need for a screen to provide visual stimuli or feedback.
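
As an illustration of the kind of two-class decoder the abstract describes (relaxed state versus right-hand motor imagery), the sketch below builds a minimal CSP + LDA classification pipeline with MNE and scikit-learn. It is not the authors' implementation; the epoch array, labels, and sampling parameters are hypothetical placeholders.

    import numpy as np
    from mne.decoding import CSP
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import Pipeline

    # Hypothetical data: 100 EEG epochs of 16 channels x 500 samples
    # (2 s at 250 Hz). Label 0 = relaxed state, 1 = right-hand imagery.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 16, 500))
    y = rng.integers(0, 2, size=100)

    # Common Spatial Patterns extracts discriminative spatial filters;
    # LDA performs the binary (two mental task) classification.
    clf = Pipeline([
        ("csp", CSP(n_components=4, log=True)),
        ("lda", LinearDiscriminantAnalysis()),
    ])

    scores = cross_val_score(clf, X, y, cv=5)
    print(f"Mean cross-validated accuracy: {scores.mean():.2f}")

Restricting the decoder to two classes, as the abstract notes, keeps the per-command error rate low at the cost of slower command selection.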
