Performance of a Steady-State Visual Evoked Potential and Eye Gaze Hybrid Brain-Computer Interface on Participants with and without a Brain Injury

Chris Brennan, Paul McCullagh, Gaye Lightbody, Leo Galway, Sally McClean, Piotr Stawicki, Felix Gembler, Ivan Volosyak, Elaine Armstrong, Eileen Thompson

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)


The brain-computer interface (BCI) and eye-gaze tracking provide modalities for human-machine communication and control. In this article, we evaluate a collaborative BCI and eye-gaze approach, known as a hybrid BCI (hBCI). The combined inputs interact with a virtual environment, providing actuation through a four-way menu system. Two approaches are evaluated: first, a steady-state visual evoked potential (SSVEP) BCI with on-screen stimulation; second, a hybrid BCI that combines eye gaze and SSVEP for navigation and selection. The study comprised participants without known brain injury (non-BI, N = 30) and participants with known brain injury (BI, N = 14). A total of 29 of the 30 non-BI participants successfully controlled the hybrid BCI, while nine of the 14 BI participants achieved control, as evidenced by task completion. The hybrid BCI provided a mean accuracy of 99.84% in the non-BI cohort and 99.14% in the BI cohort. Information transfer rates were 24.41 bits per minute (bpm) in non-BI participants and 15.87 bpm in BI participants. The research goal was to quantify usage of the SSVEP and eye-tracking (ET) approaches in the non-BI and BI cohorts. The hybrid was the preferred interaction modality for most participants in both cohorts. It is encouraging that nine of the 14 participants with known BI could use the hBCI technology with accuracy and efficiency equivalent to that of non-BI participants, albeit at slower transfer rates.
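For context on the reported figures, BCI information transfer rate is commonly computed with the Wolpaw formula from the number of targets, the selection accuracy, and the selection rate. The sketch below illustrates that calculation for a four-way menu; it is an assumption for illustration only, as the abstract does not state the exact ITR computation the article uses.

```python
import math

def bits_per_selection(n_targets: int, accuracy: float) -> float:
    """Wolpaw bits per selection for an N-target BCI at accuracy P."""
    n, p = n_targets, accuracy
    bits = math.log2(n)
    if p < 1.0:  # the P*log2(P) terms vanish at P = 1
        bits += p * math.log2(p) + (1.0 - p) * math.log2((1.0 - p) / (n - 1))
    return bits

def itr_bpm(n_targets: int, accuracy: float, selections_per_min: float) -> float:
    """Information transfer rate in bits per minute (bpm)."""
    return bits_per_selection(n_targets, accuracy) * selections_per_min

# Four-way menu at 99.84% accuracy (the non-BI figure reported above):
print(round(bits_per_selection(4, 0.9984), 3))  # → 1.98 bits per selection
```

At near-perfect accuracy, each selection carries close to the maximum log2(4) = 2 bits, so the bpm figures are driven mostly by how quickly selections are made.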

Original language: English
Pages (from-to): 277-286
Number of pages: 10
Journal: IEEE Transactions on Human-Machine Systems
Issue number: 4
Publication status: Published - 24 Apr 2020
Externally published: Yes

Bibliographical note

Funding Information:
The hybrid combines input modalities of SSVEP and eye gaze, using the signal processing approach reported in [17], for the BCI actuated component. In previous research (EU-funded BRAIN project, Grant Agreement Number 224156) users

Funding Information:
Manuscript received April 2, 2019; revised August 2, 2019, October 22, 2019, and January 31, 2020; accepted February 23, 2020. Date of publication April 24, 2020; date of current version July 14, 2020. This work was supported in part by the Department for Employment and Learning Northern Ireland Ph.D. Studentship and in part by Ulster University. Ethical approval was granted by the Ulster University Research Ethics Committee, REC/16/0053. This article was recommended by Associate Editor R. Chavarriaga. (Corresponding author: Paul McCullagh.) Chris Brennan, Paul McCullagh, Gaye Lightbody, Leo Galway, and Sally McClean are with the Computer Science Research Institute, Ulster University, Jordanstown BT37 0QB, U.K.

Publisher Copyright:
© 2013 IEEE.

Copyright 2020 Elsevier B.V., All rights reserved.


Keywords

  • brain injury (BI)
  • brain-computer interface (BCI)
  • data fusion
  • eye tracking
  • virtual environment

ASJC Scopus subject areas

  • Human Factors and Ergonomics
  • Control and Systems Engineering
  • Signal Processing
  • Human-Computer Interaction
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence


