
EFMLNet: Fusion Model Based on End-to-End Mutual Information Learning for Hybrid EEG-fNIRS Brain-Computer Interface Applications

License: Creative Commons Attribution 4.0 (CC BY 4.0)
Abstract

Electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) are both portable and non-invasive, and combining them can enhance brain-computer interface (BCI) performance by exploiting their complementary temporal and spatial strengths. However, fusing these two signals remains challenging. To fully utilize the complementarity of EEG and fNIRS and improve the performance of hybrid EEG-fNIRS BCIs, we propose an EEG-fNIRS fusion network based on end-to-end mutual information learning, named EFMLNet. In the model, EEG and fNIRS data are fed into their respective feature extractors to capture temporal and spatial information, and their complementary information is then fused by two parallel mutual learning modules. We conducted classification experiments on a publicly available BCI dataset based on a motor imagery (MI) task and achieved a cross-subject classification accuracy of 71.52%. This result surpasses most existing fusion methods and demonstrates the potential for real-time hybrid BCI systems.
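To make the described two-branch architecture concrete, the following is a minimal PyTorch sketch of the pipeline outlined in the abstract: one feature extractor per modality followed by two parallel cross-branch fusion modules and a classifier. All layer choices, channel counts, and the `CrossGate` module below are illustrative assumptions; the abstract does not specify the paper's actual mutual-information learning objective or module design.

```python
# Hedged sketch of a two-branch EEG/fNIRS fusion model (assumed design, not the paper's exact architecture).
import torch
import torch.nn as nn


class BranchEncoder(nn.Module):
    """Temporal-spatial feature extractor for one modality (assumed design)."""
    def __init__(self, in_channels: int, feat_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, feat_dim, kernel_size=7, padding=3),  # temporal filtering
            nn.BatchNorm1d(feat_dim),
            nn.ELU(),
            nn.Conv1d(feat_dim, feat_dim, kernel_size=1),                # channel (spatial) mixing
            nn.ELU(),
            nn.AdaptiveAvgPool1d(1),                                     # collapse the time axis
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time) -> (batch, feat_dim)
        return self.net(x).squeeze(-1)


class CrossGate(nn.Module):
    """Placeholder for one 'mutual learning' module: features from one branch
    modulate the other's. The paper's mutual-information objective is not
    detailed in the abstract, so this is only a stand-in."""
    def __init__(self, feat_dim: int = 64):
        super().__init__()
        self.gate = nn.Sequential(nn.Linear(feat_dim, feat_dim), nn.Sigmoid())

    def forward(self, target: torch.Tensor, source: torch.Tensor) -> torch.Tensor:
        return target * self.gate(source)


class EFMLNetSketch(nn.Module):
    def __init__(self, eeg_channels: int = 30, fnirs_channels: int = 36,
                 feat_dim: int = 64, n_classes: int = 2):
        super().__init__()
        self.eeg_enc = BranchEncoder(eeg_channels, feat_dim)
        self.fnirs_enc = BranchEncoder(fnirs_channels, feat_dim)
        # Two parallel cross-branch modules, one per fusion direction.
        self.eeg_from_fnirs = CrossGate(feat_dim)
        self.fnirs_from_eeg = CrossGate(feat_dim)
        self.classifier = nn.Linear(2 * feat_dim, n_classes)

    def forward(self, eeg: torch.Tensor, fnirs: torch.Tensor) -> torch.Tensor:
        fe, ff = self.eeg_enc(eeg), self.fnirs_enc(fnirs)
        fused = torch.cat([self.eeg_from_fnirs(fe, ff),
                           self.fnirs_from_eeg(ff, fe)], dim=-1)
        return self.classifier(fused)


if __name__ == "__main__":
    model = EFMLNetSketch()
    eeg = torch.randn(4, 30, 500)      # 4 trials, 30 EEG channels, 500 time samples (illustrative sizes)
    fnirs = torch.randn(4, 36, 100)    # 4 trials, 36 fNIRS channels, 100 time samples
    print(model(eeg, fnirs).shape)     # -> torch.Size([4, 2])
```

The sketch only mirrors the data flow stated in the abstract (separate extractors, two parallel fusion modules, a single classification head); the channel counts, sampling lengths, and binary MI labels are placeholders chosen for the runnable example.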
