Brain-computer interfaces (BCIs) offer a promising avenue for restoring movement and communication to individuals with paralysis. While invasive BCIs achieve high-performance control, they require neurosurgery, posing significant risks. Non-invasive BCIs eliminate surgical risk but suffer from low neural signal fidelity, often resulting in poor usability.
To address this challenge, this thesis develops shared autonomy for non-invasive BCIs augmented with artificial intelligence (AI), enhancing control performance while preserving user autonomy.
To improve signal decoding in non-invasive BCIs, I introduce a novel adaptive decoding framework that combines deep learning with shared autonomy. This approach enables more accurate interpretation of neural signals, improving control in both 2D and 3D tasks.
AI copilots that infer user intentions and provide selective assistance, a concept rooted in shared autonomy (SA), are integrated into the decoding pipeline for EEG-based BCIs. Unlike traditional SA frameworks, in which the degree of AI intervention requires hyper-parameter tuning and is often fixed, potentially overriding user intent, this thesis introduces interventional assistance (IA) as an improved framework for shared control.
This approach dynamically determines when AI intervention is beneficial by comparing the expected action value of the human user against that of the AI copilot in a goal-agnostic fashion.
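The intervention rule described above can be sketched in a few lines. This is a minimal illustration, not the thesis's implementation: it assumes discrete actions and a precomputed goal-agnostic value estimate for each action (for example, an expectation over possible goals); the function and parameter names are hypothetical.

```python
import numpy as np

def interventional_assistance(q_values, human_action, copilot_action):
    """Hypothetical sketch of the IA rule: the copilot's action is
    executed only when its expected (goal-agnostic) action value
    exceeds that of the human's proposed action; otherwise the
    system defers to the user, preserving autonomy."""
    if q_values[copilot_action] > q_values[human_action]:
        return copilot_action  # intervention expected to help
    return human_action        # defer to the user

# Toy example: goal-agnostic value estimates over four discrete actions
q = np.array([0.1, 0.7, 0.3, 0.2])
interventional_assistance(q, human_action=2, copilot_action=1)
```

Because the comparison is made per time step, assistance is applied only when it is expected to improve the outcome, rather than blending human and AI actions at a fixed, pre-tuned rate.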
Building on these foundations, I present a diffusion-based AI copilot, termed intervention diffusion assistance (IDA), which improves task performance and user autonomy simultaneously. In both simulations and human-in-the-loop experiments, IDA outperforms conventional SA methods.
Notably, human participants report increased autonomy and a preference for IDA, highlighting its ability to balance assistance and control.
By merging non-invasive BCI technology with intelligent, adaptive shared autonomy mechanisms, this thesis advances the feasibility of AI-assisted BCIs for clinical applications. These developments pave the way for non-invasive BCIs that offer both high-performance control and an intuitive user experience, bringing the field closer to practical, real-world deployment.