Towards a Brain-Derived Neurofeedback Framework for Unsupervised Personalisation of Brain-Computer Interfaces
Modern Brain-Computer Interfaces (BCIs) use EEG signals recorded from the scalp to transduce a user's intent into action. However, achieving optimal control requires a physically and mentally demanding series of long training sessions based on conventional neurofeedback. In this study we propose a framework that bypasses the training phase (unsupervised personalisation), in which the BCI automatically detects whether or not it is acting according to the user's intention. We used mismatch negativity (MMN), a brain response elicited whenever someone is exposed to an unexpected event. Rather than the classical auditory mismatch negativity, however, we identified another brain signature, elicited when an action breaks the regularity (and hence the expectation) established by a preceding action: the Action Mismatch. We investigated the presence of this Action Mismatch Signature (AMS) in an oddball paradigm in which sounds were replaced by video sequences of a hand catching or missing a ball. In an experiment with 8 participants, our classifier achieved 67% average detection accuracy both across and within subjects. The AMS may provide a powerful tool to automatically monitor and adapt brain-robot interface and neuroprosthetic performance.
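As an illustrative sketch only (not the authors' pipeline), the oddball-style mismatch detection described above can be mimicked on synthetic data: epochs from "mismatch" trials carry an extra MMN-like negative deflection, and a simple ERP-amplitude feature with a threshold learned on a training split separates the two classes. All numbers here (sampling rate, component latency, window, amplitudes) are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250                           # sampling rate in Hz (assumed)
n_trials, n_samples = 200, fs      # 1-second epochs

# Synthetic single-channel EEG epochs: "mismatch" trials (label 1)
# carry an extra negative deflection around 200 ms, loosely
# resembling an MMN-like component.
labels = rng.integers(0, 2, n_trials)
epochs = rng.normal(0.0, 1.0, (n_trials, n_samples))
t = np.arange(n_samples) / fs
mmn = -2.0 * np.exp(-((t - 0.2) ** 2) / (2 * 0.03 ** 2))
epochs[labels == 1] += mmn

# Feature: mean amplitude in a 150-250 ms window; threshold at the
# midpoint of the two class means, estimated on a training split.
win = (t >= 0.15) & (t <= 0.25)
feat = epochs[:, win].mean(axis=1)
train, test = slice(0, 100), slice(100, None)
thr = 0.5 * (feat[train][labels[train] == 0].mean()
             + feat[train][labels[train] == 1].mean())
pred = (feat[test] < thr).astype(int)  # more negative -> mismatch
acc = float((pred == labels[test]).mean())
print(f"held-out accuracy: {acc:.2f}")
```

On real EEG one would of course use multi-channel features and a proper classifier with cross-validation; this sketch only conveys the logic of detecting an expectation-violation response trial by trial.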