Nov 2025 – Jun 2026
NeuroMotion
Overview
In this brain-computer interface project, I detected users' movement intentions (such as 'forward' or 'left') from brain signals recorded with dry-electrode EEG systems. After preprocessing and filtering the signals with MNE and MATLAB, I built deep learning models in TensorFlow and PyTorch that translate raw EEG data into 2D movement commands. The system can also detect and correct its own erroneous predictions using brain-based feedback in the form of error-related potentials (ErrPs). The project culminated in a user-friendly interface that lets motor-impaired individuals control a simulated wheelchair or robotic arm.
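A minimal sketch of the preprocessing step, assuming a recorded session file and annotated movement cues; the file name, frequency band, and epoch window below are illustrative, not the project's exact settings:

```python
import mne

# Load a raw dry-electrode recording; "session_raw.fif" is a placeholder path.
raw = mne.io.read_raw_fif("session_raw.fif", preload=True)

# Band-pass to the mu/beta band commonly used for motor imagery (assumed range).
raw.filter(l_freq=8.0, h_freq=30.0)

# Cut trials around annotated movement cues such as 'forward' or 'left'.
events, event_id = mne.events_from_annotations(raw)
epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.2, tmax=2.0, baseline=(None, 0), preload=True)

X = epochs.get_data()    # (n_trials, n_channels, n_times) array for the models
y = epochs.events[:, 2]  # integer label per trial
```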
Problem
Motor-impaired users need control interfaces that can interpret movement intention without relying on conventional hand input. At the same time, EEG-based BCI research often suffers from limited reproducibility and assumes hardware that is inaccessible to most users.
Technical Approach
I designed an EEG processing pipeline around dry-electrode signals, using MNE and MATLAB for preprocessing and TensorFlow and PyTorch for modeling. The system centers on three pieces: movement-intention classification, 2D command generation, and error-related potential (ErrP) feedback for correcting wrong predictions.
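A compact sketch of the intent decoder in PyTorch, assuming preprocessed (channels × time) EEG windows. The IntentNet name, layer sizes, channel count, and command mapping are illustrative placeholders, not the project's exact architecture:

```python
import torch
import torch.nn as nn

class IntentNet(nn.Module):
    """Small 1D CNN over (channels, time) EEG windows; sizes are illustrative."""
    def __init__(self, n_channels=16, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=25, padding=12),
            nn.BatchNorm1d(32),
            nn.ELU(),
            nn.AvgPool1d(4),
            nn.Conv1d(32, 64, kernel_size=11, padding=5),
            nn.BatchNorm1d(64),
            nn.ELU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(64, n_classes)  # e.g. forward / back / left / right

    def forward(self, x):                    # x: (batch, n_channels, n_times)
        h = self.features(x).squeeze(-1)     # (batch, 64)
        return self.head(h)                  # logits over movement intentions

# Mapping predicted intentions onto 2D commands (vx, vy); directions assumed.
COMMANDS = {0: (0.0, 1.0), 1: (0.0, -1.0), 2: (-1.0, 0.0), 3: (1.0, 0.0)}

model = IntentNet()
logits = model(torch.randn(1, 16, 512))       # one dummy 16-channel window
vx, vy = COMMANDS[int(logits.argmax(dim=1))]  # 2D command for the simulator
```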
Result
The project connects research-oriented BCI modeling with a practical control interface for simulated wheelchair and robotic-arm interaction, grounding the work in a concrete assistive-technology use case.
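The ErrP-based correction can be summarized as a short control loop: execute the decoded command, screen the EEG recorded just after the action for an error-related potential, and revert if one is detected. The detect_errp stub and the revert policy below are hypothetical stand-ins for the trained ErrP classifier and interface logic:

```python
import random

def detect_errp(feedback_window):
    """Hypothetical ErrP detector stub; the real system used a trained model."""
    return random.random() < 0.1  # pretend 10% of commands trigger an ErrP

def control_loop(decode_intent, eeg_windows, feedback_windows):
    """Decode each window into a 2D command, then veto it if an ErrP follows."""
    position = [0.0, 0.0]
    for eeg, feedback in zip(eeg_windows, feedback_windows):
        vx, vy = decode_intent(eeg)   # 2D command from the intent model
        position[0] += vx             # execute the command
        position[1] += vy
        if detect_errp(feedback):     # the user's brain flagged an error
            position[0] -= vx         # revert the erroneous step
            position[1] -= vy
    return position
```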