Building a Grasp-Assist Neuroprosthetic with [email protected]

[email protected] aims to use OpenBCI hardware to create a neuroprosthetic glove to assist in grasping/holding objects for those with limited finger strength and dexterity. More specifically, by placing EEG electrodes on the motor cortex of the subject, we will be able to use machine learning models to decode when the user intends to grasp a desired object and effectively translate that signal to a microcontroller to control the prosthetic glove.

How are OpenBCI tools being applied?

The brain-computer interface will be made by acquiring EEG signals using the OpenBCI EEG headband kit as well as the Cyton board. Electrodes will be placed above the motor cortex to record motor imagery, and the participant will be asked to envision opening and closing their fist.

A binary classifier will be trained to distinguish between the two motor imagery commands: the continuous EEG will be segmented into epochs, and features will be extracted from each epoch to train the model.
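The epoch-and-classify step above can be sketched end to end on synthetic data. Imagined grasping typically suppresses the mu rhythm (8–12 Hz) over motor cortex, so log band power is a natural feature; the specific feature bands, epoch length, and logistic-regression model here are illustrative choices, not the project's final pipeline.

```python
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

FS = 250        # Cyton sampling rate in Hz
EPOCH_S = 2     # epoch length in seconds (an illustrative choice)
rng = np.random.default_rng(0)

def make_epoch(mu_amp):
    """Synthetic single-channel epoch: a 10 Hz mu rhythm plus noise."""
    t = np.arange(FS * EPOCH_S) / FS
    return mu_amp * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)

# Class 0 = rest (strong mu rhythm); class 1 = imagined grasp (mu suppression)
epochs = [make_epoch(3.0) for _ in range(60)] + [make_epoch(0.5) for _ in range(60)]
labels = np.array([0] * 60 + [1] * 60)

def band_power(epoch, lo, hi):
    """Log mean power in a frequency band, via Welch's PSD estimate."""
    f, pxx = welch(epoch, fs=FS, nperseg=FS)
    return np.log(pxx[(f >= lo) & (f <= hi)].mean())

# One feature vector per epoch: log power in the mu and beta bands
X = np.array([[band_power(e, 8, 12), band_power(e, 13, 30)] for e in epochs])

X_tr, X_te, y_tr, y_te = train_test_split(
    X, labels, test_size=0.25, random_state=0, stratify=labels)
clf = LogisticRegression().fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

On this cleanly separable synthetic data the classifier scores near-perfectly; real motor imagery EEG is far noisier and usually needs more channels and artifact rejection.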

Who is involved in this project?

For this project, [email protected] is partnering with Adaptive Tech, a UC Davis organization that makes mechanical prosthetics for clients in the local Davis area. Adaptive Tech currently has a prototype of the mechanical prosthetic and is handling the hardware side, while the team from [email protected] will focus on decoding the EEG signals and sending commands to the Arduino that controls the prosthetic device.
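The handoff to the Arduino could be as simple as a one-byte serial command per classifier decision. The `'G'`/`'R'` byte protocol and the port name below are hypothetical examples for illustration, not Adaptive Tech's actual firmware interface.

```python
def command_for(prediction: int) -> bytes:
    """Map the binary classifier's output to a one-byte serial command.

    The 'G' (grasp) / 'R' (release) protocol is a hypothetical example,
    not the project's actual firmware interface.
    """
    return b"G" if prediction == 1 else b"R"

# Sending it over USB serial would look roughly like this
# (requires pyserial and connected hardware, so it is left commented out):
# import serial
# with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as port:  # port name is an assumption
#     port.write(command_for(1))
```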

Past Projects from [email protected]

[email protected] has completed BCI projects in the past, including building a BCI that helps those with with speech and motor impediments communicate effectively using only eye movements. EEG data was streamed from the Muse 2 headband and a classifier was trained on eye movements to trigger a red light bulb, green light bulb, or a buzzer on the raspberry pi. A lateral side-to-side motion of the eyes analogous to shaking your head for saying “no” triggered the red light bulb. To indicate yes, up and down eye movements analogous to nodding your head triggered a green light bulb. To call for assistance if no one was present in the room, the user simply needed to blink rapidly. Check out our demo video here.