Damaging your spinal cord, or injuring your arms badly enough that they must be amputated, changes your life forever. You become dependent on a caregiver for everyday tasks such as eating and bathing. This project aimed to address the problem of self-feeding for an injured person. Although many further steps remain before this becomes a complete product, we believe it is a first step in the required direction.
We used the Steady State Visually Evoked Potential (SSVEP) method to generate the control commands. In this method, the user looks at a light source flickering at a known frequency. While the user concentrates on that light, the brain produces an oscillation at the same frequency, so by processing the EEG data we can identify which light the person is looking at.
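As a rough illustration of the principle (not the project's actual processing code), the sketch below synthesizes an "EEG" trace containing a 7 Hz SSVEP component plus a slow drift, and measures its power at each candidate frequency with the Goertzel algorithm. All names and parameters here are illustrative; the stimulus frequency stands out clearly in the result.

```python
import math

def goertzel_power(samples, fs, freq):
    """Power of `samples` at `freq` Hz (Goertzel algorithm)."""
    n = len(samples)
    k = round(n * freq / fs)            # nearest DFT bin
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s1 = s2 = 0.0
    for x in samples:
        s1, s2 = x + coeff * s1 - s2, s1
    return s2 * s2 + s1 * s1 - coeff * s1 * s2

fs = 250                                 # typical EEG sampling rate
t = [i / fs for i in range(fs * 2)]      # 2 s analysis window
# synthetic "EEG": SSVEP response at 7 Hz plus a 1 Hz drift
eeg = [math.sin(2 * math.pi * 7 * ti) + 0.3 * math.sin(2 * math.pi * 1 * ti)
       for ti in t]

for f in (6, 7, 8):
    print(f, goertzel_power(eeg, fs, f))
```

Only the 7 Hz reading comes out large; the 6 Hz and 8 Hz readings stay near zero, which is exactly the contrast a frequency-domain classifier exploits.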
There are three food items the user can select, and three LED matrices, one associated with each food bowl, blinking at 6 Hz, 7 Hz, and 8 Hz respectively. The user looks at the LED matrix beside the desired food item to activate the pre-programmed feeding path.
We used three 8×8 5 mm red LED matrices driven by an Arduino Mega to produce the visual stimulation.
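The timing behind the stimulation is simple: each matrix is driven as a square wave, one full cycle being ON then OFF, so the LEDs must be toggled every half period. The helper below (an illustrative Python sketch, not the actual Arduino firmware) computes those toggle intervals:

```python
def toggle_interval_ms(freq_hz):
    """One flicker cycle is ON + OFF, so toggle every half period."""
    return 1000.0 / (2.0 * freq_hz)

for f_hz in (6, 7, 8):
    print(f"{f_hz} Hz matrix: toggle every {toggle_interval_ms(f_hz):.2f} ms")
```

On the Arduino side the same arithmetic sets the delay between state changes for each matrix.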
The feeder arm is a servo-based robot arm with a spoon as its end effector. I will write another post on this blog in the future about what went into building the feeder robot.
Data Acquisition and Processing
Data acquisition was done with the OpenBCI 32bit v3 board, using 8 channels. As a truly open-source product, OpenBCI gave us complete freedom to use it as we wished. For this project we used the OpenBCI Python SDK together with LSL (Lab Streaming Layer) to stream the signals to MATLAB for further processing.
Electrodes were placed at 10 locations, including the two earlobes, which carried the ground and reference electrodes.
MATLAB was used to process the signals. A high-pass filter was applied first to remove low-frequency noise. To identify the stimulation frequencies, the time-domain signals must be converted to the frequency domain; a fast Fourier transform (FFT) was used for this, and a threshold on the spectral power was used to classify the outputs.
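The processing chain above can be sketched in Python roughly as follows. This is a minimal stand-in, assuming a simple first-order high-pass filter, a window length that puts 6/7/8 Hz on exact FFT bins, and a power threshold measured against the noise floor; the project's actual filter design and threshold value were tuned in MATLAB and are not reproduced here.

```python
import numpy as np

def highpass(x, fs, fc=3.0):
    # first-order RC high-pass (illustrative stand-in for the MATLAB filter)
    dt = 1.0 / fs
    rc = 1.0 / (2.0 * np.pi * fc)
    alpha = rc / (rc + dt)
    y = np.zeros_like(x)
    for n in range(1, len(x)):
        y[n] = alpha * (y[n - 1] + x[n] - x[n - 1])
    return y

def classify_ssvep(window, fs=250.0, targets=(6.0, 7.0, 8.0), ratio=5.0):
    """Return the target frequency whose spectral power clears the
    threshold, or None if no stimulus is detected."""
    x = highpass(np.asarray(window, dtype=float), fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    noise_floor = np.median(power[1:]) + 1e-12    # rough baseline estimate
    target_power = [power[np.argmin(np.abs(freqs - f))] for f in targets]
    best = int(np.argmax(target_power))
    if target_power[best] > ratio * noise_floor:  # threshold check
        return targets[best]
    return None
```

For example, a 4 s window dominated by a 7 Hz oscillation classifies as 7 Hz, while a flat window produces no detection, which is why the system sometimes has to wait for a strong enough response before moving the arm.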
After developing the system we tested it with five subjects, all of whom were able to control the arm, though with varying levels of accuracy. Further research is needed to improve the accuracy.
Finally, here is a video of me controlling the robot. The video is fast-forwarded in a few places, since it sometimes takes about 30 seconds for a signal to exceed the threshold value.