A widget that can detect when eyes are closed
MahaN
Dubai, United Arab Emirates
in OpenBCI_GUI
I'm building a prosthetic controlled through the OpenBCI headset and want it to do two things: pick up a cup when I concentrate (using the Focus widget), and shake someone's hand when I close my eyes for a few seconds. Is there any built-in widget, or a custom widget someone has made, that I can use to detect the closing of eyes? Any help would be appreciated, thanks.
Comments
Maha, hello.
The Focus Widget can already do eyes-closed detection ('Relaxation'). Alpha wave production increases substantially when the eyes are closed, especially over the posterior portions of the scalp, such as 10-20 sites O1, O2, or Pz.
Alternatively, you can use the Networking Widget and select the alpha band power:
https://docs.openbci.com/Software/OpenBCISoftware/GUIWidgets/#networking
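If you go the Networking Widget route, one way to get the stream into your prosthetic code is a small UDP listener. A minimal sketch, assuming the widget is set to UDP output on the GUI's default address/port (127.0.0.1:12345) with a JSON payload like `{"type": "bandPower", "data": [...]}` — the exact message schema varies by GUI version, so print one raw message first and adjust the keys:

```python
import json
import socket


def parse_band_power(raw: bytes):
    """Parse one UDP message from the Networking Widget.

    Assumes a JSON payload like {"type": "bandPower", "data": [...]}.
    The exact schema depends on your GUI version -- inspect a raw
    message and adjust the key names if yours differs.
    """
    msg = json.loads(raw.decode("utf-8"))
    return msg.get("type"), msg.get("data")


def listen(host="127.0.0.1", port=12345):
    """Receive and print Networking Widget messages forever."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    while True:
        raw, _addr = sock.recvfrom(65535)
        kind, data = parse_band_power(raw)
        print(kind, data)


if __name__ == "__main__":
    listen()
```

From there your program can watch the alpha values and trigger the hand when they stay elevated for a few seconds.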
William
Ah I see, I didn't realise the Focus widget overlapped with eye closing; thank you for that information. Looking into the OpenBCI GUI, I couldn't open both the relaxation and concentration readings at the same time, so I'm struggling to get two distinct signals that could correspond to two different actions for the prosthetic. Could you please suggest another signal, easily detected from a widget, that doesn't overlap with concentration/relaxation, so that the intention of a handshake doesn't get mixed up with the intention of grabbing a cup?
You can use the Brainflow library from a Python program that you create. This allows you to do any combination of signal processing you desire.
https://brainflow.readthedocs.io/en/stable/Examples.html#python-band-power
https://brainflow.readthedocs.io/en/stable/Examples.html#python-eeg-metrics
In the last comment I mentioned the Networking Widget. That widget, the Band Power widget, and the Focus Widget can all be used simultaneously.
Hi there!
Are there any courses you would recommend to understand BrainFlow and neural signal processing in Python? (The BrainFlow documentation is going straight over my head with all the jargon, but I really want to understand it.) Any guidance would be appreciated.
I would suggest simply studying and running the BrainFlow example programs. These connect to the board and process the stream of sample data that is received.
https://www.google.com/search?q=python+digital+signal+processing
https://github.com/unpingco/Python-for-Signal-Processing
https://open.umn.edu/opentextbooks/textbooks/290
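Once you have alpha and beta band powers coming in (from the Networking Widget or BrainFlow), mapping them to your two prosthetic actions is just a decision rule. A sketch of one possible rule — the threshold values here are placeholders I made up, and you would need to calibrate them per user by recording baseline, eyes-closed, and concentration sessions:

```python
def choose_action(alpha, beta, alpha_thresh=0.4, beta_thresh=0.3):
    """Map relative alpha/beta band power to a prosthetic action.

    Thresholds are illustrative placeholders -- calibrate them by
    comparing recordings of relaxed, eyes-closed, and concentrating
    states for the actual wearer.
    """
    if alpha > alpha_thresh and beta < beta_thresh:
        return "handshake"   # sustained eyes-closed -> high alpha
    if beta > beta_thresh and alpha < alpha_thresh:
        return "grab_cup"    # concentration -> elevated beta
    return "idle"            # ambiguous reading: do nothing
```

In practice you would also want to require the same decision over several consecutive windows (a few seconds, matching the "close my eyes for a few seconds" idea) before moving the hand, so momentary blinks or noise don't trigger an action.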