EEG-based emotion recognition - Stress, Excitement, Engagement, Focus, Relaxation, Interest
I want to classify the emotions mentioned in the title. I have used the Emotiv EPOC+ before, and that device can estimate these emotions while you wear it, but I could not access the raw data. That is why I am interested in buying an OpenBCI device and doing the same thing. However, I don't understand how I can do the classification with OpenBCI. I have read some papers; they normally classify data with labels such as pleasant, sad, happy, frustrated. My goal is to collect data in a specific environment (an online class) and then recognize and measure these emotions. How can I do it? The question may be dumb, but in summary, I want to build a custom emotion classifier based on EEG data collected with an OpenBCI device.
Comments
Kimohs, hi.
There are some related posts under the phrase "affective bci" here on the forum. This search was produced using the Google Advanced Search button in the upper right,
https://www.google.com/search?as_q=affective+bci&as_sitesearch=openbci.com
There are also many papers online,
https://www.google.com/search?as_q=affective+bci
Regards, William
I have looked into it, but I am more interested in knowing how to label the data with the mentioned emotions, then train on that data and classify it. Also, which channel provides what type of data? Basically, I want to understand the data in the different channels.
re: "which channel provides what types of data", I'm not sure it is that straightforward. Each different group researching this area takes a different approach.
re: "train the data then classify it...", it's not clear that ML / DL is being widely applied to affective BCI.
There is no one 'formula' with the perfect classification scheme. Are you aware of the different types of papers published? Some categories include research, clinical, and review. The review papers contain literature comparisons across a range of papers in that subject area. I found this one from about 5 years ago,
https://www.researchgate.net/publication/283448544_Extracting_neurophysiological_signals_reflecting_users'_emotional_and_affective_responses_to_BCI_use_A_systematic_literature_review
If you look that over, noting the type of EEG classification each paper used, you might find some ideas to delve into further. Here is another review paper, possibly even better, from 2017,
https://www.mdpi.com/2076-3417/7/12/1239/htm
This paper concludes that Emotiv's emotion classification system "is not reliable in detecting emotions as no significant relationships were found between participants’ self-reported emotional states and EPOC emotional values."
https://pdfs.semanticscholar.org/0fa5/6e3819865f5c10d4e021ef1541ba741187f2.pdf
Other related search links,
https://www.google.com/search?q=accuracy+of+emotiv+epoc+classification
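Re: "which channel provides what types of data" again: most papers compute the same features (typically band powers) on every channel and let the classifier learn which channels matter, rather than any single channel carrying a dedicated "emotion" signal. Purely as an illustration, here is a minimal Python sketch of per-channel alpha band power (the 128 Hz rate and the synthetic data are assumptions, not taken from any of the linked papers; the Cyton board actually streams at 250 Hz by default):

```python
import numpy as np
from scipy.signal import welch

FS = 128  # sampling rate in Hz (assumed for this example)

def band_power(eeg, fs=FS, band=(8.0, 13.0)):
    """Average power in `band` (alpha by default) for each channel.

    eeg: array of shape (n_channels, n_samples)
    returns: array of shape (n_channels,)
    """
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2, axis=-1)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.trapz(psd[:, mask], freqs[mask], axis=-1)

# Example: 8 channels, 10 seconds of synthetic data
rng = np.random.default_rng(0)
fake_eeg = rng.standard_normal((8, FS * 10))
alpha = band_power(fake_eeg)   # one alpha-power value per channel
features = np.log(alpha)       # log band power is a common feature choice
print(features.shape)          # (8,)
```

A feature vector like this (possibly over several bands) is what would then be fed to whatever classifier a given paper uses.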
Kimohs, did you get anywhere with this? I'm about to start experimenting with the DEAP dataset.
http://www.eecs.qmul.ac.uk/mmv/datasets/deap/
Any pointers anyone has (on how to implement an ML system with OpenBCI generally and emotion recognition specifically) would be very welcome.
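For what it's worth, my rough plan is to start by loading the preprocessed (Python) DEAP files and binarising the self-reported ratings at the scale midpoint, something like this sketch (the file layout follows the DEAP documentation; thresholding at 5 is just a common convention in the literature):

```python
import pickle
import numpy as np

# Each preprocessed DEAP file (s01.dat ... s32.dat) is a Python pickle
# containing a dict with:
#   'data'   : 40 trials x 40 channels x 8064 samples (63 s at 128 Hz)
#   'labels' : 40 trials x 4 ratings (valence, arousal, dominance, liking)
with open('s01.dat', 'rb') as f:
    subject = pickle.load(f, encoding='latin1')  # latin1: Python-2 pickle

eeg = subject['data'][:, :32, :]   # first 32 channels are EEG
labels = subject['labels']         # ratings on a 1-9 scale

# Binarise valence/arousal at the midpoint, a common baseline labelling.
valence = (labels[:, 0] > 5).astype(int)
arousal = (labels[:, 1] > 5).astype(int)
print(eeg.shape, valence.mean(), arousal.mean())
```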
To add, this is for an artwork that is part of the Downloadable Brain season of art events in the UK
https://www.cognitivesensations.com/
I'm as interested in the failure to recognise emotional states as in being able to recognise them.
This has already been done with the OpenBCI headsets, in conjunction with an Empatica E4 wristband, and it has been published in a very reputable journal. The Empatica E4 wristband is a very useful device that measures electrophysiological signals such as electrodermal activity (sweating), heart rate (including the waveform of the capillary pulse), peripheral skin temperature, and motion.
See article: https://ieeexplore.ieee.org/document/8762012
The entire code is available here: https://github.com/IoBT-VISTEC/EEG-Emotion-Recognition-INTERFACES-datasets
The emotions that are classified in the program are listed in the paper and the repository.
The problem is that the emotion detection is not "live". You would have to create Lab Streaming Layer (LSL) streams for both the OpenBCI headset and the Empatica E4 (both of which are available), and then figure out how to integrate the live data with the already available code, which would need to be partially reconfigured for live data inputs.
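The LSL side itself is straightforward; for example, with an EEG stream published on the network (e.g. from the OpenBCI GUI's networking widget), pulling samples in Python might look roughly like this sketch (the 4-second window and the hand-off to the model are assumptions; that integration is exactly the part you would have to build):

```python
import numpy as np
from pylsl import StreamInlet, resolve_byprop

# Find an EEG stream on the network and open an inlet on it.
streams = resolve_byprop('type', 'EEG', timeout=10)
if not streams:
    raise RuntimeError("No LSL stream of type 'EEG' found")
inlet = StreamInlet(streams[0])

fs = int(inlet.info().nominal_srate())
window = []  # accumulate a fixed-length window to feed the classifier

while True:
    sample, timestamp = inlet.pull_sample()
    window.append(sample)
    if len(window) >= fs * 4:            # e.g. a 4-second analysis window
        segment = np.array(window).T     # (n_channels, n_samples)
        window = []
        # ...preprocess `segment` and pass it to the (reconfigured) model...
        print('got window', segment.shape, 'at', timestamp)
```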
Also, the Empatica E4 is expensive, although it is a great and very useful device. You can do without it if you use transfer learning (a machine learning technique), since this code is built on neural networks. Transfer learning basically lets you recalibrate the original model for whatever device(s) you choose to use. However, there is a drop in accuracy.
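In case "transfer learning" sounds mysterious, the idea is simply to freeze the representation a network learned on the original devices' data and retrain a small classification head on a little data from your own device. A generic PyTorch sketch (the model, layer sizes, and data here are toy placeholders, not the linked repo's actual code):

```python
import torch
import torch.nn as nn

# A toy stand-in for a network trained on the original datasets.
class EmotionNet(nn.Module):
    def __init__(self, n_features=64, n_classes=2):
        super().__init__()
        self.feature_extractor = nn.Sequential(
            nn.Linear(n_features, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
        )
        self.classifier = nn.Linear(128, n_classes)

    def forward(self, x):
        return self.classifier(self.feature_extractor(x))

model = EmotionNet()
# (in practice you would load pretrained weights here)

# Freeze the representation learned on the original devices...
for p in model.feature_extractor.parameters():
    p.requires_grad = False
# ...and train only a fresh head on a small set from the new device.
model.classifier = nn.Linear(128, 2)

optim = torch.optim.Adam(model.classifier.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(32, 64)           # fake calibration batch from the new device
y = torch.randint(0, 2, (32,))    # fake labels
for _ in range(100):
    optim.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optim.step()
```

Because only the small head is retrained, a modest amount of data from the new device can suffice, which is why the accuracy drop is usually tolerable.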
SecretSquirrel Thanks! Have you done this? It sounds as though you might have...?
Instead of the $1700 Empatica, consider using Emotibit ($200). It was designed by Sean Montgomery, a staff member here at OpenBCI.
https://www.emotibit.com/
There is a comparison chart about halfway down the page, comparing Emotibit, Empatica, Fitbit, Apple Watch, Shimmer, and Biopac.
William
Also see this recent related thread,
https://openbci.com/forum/index.php?p=/discussion/2942/emotion-detection-using-openbci
https://www.kickstarter.com/projects/emotibit/emotibit-0?utm_source=kickbooster-direct&utm_medium=kickbooster&utm_content=link&utm_campaign=a877f1f9
There are 5 days left in the Kickstarter campaign for Emotibit, and this is the first time it is being sold/pre-ordered.
It will most likely work well with LSL from the start, and that is a reason to get it. I do have some reservations about the device, but it is almost always a better choice than the Empatica E4, at least for users here.
Thank you so much for this information! I am going to order an Emotibit tonight! :-)
You are awesome! :-)
I do a lot of work around physiological signals of the human body. I also do a lot of work around mathematical physiology, utilizing systems of differential equations. I design control systems including artificial pancreas systems (automated insulin dosing systems used for people with type 1 diabetes) in my free time and I play around with them via simulations.
With respect to the physiological signals, I created stress alert systems along with basic affect classification (positive/negative valence) by reimplementing peer-reviewed journal articles. These are artificial-intelligence-based systems. For BCI stuff, only emotion and affect classification systems use artificial intelligence, so it is kind of niche for OpenBCI work. The devices that I have generally used are the Empatica E4 and the RespiBAN Professional, as many AI datasets have been created with those devices.
I have 2 rare immune-mediated neurological diseases affecting my peripheral nervous system, plus type 1 diabetes. Collectively, I have 3 diseases that affect and damage my autonomic nervous system, specifically. My body does not regulate the most basic things that everyone takes for granted. I also experience life-threatening events due to this lack of body regulation, which is picked up by these sensors. So, I have uses for these devices, but they are clunky.
Anyway, I am currently redesigning the system that I am working with. I am basically making my own version of the Astroskin, which costs about $10,000 for a work week's set of undergarment monitoring clothes; obscenely expensive. I can make an entire week's set for about $300. I am not going to use a dinky headband for the photoplethysmography (PPG) sensor. Instead, I am using a pulse oximeter module with a resin ear mold, which will stay in my ear unobtrusively and without interference for monitoring. This is also much better than a finger or earlobe probe. To make the datasets work with the new sensors, I am going to use transfer learning, which lets me use the sensors on my personal DIY "Astroskin" with the datasets I already have.
Note: Hexoskin is a similar and much more reasonably priced technology. However, I need ECG plus photoplethysmography features because I need constant, 24/7, non-invasive blood pressure monitoring. I get life-threatening blood pressure surges called autonomic dysreflexia. In case you look it up: yes, I can walk, and I look completely normal.
There are many kinds of emotion classification schemes; the mainstream approach uses six classes (the six basic emotions).
There are some open-source emotion recognition projects on GitHub for reference.