+ Working OpenBCI GUI on Raspberry Pi 4B OS. Wearable OpenBCI projects should now be a breeze with the downloadable pre-configured disk image. Just download it, write it to a MicroSD card, pop it in the Pi, and follow the instructions in my Google Drive doc. Below is the full scope of the project. This is only the tip of the iceberg!
Update 2021-May-27: I forgot I had recorded a video of the software in action. It can be seen at this YouTube link, where I describe a bit of the code, the performance, etc. I am a bit behind the planned schedule for releasing more publicly. I am writing up an outline for a dissertation on the scientific literature that defines the theory behind the project, and I will be working to define experimental methods to assess our ability to maximize conscious experience within the limited lifespan we have. Any feedback/help as I post more is appreciated. You can shoot it to wmbroch @ gmail?.com (just remove the spaces and ‘?’ mark so bots don’t spam me by chance and emails get lost).
The goal of this project was to push the open-source OpenBCI hardware and GUI to align with some of the most cutting-edge data-processing and product capabilities across research fields and the market of wearable devices and BCIs.
The equipment used for this project was:
- Cyton board (8-Channel)
- USB Bluetooth Dongle
- Raspberry Pi 4B 4GB (2GB likely OK too)
- A Pi 3 can most likely also be used
- Waveshare Touchscreen LCD
- USB portable battery pack
- DRV2605 Haptic driver
- ERM motor (generic from Amazon, possibly Uxcell brand)
The key pieces of software / libraries were:
- Raspbian 32-bit ARM OS
- BrainHat installed (the only compiled version of the BrainFlow libraries working on RPi OS)
- Processing IDE for Raspbian
- OpenBCI GUI
The main goal of this project is to first add a widget to the OpenBCI GUI that can process live data using a Hilbert transform (producing the analytic signal, similar to MATLAB's hilbert function) to separate the magnitude and phase angle of the incoming data packets. This is processor intensive: with the existing JDSP libraries, I can currently process about 2.5 seconds of data (at 250 Hz) in around 6 seconds. Not too bad. So in the live feedback loop, the widget gathers 2.5 seconds of data points from 4-8 of the electrodes, performs the transform and calculations, and delivers the haptic feedback to the user about 6 seconds after gathering the data.
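For reference, here is a minimal sketch of what that per-channel transform looks like with JDSP's Hilbert class. The buffer size and test signal are illustrative, and the method names follow JDSP's documented API, so double-check them against whichever JDSP version is bundled in the image:

```java
import com.github.psambit9791.jdsp.transform.Hilbert;

public class HilbertDemo {
    public static void main(String[] args) {
        int fs = 250;                       // Cyton sample rate
        double[] signal = new double[625];  // ~2.5 s buffer, as used by the widget
        for (int i = 0; i < signal.length; i++) {
            signal[i] = Math.sin(2 * Math.PI * 10 * i / (double) fs); // 10 Hz test tone
        }

        Hilbert h = new Hilbert(signal);
        h.hilbertTransform();                          // compute the analytic signal
        double[] magnitude = h.getAmplitudeEnvelope(); // instantaneous amplitude
        double[] phase = h.getInstantaneousPhase();    // instantaneous phase angle

        System.out.println("mid-buffer amplitude: " + magnitude[312]
                + ", phase: " + phase[312]);
    }
}
```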
That all being said, here is a small diagram showing the general goal: measure the brain state using a combination of Hilbert-transformed data and Fourier-transformed bands (e.g., Gamma, Theta, Alpha...) and provide haptic feedback on whether a "selected" baseline state is being maintained or exceeded.
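For the frequency-band half of that picture, JDSP also provides IIR filters that can isolate a band before computing its power. This is a hedged sketch rather than the widget's actual code: the bandPower helper is my own name, and the Butterworth constructor and method signatures follow JDSP's current docs (older releases used a different constructor), so verify against your installed version:

```java
import com.github.psambit9791.jdsp.filter.Butterworth;

// Mean squared amplitude of `signal` within [lowHz, highHz], e.g. alpha = 8-12 Hz.
double bandPower(double[] signal, double fs, double lowHz, double highHz) {
    Butterworth flt = new Butterworth(fs);
    double[] band = flt.bandPassFilter(signal, 4, lowHz, highHz); // 4th-order band-pass
    double power = 0;
    for (double v : band) power += v * v;
    return power / band.length;
}

// Usage with the 250 Hz buffer from the previous sketch:
// double alpha = bandPower(signal, 250.0, 8.0, 12.0);
```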
In the simplest example, Hilbert-transformed data allows "brain entropy" to be calculated using a type of Lempel-Ziv compression algorithm. If the entropy value drops relative to the starting point, the user is typically moving closer to an anesthetized state. Reading through the research, I believe this approach has been studied as a method of anesthesia monitoring for nearly 30 years. Many different signal-processing methods have been used to try to monitor anesthesia levels, but I've selected Hilbert-transformed brain entropy for reasons that will become clear as the project progresses.
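To make the calculation concrete, here is a small sketch of the core LZ76 complexity count (the Kaspar-Schuster parsing commonly used in this literature), applied to a median-binarized amplitude envelope. The helper names are my own illustration, not the exact code in the widget:

```java
import java.util.Arrays;

// Binarize the Hilbert amplitude envelope around its median (a common choice).
int[] binarize(double[] envelope) {
    double[] sorted = envelope.clone();
    Arrays.sort(sorted);
    double median = sorted[sorted.length / 2];
    int[] bits = new int[envelope.length];
    for (int i = 0; i < envelope.length; i++) bits[i] = envelope[i] > median ? 1 : 0;
    return bits;
}

// LZ76 complexity: number of distinct "components" the sequence parses into.
int lz76(int[] s) {
    int n = s.length;
    if (n < 2) return n;
    int c = 1;      // component count
    int l = 1;      // length of the already-parsed prefix
    int i = 0;      // candidate match start inside the prefix
    int k = 1;      // current match length
    int kmax = 1;   // longest match found from any start so far
    while (true) {
        if (s[i + k - 1] == s[l + k - 1]) {
            k++;
            if (l + k > n) { c++; break; }
        } else {
            if (k > kmax) kmax = k;
            i++;
            if (i == l) {        // no earlier start reproduces the new part,
                c++;             // so it counts as a new component
                l += kmax;
                if (l + 1 > n) break;
                i = 0; k = 1; kmax = 1;
            } else {
                k = 1;
            }
        }
    }
    return c;
}

// Normalized entropy: ~1 for random sequences, lower for more regular ones.
double brainEntropy(double[] envelope) {
    int[] bits = binarize(envelope);
    int n = bits.length;
    return lz76(bits) * (Math.log(n) / Math.log(2)) / n;
}
```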
Once I had a basic version of that working, I used the processing.io library (Processing's Hardware I/O library) for GPIO pin communication with simple Java code. The GPIO pins are hooked up to a DRV2605 haptic driver and ERM motor for vibration, and other GPIO pins are hooked up to simple LED circuits that indicate increases vs. decreases in live brain entropy compared to baseline or the last "x" samples.
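As a rough idea of what that glue code can look like in a Processing sketch, here is a hedged sketch: the pin numbers are placeholders for whatever you wire up, and the DRV2605 register writes follow TI's datasheet (MODE at 0x01, waveform slot 1 at 0x04, GO at 0x0C, I2C address 0x5A) rather than any code shipped with the image. It also abbreviates the full init sequence from the datasheet:

```java
import processing.io.*;

final int LED_UP = 17;    // example BCM pin: entropy above baseline
final int LED_DOWN = 27;  // example BCM pin: entropy below baseline
I2C i2c;

void setup() {
  GPIO.pinMode(LED_UP, GPIO.OUTPUT);
  GPIO.pinMode(LED_DOWN, GPIO.OUTPUT);
  i2c = new I2C(I2C.list()[0]);   // DRV2605 sits on the Pi's I2C bus
  writeReg(0x01, 0x00);           // MODE register: exit standby, internal trigger
  writeReg(0x04, 1);              // waveform slot 1: effect #1 (strong click)
  writeReg(0x05, 0);              // end of waveform sequence
}

void writeReg(int reg, int value) {
  i2c.beginTransmission(0x5A);    // DRV2605 I2C address
  i2c.write(reg);
  i2c.write(value);
  i2c.endTransmission();
}

// Called from the widget with fresh complexity values.
void indicate(float current, float baseline) {
  boolean up = current > baseline;
  GPIO.digitalWrite(LED_UP, up ? GPIO.HIGH : GPIO.LOW);
  GPIO.digitalWrite(LED_DOWN, up ? GPIO.LOW : GPIO.HIGH);
  if (!up) writeReg(0x0C, 1);     // GO register: fire a vibration pulse
}

void draw() { }  // Processing needs a draw loop even when the GUI does the work
```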
In this initial release of the guide, I'll be posting the RPi OS image with BrainHat installed for the BrainFlow libraries, the Processing IDE installed, and the OpenBCI GUI installed, which can be run as a sketch in the Processing IDE. The OpenBCI GUI includes a rough proof-of-concept widget called "Lempel-Ziv Complexity".
Soon after, I'll put together a quick walkthrough of how to open and run the sketch, and go into more detail on how to use the Lempel-Ziv Complexity widget I've made.
After that, I'll assemble a simple walkthrough of the hardware wiring and assembly to create the haptic feedback system and LEDs. Below is the GUI running on the RPi 4B with the Complexity widget in the bottom-right corner.
Videos will be provided in the coming weeks.
The walkthrough template I'll be building on is on this Google Doc page, which includes the Raspberry Pi OS image with everything installed, set up, and working.