Working OpenBCI GUI on Raspberry Pi 4B OS. Wearable OpenBCI projects should now be a breeze with a downloadable pre-configured disk image. Just download it, write it to a MicroSD card, pop it in the Pi, and follow the instructions in my Google Drive doc. Below is the full scope of the project. This is only the beginning, the tip of the iceberg!
Update 2021-May-27: Forgot I had recorded a video of the software in action. It can be seen at this YouTube link, where I describe a bit of the code, the performance, etc. I am a bit behind on the planned schedule of releasing more publicly. I am writing up an outline for a dissertation on the scientific literature that defines the theory behind the project, and will be working to define experimental methods to assess our ability to maximize conscious experience within the limited lifespan we have. Any feedback/help as I post more is appreciated. You can shoot it to wmbroch @ gmail?.com (just remove the spaces and ‘?’ mark so bots don’t spam me by chance and emails get lost).
Overview
The goal of this project was to push the open-source OpenBCI hardware and GUI to align with some of the most cutting-edge data-processing and product capabilities across research fields and the market of wearable devices and BCIs.

Hardware
The equipment used for this project was:
- Cyton board (8-Channel)
- USB Bluetooth Dongle
- Raspberry Pi 4B 4GB (2GB likely OK too; a Pi 3 will most likely also work)
- Waveshare Touchscreen LCD
- USB portable battery pack
- Haptics:
- DRV2605 Haptic driver
- ERM Motor (generic from Amazon, possibly Uxcell brand)
Software
The key pieces of software / libraries were:
- Raspbian ARM 32 OS
- BrainHat installed (the only pre-compiled build of the Brainflow libraries working on RPi OS)
- Processing IDE for Raspbian
- OpenBCI GUI
Features
The main goal of this project is to first add a widget to the OpenBCI GUI that can process live data using a Hilbert analytic transform (similar to MATLAB's hilbert function) to separate the magnitude and phase angle of the incoming data packets. This is processor intensive: with the existing JDSP libraries I'm currently only able to process about 2.5 seconds of data (at 250 Hz) in around 6 seconds. Not too bad. So for a live feedback loop, the widget gathers 2.5 seconds of data points from 4-8 of the electrodes, performs the transform and calculations, and provides the user haptic feedback about 6 seconds after gathering the data.
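For reference, here is a minimal sketch of that magnitude/phase separation using JDSP's Hilbert class. The synthetic 10 Hz test tone and printout are just for illustration; the widget feeds real channel data in instead:

```java
import com.github.psambit9791.jdsp.transform.Hilbert;

void setup() {
  int fs = 250;                 // Cyton sampling rate over the dongle
  int n = (int)(2.5 * fs);      // 2.5 s window = 625 samples
  double[] signal = new double[n];
  for (int i = 0; i < n; i++) {
    signal[i] = Math.sin(TWO_PI * 10 * i / fs); // synthetic 10 Hz test tone
  }

  Hilbert h = new Hilbert(signal);
  h.hilbertTransform();                           // build the analytic signal
  double[] magnitude = h.getAmplitudeEnvelope();  // instantaneous amplitude
  double[] phase = h.getInstantaneousPhase();     // instantaneous phase angle

  println("envelope[0] = " + magnitude[0] + ", phase[0] = " + phase[0]);
}
```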
That all being said, here is a small diagram to show the general goal: measure the brain state using a combination of Hilbert-transformed data and Fourier-transformed bands (e.g., Gamma, Theta, Alpha...) and provide haptic feedback on whether a “selected” baseline state is being maintained or exceeded.
In the simplest example, Hilbert-transformed data allows a “brain entropy” to be calculated using a type of Lempel-Ziv compression algorithm. If the entropy value goes down relative to the starting point, the user is typically moving closer to an anesthetized state. Reading through the research, I believe it has been studied as a method of anesthesia monitoring for nearly 30 years. Many different signal-processing methods have been used to try to monitor anesthesia levels, but I’ve selected the Hilbert-transformed brain entropy for reasons that will become clear as the project progresses.
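To make the entropy step concrete, here is a small sketch of one common way to compute it: binarize the Hilbert amplitude envelope against its median, then count distinct phrases with the classic Lempel-Ziv (LZ76) scheme. The median threshold and log-normalization are my illustrative choices here, not necessarily the exact ones used in the widget:

```java
// Count LZ76 phrases in a binary sequence (Kaspar & Schuster counting scheme).
int lzComplexity(int[] s) {
  int n = s.length;
  int c = 1, l = 1, i = 0, k = 1, kMax = 1;
  while (true) {
    if (s[i + k - 1] == s[l + k - 1]) {
      k++;
      if (l + k > n) { c++; break; }
    } else {
      if (k > kMax) kMax = k;
      i++;
      if (i == l) {            // no earlier match found: new phrase
        c++;
        l += kMax;
        if (l + 1 > n) break;
        i = 0; k = 1; kMax = 1;
      } else {
        k = 1;
      }
    }
  }
  return c;
}

// Binarize the Hilbert envelope against its median, then normalize the count
// so values are comparable across window lengths.
float brainEntropy(double[] envelope) {
  double[] sorted = envelope.clone();
  java.util.Arrays.sort(sorted);
  double median = sorted[sorted.length / 2];
  int[] bits = new int[envelope.length];
  for (int i = 0; i < envelope.length; i++) {
    bits[i] = envelope[i] > median ? 1 : 0;
  }
  int n = bits.length;
  return (float)(lzComplexity(bits) * (Math.log(n) / Math.log(2)) / n);
}
```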
Once I had a basic version of that working, I used the Processing io library for GPIO pin communication using simple Java code. The GPIO pins are hooked up to a DRV2605 haptic driver and ERM motor for vibration, and other GPIO pins are hooked up to simple LED circuits as indicators of increase vs. decrease in live brain entropy compared to the baseline or the last “x” samples.
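As a rough illustration of that wiring in code, here is a sketch using Processing's io library, with the DRV2605 driven over I2C at its default 0x5A address. The pin numbers and register values come from the DRV2605 datasheet and my own assumptions, not from the widget source:

```java
import processing.io.*;

// BCM pin choices here are illustrative, not necessarily the ones I wired.
final int LED_UP = 17;    // entropy rising vs. baseline
final int LED_DOWN = 27;  // entropy falling vs. baseline

I2C bus;

void setup() {
  GPIO.pinMode(LED_UP, GPIO.OUTPUT);
  GPIO.pinMode(LED_DOWN, GPIO.OUTPUT);

  bus = new I2C("i2c-1");   // DRV2605 sits at address 0x5A on the Pi's i2c-1
  writeReg(0x01, 0x00);     // MODE register: internal trigger
  writeReg(0x03, 0x01);     // LIBRARY register: ERM library A
}

void draw() {
  // In the widget this would be driven by the live entropy stream.
}

void indicate(float entropy, float baseline) {
  GPIO.digitalWrite(LED_UP, entropy > baseline ? GPIO.HIGH : GPIO.LOW);
  GPIO.digitalWrite(LED_DOWN, entropy < baseline ? GPIO.HIGH : GPIO.LOW);
  if (entropy > baseline) {
    writeReg(0x04, 0x01);   // sequencer slot 0: effect #1 (strong click)
    writeReg(0x05, 0x00);   // end of sequence
    writeReg(0x0C, 0x01);   // GO bit: fire the haptic effect
  }
}

void writeReg(int reg, int val) {
  bus.beginTransmission(0x5A);
  bus.write(reg);
  bus.write(val);
  bus.endTransmission();
}
```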
In this initial release of the guide, I'll be posting the RPi OS image with BrainHat installed for the Brainflow libraries, the Processing IDE installed, and the OpenBCI GUI installed, which can be run as a sketch in the Processing IDE. The OpenBCI GUI has a very rough, messy proof-of-concept widget called “Lempel-Ziv Complexity”.
Soon after, I'll post a quick walkthrough of how to open and run the sketch, and go into more detail on how to use the Lempel-Ziv Complexity widget I've made.
After that, I'll assemble a simple walkthrough of the hardware wiring and assembly to create the haptic feedback system and LEDs. Below is the GUI running on the RPi 4B with the Complexity widget in the bottom right corner.
Videos will be provided in the coming weeks.
The walkthrough template I'll be building on is in this Google Doc, which includes the Raspberry Pi OS image with everything installed, set up, and working.
https://docs.google.com/document/d/1l0iAWgY8DDxRx3ZaHhvfzXuDp92mNk0KAknCOzWvA8w/edit?usp=sharing
Thank you! This is exactly what I was trying to do! I made a Pi 4B 8GB gauntlet, but it was implemented with Windows 10 ARM, and I realized it can't run on that since the GPU driver was still not finished! However, we are also using Neuropype, which only supports x86. Is there any way to implement x86 software? (We are using LSL and OSC.)
My design may be helpful for you, since I made a 3D-printed gauntlet (based on a Mandalorian gauntlet) for it: https://b23.tv/Knx6BA
If you could use something like this it would be even better. My project is about using the Cyton and a Pi to control a robot.
Good job on the gauntlet design. One of the best ones I’ve seen for this. I saw a few others that inspired me to go that direction.
I'll have to read up on Neuropype. It sounds like if you are using LSL to pull the data, you can use the same BrainHat SERVER developed by Graham Briggs to read your Cyton data on the Pi, which is installed in the SD card image I posted. Neuropype appears to be written in Python according to their website, which would make it easy to access all your GPIO pins and the RPi 4B hardware in general. The features in Neuropype look very powerful.
I do have the JDSP libraries assembled in the Processing IDE in this project, but all the “canned” EEG DSP techniques in Neuropype would be hard to beat if you have their source code to run the Python on the Pi. If so, just run the BrainHat Server to publish your Cyton data to LSL, and then execute your Python in a good IDE for Raspbian to verify your Neuropype tools work.
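On the Java side (since the GUI is Java), reading that same LSL stream looks roughly like this with the liblsl Java binding (edu.ucsd.sccn.LSL). The stream type "EEG" and the channel count are assumptions matching what a Cyton publisher would typically advertise:

```java
import edu.ucsd.sccn.LSL;

public class ReceiveEEG {
    public static void main(String[] args) throws Exception {
        // Resolve an EEG stream on the network (e.g., published by BrainHat Server)
        LSL.StreamInfo[] results = LSL.resolve_stream("type", "EEG");
        LSL.StreamInlet inlet = new LSL.StreamInlet(results[0]);

        float[] sample = new float[inlet.info().channel_count()];
        while (true) {
            double timestamp = inlet.pull_sample(sample); // blocks until a sample arrives
            System.out.println(timestamp + ": ch0 = " + sample[0]);
        }
    }
}
```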
In the coming weeks or so, I aim to put together a walkthrough of how I've approached setting up BrainHat and how all the Raspberry Pi I/Os are accessible in the Java running the OpenBCI GUI code. It probably won't be much help in the Python department, unfortunately. I think the BrainHat suite is being written in C# / C++, to throw another language into the mix. I'll show how to use the JDSP plugin for some extra signal-processing options as I go along.
On the other questions, I've considered an x86 OS on the Pi and even attempted the Raspbian ARM64 OS, but to maximize the GPIO, I2C, etc. capabilities with ease, I found it least work to leverage the working Raspbian OS builds (ARM 32-bit). That means getting any library's source code compiled for Linux ARM 32-bit. Raspbian calls it armv7l (32-bit), but hopefully at some point they get a stable, full-featured release of the ARM64 OS. Essentially, I don't believe all the I/O pins work with ease in software on the ARM64 Raspbian yet.
I will say the multi-core ARM 32-bit is still fast at multi-threaded operations. It beat out my Surface Pro 3 (not too impressive a bar) at signal processing in the OpenBCI GUI.
I just hit some problems using my Pi 4B 8GB version, and it cannot boot with your image: https://b23.tv/iHKwah
Is there any way to solve it?
Looks like it's not even booting the kernel. I'll read through and see. Later today/tomorrow I'll assemble the links for a fresh Raspbian OS download and the steps to install BrainHat, the Processing IDE, and JDSP signal processing. I'll also verify/upload a new image to make sure I don't see the same errors.
Sir, I have managed to install the Processing IDE and the 5.0.4 GUI manually, and it successfully launched. However, the problem came when I tried to connect the Cyton board to the Pi. I tried Bluetooth and the Pi found it, but it said that “it doesn't have any functions”. Also, when connecting over USB to the Pi, there are no COM ports shown by auto-scanning, so would you mind sharing how you managed to find the Cyton on the Pi?
Dreambuild, yeah, the GUI will not work without the Brainflow libraries installed. You may be able to just install the BrainHat Client from the following guide by Graham B: https://openbci.com/community/brainhat-viewer/
That will install the Brainflow libraries to the Raspbian root libraries directory.
If that doesn't work, I did re-image and compress the 64GB MicroSD card I am using.
https://drive.google.com/file/d/1LzdVPZOMukohUAIHBQX6oIBWrfdRc02w/view?usp=sharing
I believe the original upload was a corrupt img file. I also added step-by-step instructions in the guide to uncompress the image, load the MicroSD card, boot up, and start the OpenBCI GUI. They can be found in the original Google Doc, which I'll continue to update over the next few days with a photo step-by-step.
https://docs.google.com/document/d/1l0iAWgY8DDxRx3ZaHhvfzXuDp92mNk0KAknCOzWvA8w/edit?usp=sharing
Sir, can we use the WiFi Shield for this system? Our dongle board is now having issues and we can currently only buy the WiFi Shield from our location 🙁
I have not tried the WiFi Shield myself, but it should be just as fast on Raspbian / the RPi board as on other OSes. You should be able to connect and get the faster sampling frequency. (With the dongle I was running at 250 Hz, but I think with WiFi you can exceed 1000 Hz.) I will be getting one eventually, since the higher sampling frequency will greatly help the consciousness complexity measurements I am doing.
I do highly recommend following the guide section “Running the OpenBCI GUI from the Processing IDE”
https://docs.openbci.com/docs/06Software/01-OpenBCISoftware/GUIDocs
in the OpenBCI documentation to replace my PDE Java files in the sketch folder with the latest ones from GitHub at this link.
https://github.com/OpenBCI/OpenBCI_GUI
The ones I have modified will run slower, as the DataProcessing.pde file has added code to calculate the Hilbert transform and binary complexity via the Lempel-Ziv compression algorithm in multiple background processor threads. I wouldn't overwrite any of the library files, however, so as not to mess with the RPi's ability to interpret them. Only the PDE Java source code files from the GitHub.
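For anyone modifying their own copy, the background-thread pattern in question looks roughly like this. Note that computeComplexity is a hypothetical stand-in for the Hilbert + Lempel-Ziv steps sketched earlier, not the widget's actual function name:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// A small worker pool keeps the ~6 s transform off the GUI's draw thread.
ExecutorService workers = Executors.newFixedThreadPool(4);
volatile float latestComplexity = 0;

void processWindow(final double[] window) {
  workers.submit(new Runnable() {
    public void run() {
      float c = computeComplexity(window); // heavy lifting happens off-thread
      latestComplexity = c;                // draw() just reads this each frame
    }
  });
}

float computeComplexity(double[] window) {
  // Stand-in for the real pipeline: Hilbert envelope -> binarize -> LZ76 count.
  return 0; // replace with brainEntropy(envelope) from the earlier sketch
}
```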
Sir, have you ever considered the Rock Pi X? I have successfully installed the GUI on x86 Windows 10 on a board of the same size. Developing BCI applications seems to be easier on x86; for example, my team's software only supports x86 platforms.
Here is how it looks.
Just got my post re-approved with an updated video walkthrough of the Java mods and widget. Can finally respond.
The Rock Pi X does look good. I like your setup. It would be useful to have all the Windows-based signal tools and I/O available. I just worry about the Windows OS wasting RAM and CPU cycles, albeit I'm no great programmer to begin with. I'd need help making any custom GUI or things like that.
I’m sure single board x86 systems will become more popular soon.