I am excited to share with the community the highlights of my two-month internship here at OpenBCI! I've had the opportunity to be involved with a variety of things around the office, but have spent much of my time putting together some cool demos and projects using the in-house OpenBCI gear. Some of my favorites included a brain-controlled Tic Tac Toe game using the MindAffect headset, hacking a helicopter to make it fly using concentration, and creating a brain-controlled Tetris game from scratch.
I learned a lot through these projects, including the specifics of different data networking options, working with EEG data using Python and BrainFlow, and the general trial-and-error process that comes with designing and engineering a project from start to finish.
Tic Tac Toe Project
In the Tic Tac Toe project, I put together a pipeline that collected and processed steady-state visually evoked potential (SSVEP) data and used the neural responses to determine where the user wanted to place their next move. I implemented this using the MindAffect headset, the OpenBCI Ganglion amplifier, and the MindAffect documentation code. The game started with a calibration period in which the user looked at lights flashing at a different frequency on each of the 9 squares of a tic tac toe board. This calibration data was used to determine the specific neural patterns that appeared when the user looked at each square. Then, when the user wanted to place an X, they simply looked at the target square; the model matched the incoming neural patterns against those from the calibration period to determine which square the user was looking at and where they wanted to place their next move.
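The frequency-matching idea behind this pipeline can be sketched in plain Python. This is a simplified stand-in for MindAffect's actual decoder, not its real algorithm: it scores one EEG channel against sine/cosine references at each square's flicker frequency and picks the best match. All function names here are illustrative, and the 200 Hz sampling rate is the Ganglion's stream rate.

```python
import math

def ssvep_score(signal, freq, fs):
    """Score one EEG channel against sine/cosine references at `freq`.

    Taking the magnitude of the projection onto a sine/cosine pair makes
    the score insensitive to the phase of the neural response.
    """
    n = len(signal)
    sin_dot = sum(s * math.sin(2 * math.pi * freq * i / fs)
                  for i, s in enumerate(signal))
    cos_dot = sum(s * math.cos(2 * math.pi * freq * i / fs)
                  for i, s in enumerate(signal))
    return math.hypot(sin_dot, cos_dot) / n

def classify_square(signal, flicker_freqs, fs=200):
    """Return the index of the flicker frequency that best matches the EEG."""
    scores = [ssvep_score(signal, f, fs) for f in flicker_freqs]
    return scores.index(max(scores))
```

In the real system each of the 9 squares flickers at its own frequency, so the index returned by `classify_square` maps directly to a board position.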
This project was a great introduction to hands-on work with brain data in Python, actually driving a program from the user's inputs. In the future, I hope to keep improving the accuracy of determining where the user wants to place their next move by better understanding and dissecting the model underlying the calibration and SSVEP paradigm.
Concentration-Controlled Helicopter Project
The next project I worked on was the concentration-controlled helicopter. This project was complicated, but also an informative and exciting experience. From the get-go, I had complete creative control over nearly every aspect of the engineering process. The goal was to fly a regular toy helicopter using the UltraCortex and the Concentration Widget in the OpenBCI GUI.
Before deciding exactly how to approach the project, I tested the Concentration Widget by connecting it to a breadboard, an Arduino, and an LED, making the LED fade brighter and dimmer with rising and falling concentration. This let me get familiar with the GUI's data output format and the general pipeline through the Arduino.
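The mapping from the GUI's concentration metric to LED brightness can be sketched as a small helper on the Python side; the Arduino would then receive the resulting byte over serial and write it to a PWM pin. The 0–1 metric range is what the Concentration Widget outputs, but the `lo`/`hi` cutoffs below are assumptions for illustration.

```python
def concentration_to_pwm(level, lo=0.3, hi=0.8):
    """Map a 0-1 concentration metric to an 8-bit PWM duty cycle.

    Below `lo` the LED is off, above `hi` it is fully lit, and in
    between brightness scales linearly. The cutoffs are illustrative.
    """
    if level <= lo:
        return 0
    if level >= hi:
        return 255
    return round((level - lo) / (hi - lo) * 255)
```

The dead zone below `lo` keeps the LED from flickering on noise when the user is not trying to concentrate at all.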
From here, I had an important decision to make on how to approach the rest of the design. After some brainstorming, I realized I could approach the project one of two ways: reverse engineer the IR LED mechanism of the remote controller, or hack directly into the remote control itself and send the appropriate voltages to the right pins on its circuit board from the Arduino. Both approaches had their benefits and drawbacks: for the first, the time it would take to determine the right IR LED patterns to send to the helicopter's receiver; for the second, the potential difficulty of understanding the industrial-grade circuit board driving the controller's different mechanisms.
Ultimately, I decided to hack directly into the remote controller. The first step was to take the controller apart and determine what each pin of the circuit board connected to. I found a potentiometer attached to the vertical joystick that sent a different voltage depending on the joystick's position. My approach was to remove this potentiometer and use the Arduino's digital output pins to send voltages in similar ranges, mimicking the potentiometer's behavior. Problems I ran into along the way included dealing with the Arduino's PWM digital output, working in the highly sensitive voltage range the controller expected, and handling varying battery source levels.
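The potentiometer-mimicking idea rests on a simple relationship: the Arduino's "analog" output is really a PWM square wave, and smoothing it with an RC low-pass filter yields an average voltage of duty/255 × Vcc. A hypothetical helper for choosing the duty cycle that targets a given voltage (the 5 V supply is the Arduino's logic level; the controller's actual operating range is not reproduced here):

```python
def duty_for_voltage(target_v, vcc=5.0):
    """8-bit duty cycle whose RC-filtered average equals `target_v`.

    A PWM wave spends duty/255 of each period at Vcc and the rest at
    0 V, so its low-pass-filtered average is duty/255 * Vcc. The result
    is clamped to the valid 0-255 range.
    """
    duty = round(target_v / vcc * 255)
    return max(0, min(255, duty))
```

On the Arduino side, the equivalent call would be `analogWrite(pin, duty)` followed by an external RC filter to produce the steady voltage the controller's circuit expects.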
From this project, I learned a lot about the importance of documenting my steps, and I tied together much of my electronics and coding knowledge to understand and reverse engineer the controller's components. More generally, I learned valuable lessons about the ups and downs of the engineering process, and had some great "a-ha" moments as the different pieces of the puzzle came together one by one.
Ultimately, I completed the hack successfully: the helicopter flies up when the user is concentrating (with an upper threshold so it doesn't hit the ceiling!) and comes down when the user loses concentration. I also had the opportunity to work with Ann Makosinski to create a cool video demo of the final project, which will be out soon!
EMG-Controlled Tetris Project
My final project was an EMG-controlled Tetris game using the OpenBCI Cyton and the UltraCortex headset. During my last two weeks, I was excited to take a project all the way from ideation to final product. Tetris was a great de-stressing game for me in high school, so I thought it would be fun to use the Cyton's different capabilities to play it completely hands-free.
In terms of controls, I used the Cyton's accelerometer to detect tilting movements to the left, right, and down, which move the piece left, move it right, and drop it, respectively. Using the pyAutoGui library, I mapped each of these gestures to the corresponding keyboard press. I was excited to create a demo that used the accelerometer, since many past projects and tutorials did not highlight this feature. To rotate a piece, I used eye blinks, which we can detect with the UltraCortex's flat forehead electrodes.
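The tilt-to-move step can be sketched as a threshold check on the accelerometer axes. The axis orientation, the 0.4 g threshold, and the key names below are assumptions for illustration, since the real mapping depends on how the board sits in the headset.

```python
def classify_tilt(ax, ay, threshold=0.4):
    """Classify Cyton accelerometer readings (in g) into a Tetris move.

    Returns "left", "right", "down", or None when the head is roughly
    level. Axes and threshold are illustrative assumptions.
    """
    if ax < -threshold:
        return "left"
    if ax > threshold:
        return "right"
    if ay < -threshold:
        return "down"
    return None

# In the game loop, each label would be forwarded to the game as a key
# press, e.g. with pyautogui:
#   key = {"left": "left", "right": "right", "down": "space"}[move]
#   pyautogui.press(key)
```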
I really enjoyed learning the whole pipeline, from UDP output to processing the data, and coding it all on my own. Because it was a multi-step process, I learned how important it was to keep track of what the data looked like at each point along the way: in the GUI, over UDP, and in the final movement processing. I also learned a lot about different industry-standard networking protocols, like LSL and OSC, before finally settling on UDP. I learned how to create sockets and listeners, and how that entire framework worked in conjunction with the data output from the GUI. From there, I wrote the code to detect three-directional movement and blinks, then mapped each control to the appropriate keyboard key needed to play Tetris (up, left, right, space bar, etc.).
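The socket-and-listener setup can be sketched with Python's standard library. The port choice and the JSON field names below are illustrative, not the OpenBCI GUI's exact output schema.

```python
import json
import socket

def open_listener(host="127.0.0.1", port=0):
    """Bind a UDP socket like the one the game loop reads GUI data from.

    Port 0 lets the OS pick a free port for this demo; the GUI's UDP
    output would instead be pointed at a fixed port of your choosing.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    return sock

def parse_packet(datagram):
    """Decode one datagram as JSON (field names are illustrative)."""
    return json.loads(datagram.decode("utf-8"))
```

A typical receive loop would then call `sock.recvfrom(...)` repeatedly, parse each packet, and hand the values to the movement and blink detectors.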
In the future, a cool feature would be to also incorporate the Concentration Widget to change the speed of the game depending on how focused the user is, further intensifying the flow state commonly associated with Tetris.
Towards the end of my Tetris project, I also got to dive a little deeper into the different algorithms that exist to detect blinks. I began some preliminary processing work to detect blinks in BrainFlow following papers like this, and the research behind blink detection became a continued interest of mine after this project. It was interesting that something so visually distinct in the first two channels of the GUI's time series had so much research behind detecting it. While I did get to play around with some preliminary methods of detecting these peaks, sliding a sample window and flagging a jump over a mean threshold, I am interested in continuing to learn about different blink detection algorithms and implementing them to see what they reveal about the way I was processing the data.
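The window-and-threshold method described above can be sketched as follows; the window length and the three-standard-deviation cutoff are illustrative choices, not a tuned configuration.

```python
def detect_blinks(samples, window=50, k=3.0):
    """Flag indices where a sample jumps well above the recent baseline.

    For each sample, compute the mean and standard deviation of the
    preceding `window` samples and flag the sample if it exceeds
    mean + k * std. Window length and `k` are illustrative.
    """
    blinks = []
    for i in range(window, len(samples)):
        ref = samples[i - window:i]
        mean = sum(ref) / window
        var = sum((x - mean) ** 2 for x in ref) / window
        std = var ** 0.5
        if std > 0 and samples[i] > mean + k * std:
            blinks.append(i)
    return blinks
```

One known weakness of this scheme is that a detected blink inflates the baseline statistics for the next `window` samples, which is part of why the literature goes well beyond simple thresholding.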
Beyond these specific projects, I also got a chance to really understand the rapidly growing world of neurotechnology, surrounded by innovators pushing the boundaries of what could be done on a daily basis. Whether it was learning about the intricacies of Galea or the nuances of the firmware development process, I was exposed to so much of the behind-the-scenes of neurotech innovation. It was such an inspiring environment to be in, and a unique incubator for fueling my continued curiosity about the future of neurotechnology.
I am forever grateful for such a transformative summer at OpenBCI!