This past weekend, OpenBCI sponsored & participated in MIT Reality Hack 2025. MIT Reality Hack is an annual event at MIT that brings together thought leaders, engineers, designers, and entrepreneurs to create innovative prototypes in experiential technology. Participants from diverse backgrounds collaborate to develop novel software and hardware solutions for XR devices and platforms.
Joe, Zoe, Ranya, and Isaac drove up from NYC with a car full of neurotechnology early Thursday morning and spent the weekend working closely with the hackers. Since we only had a limited amount of gear, we had hackers apply as a team to use Galea and other OpenBCI products. The amount of interest was overwhelming, especially for Galea. We ended up contributing two Galea systems instead of just one, along with several Cytons, electrode caps, and other dev kits. The next two and a half days were a whirlwind of creativity, technical acrobatics, setbacks and breakthroughs. We were amazed at what some of the teams were able to accomplish, with several teams tackling entirely new technical challenges, such as how to stream data from the OpenBCI GUI straight to a standalone VR headset.
Ranya Belmaachi presented at the opening ceremony and Joe Artuso gave a talk at the EXPERIENTIAL conference on how the convergence of AI, BCI, and Spatial Computing will form the basis for the next generation of computers.
Our favorite part of the event was getting to go around on Sunday and see the final projects. We also loved the fact that so much code was added to GitHub for others to use as a starting point. Three teams that used OpenBCI also took home hackathon gold! Without further ado, here’s the roundup of all the projects that used Galea, OpenBCI and EmotiBit.
Neuroveil: The Twin Mind Interface
🏅WINNER: Best Use of Qualcomm Technologies with AI
Neuroveil synchronizes brainwaves between participants, allowing them to converge to a shared brain state and control that brain state to elicit certain effects. This team used custom glasses that provided optical stimulation to augment the EEG activity of a pair of users. As the users’ EEG power bands became more aligned, several forms of physical and digital visualization would demonstrate the level of convergence in their brain states. This was definitely one of the more ambitious projects, and their demo presentation had an excellent mix of artistic flair and technical razzle dazzle.
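We don’t know the exact convergence metric the team used, but one simple way to quantify how aligned two users’ EEG power bands are is cosine similarity between their band-power vectors. A minimal sketch (the band values below are made-up illustrative numbers, not Neuroveil’s data):

```python
import math

def convergence(powers_a, powers_b):
    """Cosine similarity between two band-power vectors (1.0 = identical shape)."""
    dot = sum(a * b for a, b in zip(powers_a, powers_b))
    norm_a = math.sqrt(sum(a * a for a in powers_a))
    norm_b = math.sqrt(sum(b * b for b in powers_b))
    return dot / (norm_a * norm_b)

# Relative power in the delta, theta, alpha, beta, gamma bands (hypothetical)
user_a = [0.30, 0.20, 0.25, 0.15, 0.10]
user_b = [0.28, 0.22, 0.24, 0.16, 0.10]
score = convergence(user_a, user_b)  # approaches 1.0 as the states align
```

A score like this could then drive the physical and digital visualizations as the two users drift toward (or away from) a shared state.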

The Team: Adam Sauer, Max Turnquist, Page Patterson, Brian Chung, Rui Ma
Project Page: https://devpost.com/software/neuroveil-the-twin-mind-interface
OPTL
🏅WINNER: Best Use of OpenBCI
The OPTL team was inspired by a critical question: “How can we enhance human performance in high-stakes environments while leveraging cutting-edge technology?” This was one of the two teams that used Galea and we were impressed with how easily they were able to get Galea integrated with Cognitive3D, Unity, and Lambda Cloud. Their end result was a convincing picture for how biosensing data can be used to track training performance and enhance outcomes. Certainly one of the more “business ready” projects that we mentored during the hack!

Team: Elan Grossman, Cyril Medabalimi, CJ Connett, Jordan Clark, Peter Zhang
Project Page: https://devpost.com/software/optl
Neuroscent
🏅WINNER: Hardware Hack Smart Sensing
The Neuroscent team combined VR, aroma, and biosensing into a single experience. When this team first proposed their idea, we thought it was waaaay too ambitious to hack the smell-generating hardware together in the limited time allotted. They got it all working by repurposing parts from two cheap diffusers bought at CVS! An impressive hack that came together in the final hour for a unique demo experience. To promote users’ mental well-being through multimodal human sensing, they created an XR biofeedback system that incorporates olfaction (scent) alongside the Varjo head-mounted display (vision) and the OpenBCI Galea.
Team: Ashley Neall, Kriti Gupta, Peter He, Grace Jin, Ximing Luo
Project Page: https://devpost.com/software/neuroscent
CoFeel
“By translating the unique experience of pregnancy into a shared visual and tactile interaction, CoFeel fosters deeper emotional connection, transforming pregnancy into a journey experienced together rather than in isolation.”
This was another extremely ambitious and very cool hack. The team was in the running for Best In Show and frankly we think they deserved all the awards. They used EMG to monitor a pregnant individual’s abdomen to track fetal movements and EEG sensors to monitor neural signatures of emotion, then they used those signals to power unique visualizations in a connected VR headset. The idea was that a father, or other observer, could experience what was happening inside the mother’s body in an entirely new way.

In addition to visualizing fetal kicks and emotional states in VR, users can interact with a virtual representation of the baby. These interactions are conveyed back to the pregnant individual via an LED vest, creating a bidirectional experience. It was easy to see how the biosensing data was powering changes in the VR environment, and we were amazed that the team was able to get the OpenBCI data streaming into the Oculus Quest.
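Getting biosensing data into a standalone headset generally means shipping samples over the network to the app running on the device. We don’t know CoFeel’s actual transport, but a minimal sketch of the general pattern (JSON over UDP, demonstrated here on loopback; in practice the receiver would run on the headset’s IP) might look like:

```python
import json
import socket

def send_sample(sock, addr, channels):
    """Serialize one multi-channel sample as JSON and send it over UDP."""
    sock.sendto(json.dumps({"eeg": channels}).encode(), addr)

def recv_sample(sock):
    """Receive and decode one sample (what the headset side would do)."""
    data, _ = sock.recvfrom(4096)
    return json.loads(data.decode())["eeg"]

# Loopback demo: sender and receiver on the same machine.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))          # OS picks a free port
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_sample(tx, rx.getsockname(), [12.5, -3.1, 7.8])
received = recv_sample(rx)         # [12.5, -3.1, 7.8]
tx.close()
rx.close()
```

The Unity side would then poll its socket each frame and map the latest values onto the visualizations.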
Team: Daniel He, Daniel Kaijzer, Hongxuan Wu, Jiaqui Wang, McKena Geiger
Project Page: https://devpost.com/software/cofeel
NudgePet
NudgePet used the EmotiBit to power a mixed reality solution that brings the benefits of emotional support animals to anyone with a mixed reality device.
NudgePet is a mixed reality widget that runs in the background of spatial activities, providing users with a virtual emotional support animal. Using EmotiBit’s Electrodermal Activity (EDA) sensors, NudgePet monitors the user’s stress levels and emotional state. When it detects heightened stress or negative emotions, the virtual pet – in our prototype, a dog named Snuffles – appears to provide comfort and guide the user through emotional regulation exercises.
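One common way to detect the kind of stress response NudgePet reacts to is to compare incoming EDA (skin conductance) samples against a rolling baseline and trigger when conductance rises sharply. A minimal sketch of that idea, with made-up window sizes and thresholds (we don’t know the team’s actual detection logic):

```python
from collections import deque

class StressDetector:
    def __init__(self, baseline_len=50, rise_threshold=1.5):
        self.baseline = deque(maxlen=baseline_len)  # recent EDA history
        self.rise_threshold = rise_threshold        # ratio that counts as "stressed"

    def update(self, eda_sample):
        """Feed one EDA sample (microsiemens); return True when the pet should appear."""
        if len(self.baseline) < self.baseline.maxlen:
            self.baseline.append(eda_sample)
            return False                            # still learning the baseline
        mean = sum(self.baseline) / len(self.baseline)
        self.baseline.append(eda_sample)
        return eda_sample > mean * self.rise_threshold

detector = StressDetector()
for sample in [2.0] * 50:   # flat, calm baseline
    detector.update(sample)
triggered = detector.update(4.0)  # sudden rise in conductance -> True
```

In the real widget, a `True` result would cue Snuffles to appear and start an emotional regulation exercise.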

Very cool use of EmotiBit and definitely one of the best team mascots at the hack!
Team: Steven Le, Audrey Lane, Emmanuel Angel Corona, Ethan Johnson, Jessica Sheng
Project Page: https://devpost.com/software/nudgepet
True Form
The True Form team used the OpenBCI Cyton to monitor a user’s EMG signals and incorporate them into health and fitness training applications. Their biofeedback smart sleeve used a combination of tech: 1) Computer vision feeds the system data about joint position. 2) EMG (electromyography) measures electrical activity in muscles. 3) Haptic feedback guides the wearer to adjust their position into the ‘trueform’ position.
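For step 2 above, a standard first pass on raw EMG is to rectify the signal and smooth it into an activation envelope that the haptic layer can compare against a target level. A minimal sketch, with a made-up window size and toy sample values (not the team’s actual processing):

```python
def emg_envelope(samples, window=4):
    """Full-wave rectify an EMG trace, then smooth with a moving average."""
    rectified = [abs(s) for s in samples]
    env = []
    for i in range(len(rectified)):
        start = max(0, i - window + 1)
        chunk = rectified[start:i + 1]
        env.append(sum(chunk) / len(chunk))
    return env

raw = [0.1, -0.9, 0.8, -0.7, 0.2, -0.1]   # toy EMG burst, then relaxation
env = emg_envelope(raw)
fire_haptics = [e > 0.5 for e in env]      # where a haptic cue might trigger
```

The envelope, combined with the joint positions from computer vision, is what would let the sleeve decide when and where to nudge the wearer.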
The NASA-inspired mesh materials that they prototyped were extremely flexible and form-fitting, while providing a lot of real estate for sensors. Lots of cool ideas in this one, and a great project video!

Team: Atlas Talisman, Jordan Louie, Seyram Gbeblewou, Aleksander Talamantez, Albert Hodo
Project Page: https://devpost.com/software/trueform
Panic Button: Real-Time Immersive Panic Management
Panic Button is an immersive therapy tool designed to detect, monitor, and respond to stress and panic in real time. Its key feature is real-time monitoring: it tracks brainwave activity (EEG), heart rate (ECG), and body discomfort using advanced sensor technology and an AI-enabled camera.
The Panic Button project was inspired by a growing need for tools that can provide immediate, evidence-based relief during moments of acute stress or panic attacks. Traditional therapeutic approaches often lack the ability to adapt dynamically to a user’s state in real time, leaving a critical gap in moments of urgency.

Team: Salman Chishti, Ritu Bhattacharya, Steven Christian, Curiosibot (mysterious!)
Project Page: https://devpost.com/software/panic-button-1it7wr
Conclusion & Lasting Impressions
The hackathon was a great stress test of OpenBCI’s products and documentation. We learned a ton about our getting started experience for new users and are already implementing a bunch of improvements to our docs as a result. We’re excited to bring Galea and OpenBCI gear to more events like this!