More people than ever are riding their bikes as a way to stay fit, avoid crowded public transit, and help the environment. The dangers of cycling, especially in urban environments, are well known, but what if your bike could provide a safer ride by monitoring changes in your field of view while cycling?
Researchers from Monash University’s Exertion Games Lab, IBM Research – Australia, and the University of Southampton’s Wellthlab have developed a new brain-connected eBike that regulates engine support based on the rider’s EEG activity.
“Ena” is a novel EEG-eBike system that reads the user’s neural activity to determine when the user is in a state of peripheral awareness, and regulates engine support accordingly.
Ena uses the OpenBCI Cyton and an EEG cap to detect activity in the rider’s occipital lobe, which governs visual processing. The system is connected to a computer in the rider’s backpack, which processes the EEG data and determines when the rider is in a peripherally aware state.
When the user is aware of a larger field of view, the eBike’s engine support is engaged. When the rider’s field of view narrows onto an incoming obstacle, the engine support is cut off. This increases rider safety by giving the rider more time to react.
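The control rule described above can be sketched as a single mapping from the awareness flag to the motor’s assist level. This is an illustrative sketch only; the function and parameter names are hypothetical and not from the actual Ena codebase:

```python
def engine_support(peripherally_aware: bool, assist_level: float = 1.0) -> float:
    """Hypothetical sketch of Ena's control rule: full engine assist while
    the rider is peripherally aware, zero assist the moment their field
    of view narrows onto an obstacle."""
    return assist_level if peripherally_aware else 0.0
```

In the real system, this flag updates continuously from the EEG pipeline, so assist drops as soon as the rider’s focus narrows.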
We sat down with lead researcher Josh Andres to discuss his project, inspirations, and future plans.
Q: Where did the idea for this project come from?
JA: I have been exploring the design space of human-machine integration by using electric bikes (eBikes) for several years. eBikes present an excellent lens for human-machine integration because the user is investing physical effort while navigating the environment, and the eBike can be easily modified to extend the abilities of the rider. For example, in our first experiment, I used a smartphone’s gyroscope and accelerometer to determine the bodily posture of the rider and used the act of leaning forward as a way to increase engine support. This afforded the rider the sensation of having super-strength. In the second experiment, I connected the eBike to traffic light changing patterns and used an accelerometer and GPS on the eBike to determine the speed and location of the eBike in relation to the traffic light changing patterns. This bike could either increase engine support when the rider was going to miss the next light on green, or it could whisper in the rider’s ear to ‘slow down a little’ if they were going too fast. This afforded the rider the experience of working together with the eBike to get traffic lights on green.
At this point in my research, I had explored data generated on the user’s body (leaning forward) and data from around the user’s body (traffic light status). So in this next experiment, I wanted to focus on data from inside the user’s body to explore human-machine integration. Working together across the three institutions, Monash University Exertion Games Lab, IBM Research – Australia, and the University of Southampton Wellthlab, we explored what data from inside the body would make sense to support the experience of cycling, and how we could use this data to craft an integration experience. These questions led us to the idea of investigating peripheral vision, as it is paramount to being aware of one’s surroundings and therefore to navigating and responding to changes in the environment. Inspired by work from sports science and neurology, which had been studying the corresponding neurological state for peripheral awareness [1, 2], we borrowed their teachings to create our system, which offers a first-of-its-kind approach that uses peripheral awareness as a neurological state for human-computer integration.
Q: Can you explain the connection between EEG activity and peripheral vision? How were you able to tell when a rider was “peripherally aware”?
JA: Previous work in sports science and neurology had already identified the correlation between peripheral awareness and EEG amplitudes in the high alpha band (10–12 Hz). The target values for determining peripheral awareness in our work were established by taking the mean voltage values exhibited by individuals in a state of peripheral awareness in previous studies [1, 2] and creating a range of two standard deviations from the mean. To connect the participants’ neural electrophysiological signal with our prototype, we used an EEG system composed of the OpenBCI Cyton and an Ag/AgCl-coated electrode cap.
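The mean-plus-two-standard-deviations procedure can be sketched as follows. The sample amplitudes below are made up for illustration; the actual values come from the cited studies [1, 2]:

```python
import statistics

def target_range(amplitudes):
    """Build a target voltage range spanning two standard deviations either
    side of the mean amplitude reported for peripherally aware individuals."""
    mu = statistics.mean(amplitudes)
    sd = statistics.stdev(amplitudes)
    return (mu - 2 * sd, mu + 2 * sd)
```

Applied to the published high-alpha amplitudes, this kind of range yields thresholds like the 0.76–1.19 μV window used in the study.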
To assess the participants’ engagement in peripheral awareness, the calculations were performed in real time while the participant was riding our prototype. When a participant’s values fall between 0.76 μV and 1.19 μV within the high alpha range of 10–12 Hz, and between 0 μV and 0.7 μV within the beta range of 12–13 Hz, our software infers that the participant is in a peripherally aware state. Values falling outside these parameters indicate that the participant is not peripherally aware. Beta is used as a reference against alpha to ensure that signals reaching the desired alpha pattern were not a product of noise across all bandwidths. This was further complemented by a mean smoothing filter to mitigate movement artefacts. Lastly, the values were used to calculate an output Boolean: “true” when participants were peripherally aware and “false” when they were not.
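A minimal sketch of this classification step, assuming the band amplitudes (in μV) have already been extracted from the EEG stream; the function names are illustrative, not the project’s actual code:

```python
# Thresholds from the study: high alpha (10-12 Hz) must sit within
# 0.76-1.19 uV, and beta (12-13 Hz) within 0-0.7 uV, for the rider
# to be classified as peripherally aware.
ALPHA_RANGE = (0.76, 1.19)
BETA_RANGE = (0.0, 0.7)

def smooth(history, window=5):
    """Mean smoothing filter over the most recent samples, used to
    mitigate movement artefacts."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def is_peripherally_aware(alpha_history, beta_history, window=5):
    """Return True when smoothed alpha and beta amplitudes both fall
    inside their target ranges; beta acts as a noise check on alpha."""
    alpha = smooth(alpha_history, window)
    beta = smooth(beta_history, window)
    return (ALPHA_RANGE[0] <= alpha <= ALPHA_RANGE[1]
            and BETA_RANGE[0] <= beta <= BETA_RANGE[1])
```

The Boolean output of this check is what switches the engine support on or off.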
Q: Your study involved 20 riders actually using the Ena Bike. What did you find most interesting about their reactions & interviews?
JA: From the study, we derived various themes and tactics for designing integrated exertion experiences using peripheral awareness as a neurological state. One of the highlights is that changes in a user’s field of view relating to peripheral awareness can be read in real time to gain access to the user’s pre-attentive state. This means that changes in neurological activity can be caused by instinctive reactions, such as to a car passing by or a sudden obstacle on the road. These instinctive reactions cause our field of view to narrow to focus on the obstacle, and our system can pick up this change in neurological activity in real time and act on the situation — in our case, stopping the extra engine support faster than a user could on their own. This allows the rider more time to respond to the situation, something the riders noticed and commented on in interviews:
“There’s a minor moment of panic where you realize, ‘Hey, I need to quickly find a way to avoid this incoming thing’, that is when the bike slows down and it gives you time to think”
The other main takeaway is that the system is connected directly to the user’s brain, so the user does not need to “think” about how to raise the attention of the system through a command, as is the case with many devices. In our case, Ena allows data to flow between the user’s visual perception of the environment and the control of the engine, offering an experience in which the user and the system work as one.
Q: How did EGL first learn about, or start using OpenBCI tools?
JA: We were looking for a BCI platform that we could modify to design our prototype while being able to get research-quality electrophysiological recordings.
After reviewing a few different systems, OpenBCI stood out as being easy to play with, and it also offered a GUI for studying EEG recordings. To inform the design of our prototype and how to connect the different pieces, we used your documentation.
Q: What are you working on next?
JA: There are several scenarios where technology like the Ena Bike could be beneficial, from increasing safety and response time for emergency personnel, to monitoring a patient’s peripheral vision to learn about a condition, right through to being used in sports to help soccer players develop their peripheral vision.
We have various ideas around health applications through a human-computer integration approach that we are beginning to define. We are also interested in how people gain “perception” and how we can use different sensing inputs of the human body to help users explore data through those other sensory modalities.
To learn more about Ena:
Also check out another recently featured EGL project from Ena co-author Nathan Semertzidis: Neo-Noumena: Seeing Emotions with AR & OpenBCI