Controlling a stock electric wheelchair

DaveD Phoenix, AZ

I have a son with Duchenne muscular dystrophy, and he has only limited finger movement left. He uses an electric wheelchair with an Omni display and an R-net controller. I have received the specifications for interfacing with the display like a joystick. My thought is to use a Raspberry Pi, or an Arduino, for output to the Omni and to use an OpenBCI board as the input. Because of the headrest, I do not think the 3D-printed sensor headgear will work for him. I think I might be able to put the sensors on some sort of shower-type cap so he can sit in his chair, with the headrest, comfortably.

If the input can work, I believe the output can emulate a sip-and-puff control. This would let my son run the chair with no more control issues than a person using sip-and-puff on a wheelchair. Does this sound like something that is doable?

Thank You,

David Demland

Comments

  • wjcroft Mount Shasta, CA

    David, hi.

    You mention:

    ... I have received the specifications for interfacing with the display like a joystick. My thought is to use a Raspberry Pi, or Arduino, for output to the Omni and use an OpenBCI as input...

    In general, emulating a joystick or cursor control with your own EEG / BCI code is not easy to do. It has been done in the past, but with varying degrees of success. A better approach would be to use this exceptional and free open-source project called MindAffect. It can work with a Raspberry Pi and a separate small display (probably not your Omni display, because there are specific timing constraints).

    https://mindaffect-bci.readthedocs.io/en/latest/
    https://github.com/mindaffect/pymindaffectBCI
    https://www.mindaffect.nl/

    Regarding the headband they use: it works really well and is perfectly adapted to the MindAffect BCI system. It is based on 'saline' or water-soaked pads and a 3D-printed band. There are some 'dry' sensor headbands on the Shop page, but the MindAffect band works better because of the decreased skin resistance (impedance) of the wet connection.

    Regards, William

  • wjcroft Mount Shasta, CA

    Here are some other MindAffect demo videos:

    https://www.google.com/search?q=mindaffect+youtube

  • DaveD Phoenix, AZ

    Thank you for your help. I just received the technical specifications of the Omni display, and this is what I have found out. The interface to the Omni display is a 9-pin DIN connector. Using a digital mode, I just need to output a voltage on one of four pins. Each pin triggers one of the following: forward, reverse, left, or right. I read about a project at the University of Minnesota where they were controlling drones just by thinking of making a fist (https://newatlas.com/university-minnesota-mind-control-uav/27798/). If I wanted to do just the four operations, is there something already created that could handle the BCI and output commands to a Raspberry Pi or Arduino? I am sure it would not be hard to have the Raspberry Pi or Arduino output the voltage needed on each of the four pins of the display.
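
    For the Raspberry Pi side, I imagine something like the minimal sketch below. The pin numbers are just placeholders, and the real voltage levels, pulse timing, and any buffering or level shifting would have to come from the Omni specification rather than from this code.

        # Minimal illustration: pulse one of four GPIO lines standing in for the
        # Omni's digital-mode forward/reverse/left/right inputs.
        # Pin numbers and timing are placeholders, not values from the Omni spec.
        import time
        import RPi.GPIO as GPIO

        DIRECTION_PINS = {
            "forward": 17,   # hypothetical BCM pin assignments
            "reverse": 27,
            "left": 22,
            "right": 23,
        }

        GPIO.setmode(GPIO.BCM)
        for pin in DIRECTION_PINS.values():
            GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)

        def drive(direction, seconds=0.5):
            """Assert one direction line for a short burst, then release it."""
            pin = DIRECTION_PINS[direction]
            GPIO.output(pin, GPIO.HIGH)
            time.sleep(seconds)
            GPIO.output(pin, GPIO.LOW)

        try:
            drive("forward")   # example: nudge the chair forward briefly
        finally:
            GPIO.cleanup()     # release the pins on exit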

  • wjcroft Mount Shasta, CA

    Dave, hi.

    Did you read my previous posts? Your EASIEST option is MindAffect. This BCI system outputs time-coded images to a display: each 'button' on the display is essentially flashing at a different rate. When the operator places attention on that button, MindAffect detects it and can then take whatever action you choose. The name for this type of BCI is 'VEP', visual evoked potentials; specifically cVEP, code-based VEP. Your display panel of 'buttons' can contain any choices you desire, for example left, right, forward, reverse, stop, speed select, etc. You can also show a full typewriter keyboard and allow for typing text. Since MindAffect can run on the Raspberry Pi, you have logic-level output pins that could drive your Omni motor control.
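
    As a rough illustration of that last point: mapping a selected button label to one of your four Omni lines could look like the sketch below. How you actually receive selections from pymindaffectBCI is an assumption here (check their docs and examples); the 'on_selection' hook and the pin numbers are placeholders, not their API or your real wiring.

        # Sketch: map a MindAffect button selection (by label) to one of four
        # GPIO lines standing in for the Omni's direction inputs.
        # Pin numbers are placeholders; the selection hook is hypothetical.
        import RPi.GPIO as GPIO

        DIRECTION_PINS = {"forward": 17, "reverse": 27, "left": 22, "right": 23}

        GPIO.setmode(GPIO.BCM)
        for pin in DIRECTION_PINS.values():
            GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)

        def on_selection(label):
            """Called with the label of whichever flashing button was selected."""
            # Release every line first so only one direction is ever asserted.
            for pin in DIRECTION_PINS.values():
                GPIO.output(pin, GPIO.LOW)
            if label in DIRECTION_PINS:
                GPIO.output(DIRECTION_PINS[label], GPIO.HIGH)
            # Any other label (e.g. a 'stop' button) leaves all lines low.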

    The UM system you mention uses what is called MI, 'motor imagery'. Typically it can detect only two or three actions: imagined movement of the left or right hand (or left or right leg), plus combined left+right. Thus control strategies are much more limited. If you watched the video, you can see how fast MindAffect reacts to visual attention; MI systems are generally slower and potentially more error prone. The UM link you gave uses a 64-channel EEG system, which is prohibitively expensive and typically only used by research labs; the cost is likely over $10K.

    MindAffect can use the 4-channel Ganglion board (connected to the Raspberry Pi), so it needs much less in the way of EEG hardware yet offers much more flexibility. Surely your son would want to text or email with his friends as well as move the wheelchair; MI would not be able to provide a typewriter input keyboard. With MindAffect you have full flexibility to construct whatever menu systems / keyboards you want.

    With the Raspberry Pi you will want a separate small HDMI display mounted, possibly as a secondary display attached to the one already in place. This is where the flashing button menus are shown.

    Regards, William

  • DaveD Phoenix, AZ

    Thank you, and I did look at the video. The problem with this solution is that with a screen in front of him he cannot see where he is going. This would be a real problem when crossing the street or moving around a restaurant. Currently he is using eye-gaze technology by EyeTech Digital Systems to run his computer, so he can already e-mail and surf the net. I just want him to be able to "get out", and that is the reason I am trying to find a way for him to drive his chair in public settings without an obstructed view.

  • wjcroft Mount Shasta, CA

    Ah, sorry, I spaced that aspect out. As previously mentioned, the Motor Imagery outputs (left, right, left+right) might map to left, right, forward. But doesn't his current setup also allow for speed adjustment, possibly reverse, etc.? You may be able to set up some kind of 'hybrid' system by having speed, reverse, etc. controlled by another channel: sip/puff, eye tracking, or even the MindAffect.
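
    To make the 'hybrid' idea concrete, here is a tiny sketch of just the mapping logic. The MI classifier and the secondary input (sip/puff switch, eye tracker, etc.) are placeholders; the point is only how few MI classes there are to work with and how a second channel could add reverse.

        # Sketch of the hybrid mapping: MI classes give left/right/forward,
        # while a secondary switch (sip/puff, eye gaze, ...) toggles reverse.
        MI_TO_COMMAND = {
            "left_hand": "left",
            "right_hand": "right",
            "both_hands": "forward",
        }

        def decide(mi_class, reverse_switch=False):
            """Combine an MI class label with a secondary 'reverse' switch."""
            command = MI_TO_COMMAND.get(mi_class)   # None if no confident class
            if command == "forward" and reverse_switch:
                command = "reverse"
            return command or "stop"                # fail-safe: default to stop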

    One reason I was harping on MindAffect is that it comes with a complete set of tutorials and works well with 4-channel EEG, the Raspberry Pi, etc. You may be able to find Motor Imagery tutorials, but they are less well organized, thought out, detailed, and adaptable. Here are some related threads:

    https://www.google.com/search?as_q=motor+imagery&as_sitesearch=openbci.com
    http://blog.jfrey.info/2015/03/03/openbci-motor-imagery/ [tutorial of sorts, uses 8 channel Cyton]

    Not specifically OpenBCI:

    https://www.google.com/search?q=motor+imagery+bci+tutorial
    https://www.researchgate.net/publication/323780357_A_Step-by-Step_Tutorial_for_a_Motor_Imagery-Based_BCI [quite technical and high level]
    https://res.mdpi.com/d_attachment/sensors/sensors-19-01423/article_deploy/sensors-19-01423.pdf [review of MI systems]

    re: driving a wheelchair in public with less-than-perfect Motor Imagery control

    If the MI were solid / fail-safe / reliable, that would be one thing, but I would have serious doubts about navigating streets or public spaces, as any misfires could cause injury.

    Frankly, you would likely be better off with something like this:

    https://www.google.com/search?q=driving+electric+wheelchair+with+eye+gaze

  • wjcroft Mount Shasta, CA
    edited June 2021

    Impressive eye-tracking controlled wheelchair.

    As the video shows, they use a front-facing camera, and the display shows 'buttons' overlaid on top of the camera view. Eye-gaze tech detects which button you are looking at.

    My hunch is that MindAffect might possibly also work with this camera-view idea. HOWEVER, bright sunlight is going to wash out the display and defeat MindAffect's ability to trigger the cVEP response. It might work indoors, though.

  • KaseyBear Switzerland

    @wjcroft said:
    ... A better approach would be to use this exceptional and free open-source project called MindAffect. ...

    Hi William,
    Thank you for the detailed answer.
