Communicating with a locked-in ALS patient

ilgarba Italy
edited June 26 in Software

Good morning everyone!

My name is Nicola (from Italy) and I started to get interested in EEG projects to look for a solution to communicate with my mother, who is sick with ALS (amyotrophic lateral sclerosis) and now in total lock-in (inability to communicate by any means, even with optical communicators as she did until recently).

So I started looking for alternative solutions that would allow at least YES/NO communication.

I don't have great skills in this field, but I have always been passionate about computer science and electronics in general. My first project was to use a NeuroSky EEG headset (taken from the game Star Wars: The Force Trainer II), connecting it to an Arduino board to detect a signal that could somehow be associated with YES/NO. The problem is that this very basic type of headset can only detect whether there is brain activity in general, which makes it difficult to associate, for example, the presence of a signal with YES and its absence with NO.
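To give a concrete idea of the yes/no logic I tried with that single-value headset, here is a rough Python sketch; the thresholds and the hold window are invented for illustration, not tuned on real data:

```python
# Rough sketch: map a stream of single-value "attention" readings
# (0-100, as the NeuroSky chip reports) to YES / NO / undecided.
# hi, lo and hold are made-up values, not tuned on real data.

def classify(readings, hi=70, lo=30, hold=3):
    """Return 'YES' if the last `hold` readings are all >= hi,
    'NO' if they are all <= lo, otherwise None (undecided)."""
    if len(readings) < hold:
        return None
    tail = readings[-hold:]
    if all(r >= hi for r in tail):
        return "YES"
    if all(r <= lo for r in tail):
        return "NO"
    return None
```

Requiring several consecutive readings on the same side of a threshold is a simple way to avoid flickering between YES and NO on noisy values.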

Knowing very little about the field, I am wondering whether boards with multiple channels can detect different types of brain waves (a perhaps silly example to illustrate the idea: thinking about good things activates one type of wave, thinking about bad things activates another). That would make it possible to associate the two types of waves with YES and NO.

Can you give me some more information about this?

On the OpenBCI site I see that there are different types of boards based on the number of channels.

  • Ganglion Board (4 channels)
  • Cyton Biosensing Board (8 channels)
  • Cyton + Daisy Biosensing Boards (16 channels)

Would any of these work for my "project"? I don't know if I managed to explain my problem clearly; please ask me any questions that may help you understand better.

Thank you all very much!
Nicola

Comments

  • wjcroft Mount Shasta, CA

    Nicola, hi.

    Can you clarify the situation with your mother? You mention:

    ...and now in total lock-in (inability to communicate by any means, even with optical communicators as she did until recently). So I started looking for alternative solutions that would allow at least a YES/NO communication...

    Most applications of BCI for assisting locked-in subjects apply to those with neurological conditions that prevent them from using the body (motor functions) to communicate. These BCI systems let the subject make selections from a screen using their mind alone. See this recent thread for links and a video of an excellent free BCI system:

    https://openbci.com/forum/index.php?p=/discussion/3033/controlling-a-stock-electric-wheelchair#latest

    So when you say "inability to communicate by any means", do you mean that your mother can neither respond to yes/no questions nor answer in any other way (for example, with eye movements)?

    If that is the case, then it is not just motor functions that have been affected, but also the conscious awareness of the subject, who is no longer able to show that they are present.

    https://www.medscape.com/answers/1170097-81868/which-motor-functions-are-preserved-in-amyotrophic-lateral-sclerosis-als

    Regards, William

  • ilgarba Italy
    edited June 26

    Hi William!

    First of all thank you so much for your reply!

    When I talk about not being able to communicate, I mean that at the moment my mom is perfectly able to understand what we are saying to her; she just has difficulty responding, because she can now move her eyes only very, very little (mainly horizontally to the right and left, almost nothing vertically).

    We are currently using this small movement to get her responses. It takes a long time, but eventually we succeed. Fearing that this movement will be reduced even further, I'm trying to understand whether, by exploiting brain waves, we can somehow find a new "form of communication" that allows her, by thinking or concentrating in a certain way, to produce a particular type of brain wave.
    Intellectually, my mother is absolutely fine at the moment; it is the motor side that is compromised.

    I don't know if this gives a better picture of the situation; please ask me any questions that would help you understand better.

    Thank you very much!

    Regards,
    Nicola

  • wjcroft Mount Shasta, CA

    Nicola, thanks for that clarification.

    Did you look at the video on the page I previously linked? Your mom 'should' be able to use that to make choices shown on the screen. You could have simple YES and NO squares ('buttons') on the screen. But the BCI shown is capable of much more than that, such as typing letters on an on-screen keyboard by focusing attention on the requested letters.

    https://openbci.com/forum/index.php?p=/discussion/3033/controlling-a-stock-electric-wheelchair

    The MindAffect BCI does not depend on eye motion, but it does depend on the user placing their 'attention' on the choice (square / 'button') they are intending to select. If your mom eventually has no ability to focus on selected visual objects (because they are out of her field of view), then that may impact the MindAffect performance. But I would assume in that case you could always fall back to two simple yes/no boxes, close enough together that she could focus on either easily with no eye movement.

    Regards, William

  • ilgarba Italy
    edited June 26

    Thank you so much William! You don't know what a pleasure it is to have read this response from you.

    I saw the video you posted and it is really very very interesting for my mom's needs.

    I'm sorry to take advantage of your kindness, but I would like to ask for a few more pointers to understand where to start.
    From what I understand, I could use the open-source MindAffect software to receive the signals from the EEG sensors and map them to the screen, for example to YES/NO and perhaps other functions as well. Is it possible to pass commands to Arduino through MindAffect? I'm asking because currently, through Arduino (for example by running Python scripts that send commands to the board), she can ring the doorbell to call us and start other medical devices that she needs.
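    To give an idea of how our current scripts talk to the Arduino, here is a simplified sketch; the command name "BELL" and the newline framing are just examples, and the write function is kept injectable so the same code works with a pyserial port or a test stub:

```python
# Simplified sketch of sending a text command to an Arduino over serial.
# The "COMMAND\n" framing is an example protocol, not anything from
# MindAffect or OpenBCI; `write` can be serial.Serial(...).write
# or any other callable that accepts bytes.

def send_command(write, command):
    """Normalize `command`, frame it as an ASCII line, pass it to `write`."""
    frame = (command.strip().upper() + "\n").encode("ascii")
    write(frame)
    return frame
```

    With pyserial this would be called as `send_command(ser.write, "BELL")` after opening something like `serial.Serial("/dev/ttyACM0", 9600)` (port name and baud rate are examples).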

    If what I've written is correct, I have these questions:
    1. What kind of boards, EEG sensors, etc. should I buy? Do you have any guidance on this?
    2. Is what is used in the video a "complete" product, or do I buy the various components from the OpenBCI Shop and then do the assembly and programming myself?

    As I was reading in David Demland's post from your link, even in my mom's case, being bedridden, there may be some difficulty with the 3D-printed headgear because of the sensors on the back of the head. I assume they need to be placed at specific locations on the head, right? Do you have any suggestions or material that I can read/study on this?

    Thank you again so much for your helpfulness and kindness!
    Thank you very much and see you soon!

    Bye,
    Nicola

  • wjcroft Mount Shasta, CA

    re: Arduino output.

    My suggestion is that you follow closely the recommendations in the MindAffect tutorials. They have used various platforms: laptop, iPad (video only?), Raspberry Pi, etc. The Pi may be a good match for your situation because it can output GPIO logic levels as well as interface with USB devices and small displays. The Arduino is underpowered for the signal processing needed.
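    As a rough sketch of the glue code you would write on the Pi, something like the following could route a selected on-screen 'button' to an action; the labels and action functions here are placeholders I invented, not MindAffect's actual API:

```python
# Placeholder glue: route a BCI screen selection to an action.
# The labels ("BELL", "YES", ...) and the action callables are
# invented for illustration; on a Raspberry Pi an action could
# toggle a GPIO pin or write a command to the Arduino.

def make_dispatcher(actions):
    """Return a callback that maps a selected label to its action."""
    def on_selection(label):
        action = actions.get(label)
        if action is None:
            return f"unknown selection: {label}"
        return action()
    return on_selection
```

    Keeping the actions in a plain dictionary makes it easy to add or rename on-screen buttons later without touching the dispatch logic.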

    re: boards. Some of the MindAffect tutorials show examples using the Ganglion board. I suggest you use it with the MindAffect headset/headband design, which is a wet-electrode setup using small sponges (water or saline). You may be able to find a way to get it 3D printed in your area; you can also email MindAffect to ask whether they have a reseller. The cost should be low, even less than the Ganglion itself. Read through all the MindAffect tutorial material.

    I'm assuming that since you already use Arduino, you will be able to set up the Raspberry Pi. The MindAffect software allows you to design the on-screen 'buttons' and tailor them to your needs.

  • ilgarba Italy

    Thank you so much for your response!
    Do you think a 4-channel board is sufficient?
    Sorry for my ignorance, but in concrete terms, what more does having additional channels allow you to do? Detect different waves, or the same waves at more points and therefore more accurately?

    Also, since my mom is bedridden, might it make sense to place the EEG sensors on the front and sides, but not on the back? Or would the waves not be detected properly that way?

    Thanks again for your kind and very helpful answers!

    See you soon,
    Nicola

  • wjcroft Mount Shasta, CA

    Do you think a 4-channel board is sufficient? Sorry for my ignorance, but in concrete terms, what more does having additional channels allow you to do? Detect different waves, or the same waves at more points and therefore more accurately?

    MindAffect has used Ganglion in the past.

    https://www.google.com/search?q=mindaffect+ganglion

    Also, since my mom is bedridden, might it make sense to place the EEG sensors on the front and sides, but not on the back? Or would the waves not be detected properly that way?

    No, you should use their headset band, which positions the 4 channels at the rear of the head. That is where the visual cortex is, the area of the brain that responds to visual stimuli. Optimally you would have a bed or upper-back support that allows the subject to orient her head and eyes towards the display. If she is lying completely flat on her back, the weight of the head will press down on the electrodes, which may affect the accuracy of the EEG, and any head or body motion could appear as artifact in the signal. I'm not saying lying on the back would not work, but none of the MindAffect demos show this position; MindAffect could answer this question.
