EEG classification for drone control

How do you acquire brain waves with OpenBCI? I'm trying to fly a drone with brain waves, but my classification of the EEG signals isn't working. I'm writing a paper on this now, but the classification doesn't work. Do you have any suggestions?

Comments

  • wjcroft Mount Shasta, CA
    edited May 2022

    Have you considered a cVEP approach, such as MindAffect? With this free app / system, you could populate a tablet or phone screen with visual 'buttons' for each of your 7 or more 'commands'. Visual attention on a command button then evokes the desired response.

    https://openbci.com/forum/index.php?p=/discussion/2783/mindaffect-announces-open-source-release-of-their-cvep-bci-speller-games-etc

    Your virtual button array could also have a natural layout: for example, a neutral button at the center, forward / back buttons above and below it, left / right buttons to either side of center, and a separate cluster off to the side for your rotate buttons.

    Regards, William
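    The button layout described above can be sketched as a simple mapping from screen-grid positions to drone commands. This is only an illustrative data structure (the names and grid scheme are my own, not part of the MindAffect API): a center cluster for translation and a side cluster for rotation.

    ```python
    # Hypothetical cVEP command layout: (col, row) grid cells -> drone commands.
    # Center cluster: neutral in the middle, forward/back above/below,
    # left/right to either side. Side cluster (col 4): rotation.
    COMMAND_LAYOUT = {
        (1, 1): "neutral",
        (1, 0): "forward",
        (1, 2): "back",
        (0, 1): "left",
        (2, 1): "right",
        (4, 0): "rotate_left",
        (4, 2): "rotate_right",
    }

    def command_at(col, row):
        """Return the command assigned to a grid cell, or None if empty."""
        return COMMAND_LAYOUT.get((col, row))
    ```

    A real cVEP system would flicker each of these buttons with a distinct code sequence; this sketch only captures the spatial arrangement and the 7-command requirement mentioned above.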

  • wjcroft Mount Shasta, CA

    Here is a list of commonly used BCI paradigms:

    https://www.gtec.at/product/bcisystem/

  • Is it possible to classify commands from brain waves alone, such as "move the drone forward", without using P300 or SSVEP? I'm currently using an OpenBCI Cyton, 16 channels. Is there a good example?

  • wjcroft Mount Shasta, CA

    In general, no. See the previous link given for common BCI paradigms. As far as I am aware, there is NO BCI paradigm that operates from "thought commands". The closest example is Motor Imagery, but that only gives you left / right. You need 7 or more commands / degrees of freedom. In my opinion MindAffect would be perfect for your application.

    William

  • wjcroft Mount Shasta, CA

    MindAffect's cVEP is far superior to SSVEP-based BCI.

  • Is it difficult to separate 7 or more commands from the brain waves produced by movement (motor activity)? If so, why do you think that is?

  • wjcroft Mount Shasta, CA

    If BCIs operated by significant numbers of "thought commands" were feasible to any practical extent, you would see published papers and commercial systems demonstrating that fact. Motor Imagery (imagining left or right motions) only gives you two 'commands'; it is not sensitive enough to detect individual fingers.
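    To illustrate why Motor Imagery typically yields only two commands: the standard approach compares mu-band (roughly 8-12 Hz) power over the left and right motor cortex (electrodes C3 and C4), since imagined right-hand movement suppresses mu power over C3 and vice versa. The sketch below is a deliberately minimal two-class rule using Welch power spectral density estimates; it assumes a 250 Hz Cyton sample rate and is not a production classifier.

    ```python
    import numpy as np
    from scipy.signal import welch

    FS = 250  # OpenBCI Cyton sample rate in Hz (assumed)

    def mu_band_power(signal, fs=FS, band=(8.0, 12.0)):
        """Mean power spectral density in the mu band (8-12 Hz)."""
        freqs, psd = welch(signal, fs=fs, nperseg=fs)
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return psd[mask].mean()

    def classify_left_right(c3, c4):
        """Crude two-class Motor Imagery rule.

        Imagined right-hand movement suppresses mu power over the
        contralateral (left) motor cortex at C3; imagined left-hand
        movement suppresses mu power at C4.
        """
        return "right" if mu_band_power(c3) < mu_band_power(c4) else "left"
    ```

    With only one contrast (C3 power vs. C4 power) there is one bit of information per decision, which is the core of William's point: scaling this to 7+ reliable commands is not something this paradigm supports.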

  • So it is difficult to separate brain waves produced by movement (for example, right hand, left hand, right foot, left foot)? And it is also difficult to distinguish words by thinking them in your head (for example, imagining the word "forward"), isn't it?

  • wjcroft Mount Shasta, CA

    @wjcroft said:
    Have you considered a cVEP approach, such as MindAffect? [...]

    Many cVEP systems combine the on-screen 'buttons' with a background image or video showing related status. In the case of a drone or electric wheelchair, the background video would show the view from an onboard camera, allowing precise alignment of your motion intentions with the real-world current picture.
