How to label

I'm using an Ultracortex Mark IV EEG Headset (Pro-Assembled, Medium, 16 channels) together with a Cyton and Daisy board. I want to use them to collect motor imagery data for four classes: forward, backward, left, and right. At the start of each trial I also want to label the data. For example, when I press a key in the GUI, a marker should be written at that same moment, so that afterwards I can see it in the raw data file.

I have seen the "External Trigger on OpenBCI Cyton Board" documentation, but I didn't understand it. Does it require a separate external device wired to the trigger inputs? I don't have any external device.

I also found a post saying that if you want to add a label at the moment the subject starts imagining an action, you can use BrainFlow markers. How does that work?

Comments

  • wjcroft Mount Shasta, CA

    CyK, hi.

    It is unlikely that you can get motor imagery to indicate FOUR choices. It mainly does left or right. BCI does not work well with 'imagined' commands such as words or intentions.

    A much better BCI for multi-choice selection is the cVEP system called MindAffect, which is completely open source and free.

    https://www.google.com/search?as_q=mindaffect&as_sitesearch=openbci.com

    External Triggers require wiring up external analog or digital inputs to the Cyton board. An example of external triggers:

    https://irenevigueguix.wordpress.com/2016/05/03/two-external-buttons-on-openbci-32bit-board/
    https://docs.openbci.com/Cyton/CytonExternal/

    William

  • So can I experiment using MindAffect instead of the OpenBCI GUI, since MindAffect gives better results?
    For external triggers, can I use LSL to label the data? And if that works, do I need to load the LSL library onto the Cyton board?

  • CyK London
    edited September 2023

    Thank you for your help!

  • @CyK said:
    So can I experiment using MindAffect instead of the OpenBCI GUI, since MindAffect gives better results?
    For external triggers, can I use LSL to label the data? And if that works, do I need to load the LSL library onto the Cyton board?

    And by the way, there is no need to purchase additional BCI equipment, right?
    Thank you for your answer!

  • wjcroft Mount Shasta, CA

    @CyK said:
    So can I experiment using MindAffect instead of the OpenBCI GUI, since MindAffect gives better results?

    I'm not sure you understand what MindAffect DOES. It is a BCI system that allows you to select various 'buttons' or menu items on a screen. Such as commands for: up, down, left, right, forward, back, etc. Unlimited numbers of commands. You cannot do this with MI or 'thought commands'.

    Suggest you check out some of the MindAffect links.

    The GUI is NOT a BCI. It is for checking out the signals and visualizing. It can also stream out to other apps using the Networking Widget. MindAffect can connect directly to the Cyton using Brainflow. No GUI needed.

    For external triggers, can I use LSL to label the data? And if that works, do I need to load the LSL library onto the Cyton board?

    LSL is not recommended. Better to use Brainflow if you want programmatic access.

    If you use MindAffect, then IT does all the work for you. No need to classify the incoming EEG data.

  • OK, I will focus on this and keep at it.
    Thank you for your help!

  • Hello, wjcroft

    For the MindAffect BCI, should I use the GitHub link you sent me to build the environment, or should I follow the process on MindAffect's official documentation website? I ask because GitHub displays "We no longer maintain and provide support for this repository. The contents are still online for historical purposes only." So which one should I choose?

  • wjcroft Mount Shasta, CA

    I did not post a Github link. Not sure what you are referring to. To use MindAffect, go to their own docs site.

    https://mindaffect-bci.readthedocs.io/en/latest/

  • Hello, wjcroft
    I recently tried to set up mindaffectBCI on my Ubuntu system, but I could not get it working. Can it work on a Linux system?
    Alternatively, could I run a Windows virtual machine inside Ubuntu and do the BCI work there?

  • I tried to test mindaffectBCI by running "python -m mindaffectBCI.online_bci --acquisition fakedata", and it fails as the screenshot shows:

  • wjcroft Mount Shasta, CA
    edited September 2023

    Your screenshot above shows that the pip download failed, so clearly any subsequent operations will also fail. The installation tutorial advises that you should instead install directly from source, using the source code zip or a git clone, NOT as a pip download or install.

    https://mindaffect-bci.readthedocs.io/en/latest/installation.html#installing-the-package
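    In shell terms, the source install described there amounts to something like the following. The repository name pymindaffectBCI is an assumption based on the MindAffect docs; verify it against the installation page above:

```shell
# Sketch of a from-source install (NOT a plain pip install of the package).
# Repository name assumed from the MindAffect docs; adjust if it differs.
git clone https://github.com/mindaffect/pymindaffectBCI.git
cd pymindaffectBCI
pip install -e .   # editable install from the cloned source
```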

    If you have further questions, suggest you contact authors via the MindAffect website.


    Unfortunately, a disclaimer at the top of their GitHub says: "This repository has been archived by the owner on Jul 24, 2023. It is now read-only." That seems to imply they are no longer developing or supporting this free system. This is the first time I have become aware of this. It could be that they have transitioned to more income-producing projects.

    Here is a hint: the primary developer left MindAffect and has been working for Meta (Facebook) since July 2022. His LinkedIn resume mentions that he is still somewhat involved with MindAffect, but he now primarily works on other projects at Meta.

    https://www.linkedin.com/in/jason-farquhar-b4739094/

  • wjcroft Mount Shasta, CA

    Further, the type of electrodes you use is highly important. Their system requires better scalp conductivity than 'dry' passive electrodes can provide. See their documentation on the headset they used, which employs a small absorbent pad wetted with water. You might be able to find a workaround with the Ultracortex if you place similar wet pads under the electrode locations they use.

    https://mindaffect-bci.readthedocs.io/en/latest/fitting_guide.html
