Ganglion board now working with python on Linux

Hello,

Over the last few days I worked on the Python repo, and here it is: the Ganglion board can finally be used directly from Python. There's a catch: because I used the "bluepy" library, it only works on Linux for now.

Still, if you have the right OS, you can now easily use it in your Python programs, use the existing plugins to stream over LSL or OSC, etc. Of course the first thing I did was stream signals to OpenViBE... and it works great; *at last* I can test this nice piece of hardware!


It is possible to switch on the AUX channels and the impedance check -- in that sense this implementation might now be more advanced than the Cyton one :D. I had to make some changes under the hood (e.g. enabling automatic discovery of both Ganglion and Cyton from the "user.py" interface). In theory nothing should break if you have been using the Python API before; my apologies if that's the case.

This implementation would not have been as smooth if it were not for the nice NodeJS code out there, thanks @pushtheworld :)

I hope that in the future one could switch from bluepy to another library that runs on Windows and Mac (if any of you knows of one...).

As described in the README, you need a recent version of bluepy, which is included in the repo as a submodule. Hence you need "git clone --recursive https://github.com/OpenBCI/OpenBCI_Python.git" to get it all. After that, run "make" inside the "bluepy/bluepy" folder. Yes, it's a bit hacky at the moment... and that's not all.

During my tests I encountered severe packet drops (15% of the data was missing) and had to tune the parameters of my Bluetooth dongle. To do so, type:

echo 9 | sudo tee /sys/kernel/debug/bluetooth/hci0/conn_min_interval

echo 10 | sudo tee /sys/kernel/debug/bluetooth/hci0/conn_max_interval

(Note: "sudo echo 9 > file" would fail, because the redirection runs in your unprivileged shell -- hence tee.)
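To check whether the tuning actually helps, one quick way to estimate the drop rate is to look at the sample counter of the incoming packets. Here's a minimal sketch, assuming the board tags each sample with a sequential counter that wraps around (adjust `modulo` to the actual counter width of your firmware):

```python
def drop_rate(sample_ids, modulo=256):
    """Estimate the fraction of missing samples from a wrapping counter."""
    received = len(sample_ids)
    if received < 2:
        return 0.0
    expected = 0
    for prev, cur in zip(sample_ids, sample_ids[1:]):
        # Number of counter steps between consecutive packets (>= 1),
        # handling wrap-around via the modulo.
        expected += (cur - prev) % modulo or modulo
    expected += 1  # count the first sample itself
    return 1.0 - received / expected

# IDs with gaps (samples 3 and 6 were lost):
ids = [0, 1, 2, 4, 5, 7, 8, 9]
print(round(drop_rate(ids), 2))  # -> 0.2
```

Feed it the sample-ID column of a recording before and after tweaking the connection interval to see if the situation improves.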


If you survive these steps, please test it and give your feedback, here or, especially if you encounter bugs, on github.

I only tested on my laptop running Ubuntu 16.04, so I'd be eager to know how it goes with other installations.


Enjoy :)

Comments

  • edited March 2017
    @jfrey I saw and read the source code first and was so happy to see you were able to put that Node code to work. I wrote most of the Cyton Node code, originally based on your Python code, so it has come full circle :) I love open source. You rock!

    Still waiting for Cyton time syncing and impedance checking (now finally working) to be ported to Python :)
  • Ahah, yes open source is great :)

    The v2 API for the Cyton will come once I have the courage to update the firmware of my device :D
  • Hahaha send it back and I'll flash it for you!
  • wjcroft Mount Shasta, CA
    Fantastic Jeremy! Mentioning @Conor and Joel @biomurph. This would also be a great item for the Community section on the main site.
  • hackr Maryland, USA
    edited August 2017
    [original thread title: What do the columns represent in data captured with OpenBCI Python?]

    I would ask Jeremy on GitHub, but there's no place there for general discussion, so I'll ask here. What do the columns represent in data captured with OpenBCI Python?

    For example, when capturing some test data with my 4-channel Ganglion, the CSV has this content (I read in the data with R, so the column names V1...V10 were automatically generated):


           V1     V2            V3              V4               V5                  V6                  V7  V8 V9 V10
     21.99545  0 -0.07473816  0.025622053 0.11645300  0.239228296  0  0  0  NA
     21.99553  1  0.09672316 -0.034968062 0.05697363 -0.133187179  0  0  0  NA
     21.99556  2  0.03414341 -0.056973632 0.07048028 -0.065451985  0  0  0  NA
     21.99558  3 -0.16342427  0.028073557 0.12382434  0.307541304  0  0  0  NA
     21.99560  4  0.04805397 -0.001075221 0.08577086 -0.001419292  0  0  0  NA
     21.99563  5  0.08750430 -0.062981781 0.05940831 -0.166068377  0  0  0  NA

    Given the strange number of columns, I'm not sure how to interpret this.

    Based on what I'm seeing in csv_collect.py:
    • V1 = time since start
    • V2 = IDs / row numbers
    Beyond that, I'm guessing:
    • V3 - V6 are the 4 channels' voltages.
    • V10 is just junk from imperfectly parsing the file.
    Do you have any idea what V7 - V9 represent?

    Thanks in advance!
  • wjcroft Mount Shasta, CA
    edited August 2017
    @hackr, hi.

    Merged your question into this existing thread. You can view the source here,


    The last three zeros are the AUX data channels. Not sure why you are getting the NA, could there be a glitch in your R code?

    Jeremy @jfrey is right here on the forum and he gets an email whenever you mention him. And since he posted earlier on this thread he also gets notified of new posts.

    Regards,

    William
  • hackr Maryland, USA
    @wjcroft Thank you very much.

    I did notice that it said AUX data, I just have no idea what that means. I'm editing the .py and .yapsy-plugin now to allow us to record external events based on key strokes :)
  • Notified I am ;)

    I never used the CSV exporter. By default there is no AUX data with the Ganglion, hence the 0's (to enable them, use the "--aux" option with user.py), and as for the NA... I don't know, maybe some trailing garbage. The exporter is in ./plugins/csv_collect.py if you want a closer look.
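    For reference, here is a small sketch of how such a row can be unpacked, with the column layout inferred from this thread (timestamp, sample ID, 4 EEG channels, 3 AUX columns, and possibly an empty trailing field from a final delimiter, which would explain the NA in R):

```python
def parse_row(line):
    """Split one csv_collect-style line into (timestamp, sample_id, eeg, aux)."""
    fields = line.strip().split(',')
    # A trailing separator produces an empty last field (read as NA by R).
    if fields and fields[-1] == '':
        fields = fields[:-1]
    timestamp = float(fields[0])
    sample_id = int(float(fields[1]))
    eeg = [float(v) for v in fields[2:6]]   # the 4 Ganglion channels
    aux = [float(v) for v in fields[6:9]]   # AUX columns, zeros unless --aux is set
    return timestamp, sample_id, eeg, aux

t, sid, eeg, aux = parse_row("21.99545,0,-0.0747,0.0256,0.1164,0.2392,0,0,0,")
print(sid, len(eeg), aux)  # -> 0 4 [0.0, 0.0, 0.0]
```

This is only an illustration of the layout described above, not code from the repo itself.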

  • hackr Maryland, USA
    @jfrey Thanks very much. I'm adding a feature to the script that allows us to have a column representing external events, controlled by key strokes during the recording.

    Is that something I should submit to you as a pull request, or is there some other GitHub repo by the plugin's author who I should submit it to?
  • We probably should have a dedicated repository for plugins, but not many people were interested in developing them, so your pull request will be welcome. I wonder how you handle the key strokes, since one could type "/stop" to interrupt the streaming and send new commands... Personally, for any event synchronization, I use LSL :)
  • hackr Maryland, USA
    @jfrey Thank you. Yes, I probably need to learn LSL. That's the next thing I'm going to work on.

    I was trying to use the keyDown() function from pygame to handle the event column. I *think* I have the code working, but when I go to test it with user.py it doesn't see the pygame module. This is strange. I have pygame in my library. hmmmm
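    For what it's worth, the event-column idea doesn't strictly need pygame. Here is a hedged sketch of just the tagging logic, with illustrative names (the actual key capture could come from pygame, curses, or anything else):

```python
class EventTagger:
    """Append the current event marker to each sample row."""

    def __init__(self):
        self.current_event = 0  # 0 = no event

    def set_event(self, code):
        # Called from whatever mechanism captures key strokes.
        self.current_event = code

    def tag(self, row):
        tagged = list(row) + [self.current_event]
        self.current_event = 0  # mark only the first sample after the key press
        return tagged

tagger = EventTagger()
tagger.set_event(7)
print(tagger.tag([21.99545, 0, 0.1]))  # -> [21.99545, 0, 0.1, 7]
print(tagger.tag([21.99553, 1, 0.2]))  # -> [21.99553, 1, 0.2, 0]
```

Keeping the marker in its own column this way also avoids touching the signal columns themselves.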
  • wjcroft Mount Shasta, CA
    @hackr, a related concept is using the AUX data for external triggers. This is well defined on the Cyton but still unclear on the Ganglion.



    As Jeremy mentioned, LabStreamingLayer is commonly used for experiment data streaming / merging.

  • hackr Maryland, USA
    @wjcroft Thanks. I'm going to spend this weekend learning LSL. I doubt I will get lucky enough to understand the AUX data for Ganglion, but I will give it a try as well. Cheers.
  • hackr Maryland, USA
    edited August 2017
    Seems LSL doesn't find any BLE devices on Ubuntu with the CSR 4.0 BT dongle.

    I wonder if it's using the wrong hci device (the built-in one).
    Strange that it would work fine with Jeremy's user.py but not with LSL.
  • What do you mean it doesn't work with LSL? There are several EEG boards supported out of the box in the LSL repository, but that is not the case for OpenBCI; you have to go through user.py and use the LSL plugin. Is that what you tried to do?
  • hackr Maryland, USA
    edited August 2017
    Oh, oh yes, you're right, I think I did it wrong. I just ran "python openbci_lsl.py --stream" and it hung at the part where it looks for the board.

    Thank you for the clarification, @jfrey . Much appreciated!

    Sorry, one more question - when you run LSL as a plugin, what are you doing with the output? Are you sending it to OpenVibe or MATLAB or something, so that you can record data + events that way?
  • @hackr @wjcroft

    I opened an issue for the support of external triggers for the Ganglion. Joel and I were just talking about it!



  • hackr Maryland, USA
    @pushtheworld Awesome! It would be a very useful feature to have.
  • Yes, it is nice to see that the boards are not yet at their full potential and keep improving :)

    As for LSL, a stream (or several, depending on the configuration: one for data, one for aux, one for impedance) is created, and it is up to you to do whatever you want with it (acquire it in OpenViBE, Matlab, Processing, Unity, and so on). With LSL nothing is "sent" to a particular address or program; the streams are simply advertised on the network, and anyone can grab them (i.e. be cautious if there are several LSL streams from various sources; check their type and name in order to target the one you want).
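    To illustrate the "check their type and name" advice, the selection logic boils down to a filter like the following sketch (plain tuples stand in for the stream-info objects a real LSL binding would give you; the stream names are made up):

```python
def pick_stream(streams, wanted_name, wanted_type):
    """Return the first advertised stream matching both name and type."""
    for name, stype in streams:
        if name == wanted_name and stype == wanted_type:
            return (name, stype)
    return None  # nothing matched; don't grab a stream blindly

# Several streams may be advertised on the same network at once:
advertised = [
    ("openbci_eeg", "EEG"),
    ("openbci_aux", "AUX"),
    ("some_other_rig", "EEG"),
]
print(pick_stream(advertised, "openbci_eeg", "EEG"))  # -> ('openbci_eeg', 'EEG')
```

Filtering on both fields matters because two different rigs can легitimately advertise the same type.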
  • hackr Maryland, USA
    @jfrey My philosophy is that with something like this I should try copying the approach of the smartest guy around, which as far as I can tell is you :) So, I was planning to use the same app as you to consume the LSL data. Based on your website, I believe that would be OpenVibe.

    I managed to build the stable version of OpenVibe on my Ubuntu machine. In the acquisition server it's asking me for a port. You said that Ganglion doesn't have ports, right? Hmmm. Am I going about connecting user.py --add streamer_lsl and OpenVibe correctly?
  • Before jumping into OpenViBE, you should have a look at it -- many example scenarios are included -- to see if it suits you. It is meant for real-time signal processing in order to build a proper BCI; if you "just" want to record data there might be other ways (even though I do use it a lot for recording).

    The port you mention is probably the one used between the acquisition server and the designer; leave 1024 as the default, unless you plan to run several acquisition servers. To connect to LSL, select the LSL driver in the list and, in its "Driver Properties", select the stream you want. Note that you will probably want to increase the "drift tolerance" ("properties" button on the main window of the acquisition server) to accommodate some of the Ganglion's latency; use at least 10ms (BLE packets arrive at 100Hz), and if you don't do ERP, 100ms. Again, play with it and make sure to understand the pipeline before planning any "real" experiment. Good luck ;)
  • Hello everyone!
    I've successfully connected the Ganglion board to OpenViBE through LSL using NodeJS Ganglion.
    I've displayed the signals (in the time and frequency domains) and everything looks fine. But when I opened "Signal Information" in OpenViBE's window, I found that the sample rate of the acquired signal is 256Hz, whereas, as we know, the Ganglion board has a 200Hz sample rate. I was wondering if that's some particularity of LSL, or if there's a problem in my OpenViBE configuration?
  • @jfrey @hackr

    @yumin has been trying to get LSL working with OpenVIBE for a while now and would love any insight y'all have!

    I have zero experience with LSL and OpenViBE; I just realized the i in vibe wasn't capitalized!

    Great thread so far! Excited to see everyone having successes!
  • hackr Maryland, USA
    @pushtheworld which OS are you using?
  • @yumin is the one doing it, not me! What OS are you using @yumin
  • hackr Maryland, USA
    @yumin Interesting, on the Python library README Jeremy mentioned something similar about the stream_data.py script:
    • stream_data.py a version of a TCP streaming server that somehow oversamples OpenBCI from 250 to 256Hz.

    I wonder if the underlying cause is the same in that case and in yours?
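    One way to check which rate you are actually getting is to estimate it from the recorded timestamps (the first column of the CSV). A rough sketch:

```python
def estimated_rate(timestamps):
    """Estimate the sampling rate (Hz) from a list of timestamps in seconds."""
    span = timestamps[-1] - timestamps[0]
    if span <= 0:
        raise ValueError("need increasing timestamps")
    # N samples cover N-1 inter-sample intervals.
    return (len(timestamps) - 1) / span

# Synthetic 200 Hz data: one sample every 5 ms.
ts = [i * 0.005 for i in range(1001)]
print(round(estimated_rate(ts)))  # -> 200
```

If the timestamps say ~200 Hz while OpenViBE reports 256 Hz, the 256 figure likely comes from how the stream is declared rather than from the data itself.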

  • hackr Maryland, USA
    edited August 2017
    @jfrey It's working! I was selecting the OpenBCI driver before, but when I followed your instructions and selected the LSL one, it connected immediately!

    I will stick with OpenVibe for now because I do want to do real-time signal processing, but I'd be more comfortable training my own algorithm in Python or R than having a GUI try to do it automatically for me. I guess that's why some people use MATLAB with LSL, but I'm not really a MATLAB guy (and if I were, it would have to be Octave, the free MATLAB alternative). Maybe I can train my own ensemble model in Python or R and then somehow load it into OpenVibe with a little hacking of their code.
  • Hey @hackr, I'm using Windows 10

    Actually everything is working well; it's just this one point that bothers me
  • OpenViBE (and its complicated CapitaliZATiON :D) should correctly detect the sampling rate of any LSL stream. @yumin, where exactly do you see 256Hz?

    stream_data.py is definitely outdated, kind of an ugly hack from a time before the Mighty LSL.
  • @jfrey In the basic signal display box of the OpenViBE designer there is an option for Signal Information. Here is my print (https://image.ibb.co/mEbAK5/signal_info.jpg)