OpenBCI + TinyML connection [resolved]
Hi, I have a Ganglion board and would like to connect it to a TinyML device I got recently. I want to pass the raw signal to the TinyML device and classify some signals. I'm planning to use Edge Impulse to design a simple model to classify left or right motor imagery responses. The problem is that I know nothing about hardware; I have soldered simple things in the past, but I don't know whether this project is possible.
Comments
Hi Rmib,
Apparently one common definition of TinyML is running TensorFlow Lite models on microcontroller-class hardware. So it's not a specific hardware chip, but rather a downsized TensorFlow runtime that performs well on small processors.
https://analyticsindiamag.com/how-tensorflow-lite-fits-in-the-tinyml-ecosystem/
Your best bet for interfacing with the Ganglion in such situations would likely be a Raspberry Pi. This is because the normal-size Pi 4 has USB host ports, into which you can plug the Ganglion dongle. Then, using the BrainFlow library, you can read the EEG data from the Ganglion into the Pi and do your ML processing there.
https://www.google.com/search?q=tensorflow+lite+on+raspberry+pi
So while you may have "a TinyML device I got recently", unless it has a USB host port that the Ganglion dongle can plug into and be easily used with BrainFlow, you are better off with a Pi. BrainFlow cannot run on Arduino-class devices, due to the C++ runtime it needs.
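Once BrainFlow has delivered the raw samples as a NumPy array, the usual preprocessing step for an Edge Impulse style left/right classifier is to slice the stream into fixed-length windows. A minimal sketch (the helper name and window sizes here are illustrative, not from any library; the 200 Hz figure is the Ganglion's sampling rate):

```python
import numpy as np

def make_windows(eeg, window_len, step):
    """Slice a (channels, samples) array into overlapping windows.

    Returns shape (n_windows, channels, window_len), a typical
    input layout for a small motor-imagery classifier.
    """
    n_samples = eeg.shape[1]
    starts = range(0, n_samples - window_len + 1, step)
    return np.stack([eeg[:, s:s + window_len] for s in starts])

# Example: 4 Ganglion channels, 2 s of data at 200 Hz,
# sliced into 1 s windows with 50% overlap.
fake = np.random.randn(4, 400)
windows = make_windows(fake, window_len=200, step=100)
print(windows.shape)  # (3, 4, 200)
```

Each window would then be fed to the trained model (TensorFlow Lite or an Edge Impulse deployment) to produce one left/right prediction per window.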
Do note that Motor Imagery with the Ganglion may be much less user friendly for BCI applications than the free, open source MindAffect system:
https://openbci.com/forum/index.php?p=/discussion/2783/mindaffect-announces-open-source-release-of-their-cvep-bci-speller-games-etc
Why is that? Because MindAffect lets you define your own user interface: buttons and functions on the screen, even a full-screen alphanumeric keyboard. Doing this with the very limited binary left/right Motor Imagery technique is much less user friendly and orders of magnitude slower.
Regards, William