sample rate drift or jitter
[original post title: Actual sampling rate around 250.5 Hz??]
Hello,
While using the Python software library I noticed something odd: it seems that the board sends *too many* samples.
I have this script to test:
# test_sample_rate.py
import open_bci_v3 as bci
import time
from threading import Thread

# counter for sampling rate
nb_samples_out = -1


# try to ease work for main loop
class Monitor(Thread):
    def __init__(self):
        Thread.__init__(self)
        self.nb_samples_out = -1
        # init time to compute sampling rate
        self.tick = time.time()
        self.start_tick = self.tick

    def run(self):
        while True:
            # compute sampling rate over the last interval
            new_tick = time.time()
            elapsed_time = new_tick - self.tick
            current_samples_out = nb_samples_out
            print "--- at t: ", (new_tick - self.start_tick), " ---"
            print "elapsed_time: ", elapsed_time
            print "nb_samples_out: ", current_samples_out - self.nb_samples_out
            sampling_rate = (current_samples_out - self.nb_samples_out) / elapsed_time
            print "sampling rate: ", sampling_rate
            self.tick = new_tick
            self.nb_samples_out = nb_samples_out
            time.sleep(10)


def count(sample):
    # callback: update the global counter for each sample received
    global nb_samples_out
    nb_samples_out = nb_samples_out + 1


if __name__ == '__main__':
    # init board
    port = '/dev/ttyUSB0'
    baud = 115200
    monit = Monitor()
    # daemonize the monitor thread so it terminates together with the main thread
    monit.daemon = True
    monit.start()
    board = bci.OpenBCIBoard(port=port, baud=baud, filter_data=False)
    board.startStreaming(count)
This is the output once the board is connected:
--- at t: 20.0204308033 ---
elapsed_time: 10.0100939274
nb_samples_out: 2508
sampling rate: 250.547099577
--- at t: 30.0305190086 ---
elapsed_time: 10.0100882053
nb_samples_out: 2508
sampling rate: 250.547242797
--- at t: 40.036908865 ---
elapsed_time: 10.0063898563
nb_samples_out: 2507
sampling rate: 250.539908598
--- at t: 50.0413858891 ---
elapsed_time: 10.0044770241
nb_samples_out: 2507
sampling rate: 250.587811234
This is consistent with the drift observed in the OpenViBE acquisition when I use my streaming server, and I don't see anything wrong with the Python library itself.
I have the ChipKIT version (16 channels, but no Daisy board attached at the moment) and I have not tried to modify the firmware.
A varying / non-integer sampling rate could be quite troublesome for signal processing.
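To get a feel for the magnitude, here is a quick back-of-the-envelope check (just arithmetic on the numbers above, assuming the rate stays around 250.55 Hz):

# drift_magnitude.py -- rough numbers, assuming the ~250.55 Hz rate holds
nominal_rate = 250.0    # rate the signal processing expects
measured_rate = 250.55  # rate observed in the output above

extra_per_second = measured_rate - nominal_rate  # ~0.55 extra samples per second
print("one extra sample every {:.2f} s".format(1.0 / extra_per_second))

recording = 15 * 60  # a 15-minute recording, in seconds
extra_samples = extra_per_second * recording
print("extra samples after 15 min: {:.0f}".format(extra_samples))
print("timeline slip if replayed at exactly 250 Hz: {:.1f} s".format(extra_samples / nominal_rate))

That is roughly one extra sample every 1.8 s, and about two seconds of accumulated offset per fifteen minutes of recording if the data is later treated as exactly 250 Hz.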
Comments
I recall that in the design stages there was a suggestion to use a separate clock oscillator to keep the ADS1299 sample rate very accurate. Joel @biomurph I'm sure will comment here, but it was likely dropped due to added board cost and space issues.
Am I reading your program correctly: is it deriving the sample rate from packet arrival times on the serial port? Those do vary somewhat due to OS and RFduino buffering issues.
My guess is that the actual measured 250.xx sample rate will be relatively constant, although offset from the desired exact 250.000 sps. So it looks like the drift adjustments let you set whatever that value is, as long as it is relatively constant?
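One way to separate short-term buffering jitter from a genuine constant offset would be to fit the cumulative sample count against wall-clock time over the whole run instead of over 10-second windows. A minimal sketch of that idea (a hypothetical helper of my own, not part of open_bci_v3; it reuses the callback pattern of the script above):

# rate_fit.py -- estimate the long-run sample rate as the least-squares slope
# of sample index vs. arrival time (hypothetical helper, not part of the library)
import time

class RateEstimator(object):
    def __init__(self):
        self.t0 = None
        self.times = []  # arrival time of each sample, relative to the first (s)
        self.index = []  # running sample index

    def __call__(self, sample):
        now = time.time()
        if self.t0 is None:
            self.t0 = now
        self.times.append(now - self.t0)
        self.index.append(len(self.index))

    def rate(self):
        # slope of the index-vs-time line = samples per second; fitting over a
        # long run averages out OS / RFduino buffering jitter in individual arrivals
        n = len(self.times)
        mean_t = sum(self.times) / n
        mean_i = sum(self.index) / float(n)
        num = sum((t - mean_t) * (i - mean_i) for t, i in zip(self.times, self.index))
        den = sum((t - mean_t) ** 2 for t in self.times)
        return num / den

An instance can be passed as the streaming callback (est = RateEstimator(); board.startStreaming(est)) and est.rate() read after stopping, or from the monitor thread. If the fitted value sits near 250.55 sps regardless of run length, the deviation is a constant offset rather than jitter.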
Umm, isn't it more like every two seconds there is an extra sample to deal with? Is the removal operation sophisticated enough to look at the samples on either side (three samples total) and do some type of three-way averaging, or does it simply drop the sample?
If it's the latter, I would think the artifact generation should not be that serious. Better than the case where there are not 250 samples available every second.
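For illustration, the two options could look like this on a buffer that has arrived with one extra sample (a sketch of the idea only, not what the OpenViBE drift correction actually does; "average into neighbours" is just one plausible reading of the three-way averaging):

# drift_correction_sketch.py -- two ways to remove one extra sample from a buffer
def drop_sample(buf, k):
    # simply discard sample k
    return buf[:k] + buf[k + 1:]

def average_into_neighbours(buf, k):
    # fold the value of sample k into its two neighbours before removing it,
    # so the extra data point is smeared rather than thrown away
    out = list(buf)
    out[k - 1] = 0.5 * (out[k - 1] + out[k])
    out[k + 1] = 0.5 * (out[k + 1] + out[k])
    return out[:k] + out[k + 1:]

buf = [1.0, 2.0, 3.0, 4.0, 5.0]
print(drop_sample(buf, 2))              # [1.0, 2.0, 4.0, 5.0]
print(average_into_neighbours(buf, 2))  # [1.0, 2.5, 3.5, 5.0]

Either way the corrected stream ends up at 250 samples per second; the second variant just avoids a hard discontinuity at the removal point.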
I don't know which way the onboard clock generator tends to be biased. It would be great if it were consistent across all boards, or does it depend on component values that vary from board to board? If it is constant across all OpenBCI boards, then you could have an extra selection on the menu of "250.55" sps, as kludgy as that sounds(!) :-)
William
Accounting for Timing Drift and Variability in Contemporary Electroencephalography Systems
For some reason, the latency issue makes the sample rate even higher! Like this:
I replicated the error on a Windows machine with the latency set to 16 ms and got the results above. Changing the latency then changed the output to:
If a large driver latency can affect the calculated sampling rate consistently enough to increase it by 2 Hz, could some smaller latency also be causing the small 0.5 Hz increase?
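One way to tell whether those extra samples are genuinely produced by the board or are an artifact of how the driver delivers them would be to count using the board's own packet counter rather than callback invocations. A sketch, assuming the sample object passed to the callback exposes the 8-bit packet counter, for instance as sample.id; the class and field names here are my own:

# packet_counter_check.py -- count samples by the board's packet counter
# (assumed to wrap at 256), so driver latency / buffering cannot inflate the count
class PacketCounter(object):
    def __init__(self):
        self.last_id = None
        self.total = 0     # samples accounted for by the counter
        self.skipped = 0   # gaps in the counter (dropped packets)
        self.repeated = 0  # counter did not advance (duplicated packets)

    def __call__(self, sample):
        if self.last_id is not None:
            step = (sample.id - self.last_id) % 256
            if step == 0:
                self.repeated += 1
            else:
                self.total += step
                self.skipped += step - 1
        self.last_id = sample.id

If total still grows by roughly 2505 every 10 s while repeated stays at zero, the excess samples really come from the board's clock; if the plain callback count grows faster than total, the difference is a software or driver artifact.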
Hi,
See the previous posts regarding the jitter / offset you are also seeing. It's a hardware issue, not a software one.