
Posted on Mon Apr 28 2025

Last updated Wed Apr 29 2025

This morning I was able to connect to my OpenBCI Ganglion and record brain activity sessions. I was also able to run an impedance check on the Ganglion input channels to test the quality of the signal.
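
Here's a minimal sketch of that flow, assuming BrainFlow as the driver (my assumption — the serial port is a placeholder, and per the OpenBCI Ganglion SDK the 'z'/'Z' commands toggle impedance checking):

```python
# Minimal sketch: connect to a Ganglion, run an impedance check, record
# a few seconds. Assumes BrainFlow (pip install brainflow); the serial
# port is a placeholder for wherever the BLE dongle shows up.
import time

from brainflow.board_shim import BoardIds, BoardShim, BrainFlowInputParams

params = BrainFlowInputParams()
params.serial_port = "/dev/ttyACM0"  # placeholder

board = BoardShim(BoardIds.GANGLION_BOARD, params)
board.prepare_session()

# Per the OpenBCI Ganglion SDK, 'z' starts impedance checking, 'Z' stops it.
board.config_board("z")
time.sleep(5)
board.config_board("Z")

board.start_stream()
time.sleep(10)                 # record ~10 seconds
data = board.get_board_data()  # numpy array: rows = channels, cols = samples
board.stop_stream()
board.release_session()

print(data.shape)
```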

My next step is to build the db models and create controllers to update the db with the new data.
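
For what it's worth, here's a rough sketch of what those models could look like (SQLAlchemy here; the table and column names are hypothetical, not an actual schema):

```python
# Sketch of possible db models. Table/column names are made up for
# illustration.
from datetime import datetime

from sqlalchemy import Column, DateTime, Float, ForeignKey, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class RecordingSession(Base):
    __tablename__ = "recording_sessions"
    id = Column(Integer, primary_key=True)
    device = Column(String, nullable=False)  # e.g. "ganglion" or "polar_h10"
    started_at = Column(DateTime, default=datetime.utcnow)

class Packet(Base):
    __tablename__ = "packets"
    id = Column(Integer, primary_key=True)
    session_id = Column(Integer, ForeignKey("recording_sessions.id"), nullable=False)
    seq = Column(Integer, nullable=False)  # packet order within the session
    ch1 = Column(Float)  # the Ganglion has 4 EEG channels
    ch2 = Column(Float)
    ch3 = Column(Float)
    ch4 = Column(Float)
```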

I didn't expect to see so much data. The Ganglion emits about 105 "packets" of data every second.

That's 31.5k packets of brain activity data for a 5min recording session.

That's a lot of data compared to the heart activity emitted by the Polar H10, which (I haven't verified this) likely sends about 1 packet per second.

I'll have to look more into storing the brain activity data. I don't know much about PostgreSQL's read/write or storage limits. I don't think writes will be an issue, though, since they don't have to be immediate and can happen in bulk or in chunks.
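
Chunked bulk inserts should keep that cheap. A sketch with psycopg2 (connection string, table, and the dummy data are placeholders):

```python
# Sketch: chunked bulk inserts with psycopg2. The dsn and packets table
# are hypothetical; the rows are dummy stand-ins for real readings.
import random

import psycopg2
from psycopg2.extras import execute_values

session_id = 1  # placeholder
rows = [
    (session_id, seq, *(random.uniform(-100, 100) for _ in range(4)))
    for seq in range(31_500)
]

with psycopg2.connect("dbname=biosignals") as conn:  # placeholder dsn
    with conn.cursor() as cur:
        # execute_values batches many rows into one INSERT per chunk, so
        # 31.5k rows become ~32 round trips instead of 31.5k.
        execute_values(
            cur,
            "INSERT INTO packets (session_id, seq, ch1, ch2, ch3, ch4) VALUES %s",
            rows,
            page_size=1000,
        )
```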

But still. Curious to know how many 5min brain activity sessions my database can handle.


After a bit of reading about Postgres, it seems like inserting and fetching 30k records is not an issue at all. I don't know the exact limits, but Postgres can easily handle inserts and queries over tables with millions of rows.

A 5min brain activity session at 105 packets / second will result in about 900kB of data (according to Claude).
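
The arithmetic behind that estimate, assuming a payload on the order of ~28 bytes per packet (my guess, not a measured figure):

```python
# Back-of-the-envelope size estimate; bytes_per_packet is an assumption
# about the payload size, not a measured value.
packets_per_second = 105
session_seconds = 5 * 60
bytes_per_packet = 28

total_packets = packets_per_second * session_seconds  # 31,500
total_bytes = total_packets * bytes_per_packet        # 882,000 ≈ 900 kB
print(f"{total_packets} packets, ~{total_bytes / 1000:.0f} kB")
```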

I also learned that it's best to store the raw binary data from the Ganglion stream. That way, I can play around with different post-processing functions when analyzing and visualizing the data.
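
In Postgres that maps naturally to a bytea column. A rough sketch (the schema and payload are hypothetical):

```python
# Sketch: keeping the raw stream as bytes in a bytea column so different
# post-processing can be re-run later. Schema/dsn are placeholders.
import psycopg2

raw_stream = b"".join(bytes(28) for _ in range(31_500))  # dummy payload

with psycopg2.connect("dbname=biosignals") as conn:  # placeholder dsn
    with conn.cursor() as cur:
        cur.execute(
            """
            CREATE TABLE IF NOT EXISTS raw_recordings (
                id serial PRIMARY KEY,
                session_id integer NOT NULL,
                data bytea NOT NULL
            )
            """
        )
        cur.execute(
            "INSERT INTO raw_recordings (session_id, data) VALUES (%s, %s)",
            (1, psycopg2.Binary(raw_stream)),
        )
```

One blob per session keeps writes cheap; per-packet rows would make random access easier. Probably worth trying both.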

Which makes me realize... I'll also store the raw binary data for heart rate recording sessions.