We recently took part in the SONAR Music Hack Day, which this year featured a dedicated track for neurohacks. Music Hack Day is an event that coincides with SONAR every year, where a bunch of geeks get together to spend 24 hours hacking a music-related app. It is generally a mix of music, UI, HMI (such as the Reactable) and cloud technologies, and produces some pretty impressive results.
The neuro angle, though, added a whole new level, and we were genuinely impressed by how quickly the hackers were up and running with EEG applications. In total, seven hacks were built on our Enobio platform, including three of this year's winners. Some used EEG as a source for sonification algorithms (check out Stephen Barrass's work for examples of this), others used EEG to estimate emotional state, at least as far as arousal goes, and others drove self-playing robotic drums with a mixture of EEG and ECG.
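We don't have the teams' code to share, so purely as an illustration, here is a minimal sketch of the kind of mapping a sonification hack might use: take the power of a window of EEG samples and map it onto a musical pitch. The function names, note range and linear mapping are our own invention, not any team's actual approach.

```python
import math

def band_power(samples):
    """Mean squared amplitude of one window of EEG samples."""
    return sum(s * s for s in samples) / len(samples)

def power_to_midi(power, p_min, p_max, note_min=48, note_max=84):
    """Linearly map a power reading onto a MIDI note range (here C3..C6).

    Readings outside the expected [p_min, p_max] range are clamped,
    so a noise burst can't fling the pitch off the top of the scale.
    """
    p = min(max(power, p_min), p_max)
    frac = (p - p_min) / (p_max - p_min)
    return round(note_min + frac * (note_max - note_min))

def midi_to_hz(note):
    """Standard equal-temperament conversion, A4 (note 69) = 440 Hz."""
    return 440.0 * 2 ** ((note - 69) / 12)
```

In a real hack the interesting work is in choosing which band's power to track and how to calibrate `p_min`/`p_max` per wearer; the mapping itself stays this simple.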
As anyone who works in BCI knows, EEG is not the easiest signal to work with, and robustness is an application developer's biggest problem. Our neurohackers took this in their stride, implementing various strategies to build a robust application around an unpredictable signal. It was great to see such pragmatism and creativity thrown at the problem, and it reinforces one of our core beliefs: getting the hardware out there will take us in new directions, as different people come at a problem in very different ways.
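We don't know exactly which strategies each team settled on, but one pragmatic pattern we often reach for ourselves is to knock out single-sample spikes with a short median filter and then smooth the remaining jitter with an exponential moving average. A sketch, assuming a stream of already-extracted feature values (band power, say) rather than raw EEG:

```python
def median3(values):
    """Replace each point with the median of its 3-sample neighbourhood,
    removing isolated spikes (e.g. from electrode movement)."""
    out = list(values)
    for i in range(1, len(values) - 1):
        out[i] = sorted(values[i - 1:i + 2])[1]
    return out

def ema(values, alpha=0.2):
    """Exponential moving average: trades a little lag for a steadier value."""
    est = values[0]
    out = []
    for x in values:
        est = (1 - alpha) * est + alpha * x
        out.append(est)
    return out

def robust_feature(values):
    """Spike rejection first, then smoothing."""
    return ema(median3(values))
```

The ordering matters: smoothing a spike spreads it over many samples, whereas the median filter removes it before the EMA ever sees it.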
I have felt for a long time that, in DSP circles, the audio people have a uniquely intuitive understanding of filter design, and it would be interesting to see how they would deal with the generally lower-frequency physiological signals we are used to looking at.
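As a concrete point of overlap, the biquad recipes audio engineers know by heart (for example the coefficient formulas in Robert Bristow-Johnson's Audio EQ Cookbook) transfer directly to EEG bands; only the numbers change. Below is a sketch of a cookbook band-pass biquad centred on the 10 Hz alpha band at a 250 Hz sampling rate — the parameter choices are ours, for illustration.

```python
import math

def bandpass_biquad(f0, fs, q=0.707):
    """RBJ cookbook band-pass (constant 0 dB peak gain) coefficients,
    normalised so a[0] == 1."""
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    a0 = 1 + alpha
    b = [alpha / a0, 0.0, -alpha / a0]
    a = [1.0, -2 * math.cos(w0) / a0, (1 - alpha) / a0]
    return b, a

def filt(b, a, x):
    """Apply the biquad as a direct-form I IIR filter."""
    y = []
    x1 = x2 = y1 = y2 = 0.0
    for xn in x:
        yn = b[0] * xn + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        x2, x1 = x1, xn
        y2, y1 = y1, yn
        y.append(yn)
    return y

def rms(x):
    return math.sqrt(sum(v * v for v in x) / len(x))
```

With `bandpass_biquad(10.0, 250.0)`, a 10 Hz sine passes at roughly unity gain while a 50 Hz mains-frequency tone comes out at around a quarter of its amplitude; swap in an `f0` of 1 kHz and an audio `fs` and the very same code is an ordinary audio filter.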
For our part, we plan to return to SONAR next year, and we are already thinking about another event focused entirely on neurohacks, so watch this space!