A Unity-powered system that transforms EEG signals and live audio into reactive VFX. Blending neuroscience and entertainment, Brainrave turns brain activity into a live visual performance.

As Co-Founder of Brainrave, I helped design a system that blends neuroscience and immersive entertainment.
It captures live EEG data and audio, translating them into reactive Unity visuals: shaders, particles, and 3D brain models that move and morph in time with both music and thought.
At its heart, Brainrave is about taking invisible signals, brainwaves and beats, and making them visible.
EEG data flows in through BrainFlow, while a custom Rust module analyses live audio streams. Both are routed into Unity, where an adaptive VFX system ties neural states and music together into one synchronised performance.
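This write-up doesn’t spell out the wire format, so purely as an illustration, a bridge like this could push one small JSON datagram per update into Unity; every field name below is hypothetical, not Brainrave’s actual schema:

```python
import json
import socket
import time

# Hypothetical packet: smoothed EEG band powers plus the latest audio events,
# sent as one small JSON datagram per update.
packet = {
    "t": time.time(),  # send timestamp, useful for sync/latency checks
    "eeg": {"alpha": 0.42, "beta": 0.31, "theta": 0.18},  # normalised 0..1
    "audio": {"beat": True, "onset_strength": 0.87},
}

# Unity would listen on a local UDP port and feed the values to the VFX layer.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(json.dumps(packet).encode("utf-8"), ("127.0.0.1", 9000))
```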
This isn’t just visuals for visuals’ sake. Brainrave explores what happens when a performer’s inner state becomes part of the show.
By turning neural data into shared art, it connects audiences with the emotion and energy of live music in ways that feel both novel and surprisingly human.
Brainrave started as an MVP, but its roadmap reaches further.
The EEG Visualiser is just the first step: a proof of concept that shows how technology, music, and the mind can merge into something performative and communal. For me, it’s not just a technical challenge, but a glimpse of how entertainment might feel when our thoughts are part of the stage.
EEG signals streamed via BrainFlow, processed in Python for smoothing, filtering, and real-time normalisation.
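That step maps to a fairly standard BrainFlow loop. Here is a minimal sketch using BrainFlow’s Python API, with the synthetic board standing in for real hardware and an illustrative min-max normalisation rather than Brainrave’s actual code:

```python
import time
import numpy as np
from brainflow.board_shim import BoardShim, BrainFlowInputParams, BoardIds
from brainflow.data_filter import DataFilter, FilterTypes

params = BrainFlowInputParams()
board_id = BoardIds.SYNTHETIC_BOARD.value  # stand-in for a real headset
board = BoardShim(board_id, params)
board.prepare_session()
board.start_stream()
time.sleep(1)  # let the ring buffer fill with ~1 second of samples

sampling_rate = BoardShim.get_sampling_rate(board_id)
eeg_channels = BoardShim.get_eeg_channels(board_id)

# Read the most recent window without clearing the internal buffer.
data = board.get_current_board_data(sampling_rate)

for ch in eeg_channels:
    # Band-pass each channel in place to the range useful for visuals.
    DataFilter.perform_bandpass(data[ch], sampling_rate, 1.0, 45.0, 4,
                                FilterTypes.BUTTERWORTH.value, 0.0)

# Illustrative per-window normalisation to 0..1 for the VFX layer.
window = data[eeg_channels]
normalised = (window - window.min()) / (np.ptp(window) + 1e-9)

board.stop_stream()
board.release_session()
```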
Custom Rust module detects beats and audio onsets with millisecond accuracy, synchronised with Unity over UDP.
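The Rust source isn’t part of this write-up, so here is a hedged Python sketch of one standard onset-detection technique, spectral flux, that a module like this could use; frame sizes and the threshold are illustrative. Each detected onset would then be stamped and serialised into a datagram like the one sketched earlier.

```python
import numpy as np

def spectral_flux_onsets(samples: np.ndarray, sample_rate: int,
                         frame_size: int = 1024, hop: int = 512,
                         threshold: float = 1.5) -> np.ndarray:
    """Return onset times in seconds via positive spectral flux
    (an illustrative sketch, not the actual Rust implementation)."""
    window = np.hanning(frame_size)
    prev_mag = np.zeros(frame_size // 2 + 1)
    flux = []
    for start in range(0, len(samples) - frame_size, hop):
        frame = samples[start:start + frame_size] * window
        mag = np.abs(np.fft.rfft(frame))
        # Sum only the magnitude increases between consecutive frames.
        flux.append(np.maximum(mag - prev_mag, 0.0).sum())
        prev_mag = mag
    flux = np.asarray(flux)
    # Flag frames whose flux exceeds a multiple of the global median.
    onset_frames = np.where(flux > threshold * (np.median(flux) + 1e-9))[0]
    return onset_frames * hop / sample_rate

# Example: four clicks in one second of silence (illustrative input).
sr = 44100
audio = np.zeros(sr)
audio[::sr // 4] = 1.0
print(spectral_flux_onsets(audio, sr))
```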
Unity VFX layer with shaders, particles, and mesh deformation driven by modular adapters for rapid creative iteration.
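The real adapters are Unity-side components; purely to illustrate the pattern this describes, here is a minimal Python sketch where each adapter binds one named signal to one visual parameter, so new effects can be registered without touching the ingest code. All names here are hypothetical.

```python
from typing import Callable, Dict

# Registry mapping incoming signal names to parameter-update callbacks.
adapters: Dict[str, Callable[[float], None]] = {}

def adapter(signal: str):
    """Register a function as the handler for one incoming signal."""
    def register(fn: Callable[[float], None]):
        adapters[signal] = fn
        return fn
    return register

@adapter("eeg.alpha")
def drive_brain_glow(value: float) -> None:
    # In Unity this would set a shader/material property, e.g. emission.
    print(f"glow intensity -> {value:.2f}")

@adapter("audio.onset_strength")
def drive_particle_burst(value: float) -> None:
    # In Unity this would trigger a particle burst scaled by the value.
    print(f"particle burst -> {value:.2f}")

def dispatch(packet: Dict[str, float]) -> None:
    """Route flattened signal values to whichever adapters are registered."""
    for name, value in packet.items():
        if name in adapters:
            adapters[name](value)

dispatch({"eeg.alpha": 0.42, "audio.onset_strength": 0.87})
```

Keeping the mapping in a registry like this is what makes creative iteration fast: a new shader or particle effect only needs a new adapter, not changes to the EEG or audio ingest path.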
Let's work together to create something equally amazing for your next Unity project. Whether it's VR, EEG visualisation, or game development, I'm here to help.