BrainBERT Projects


BrainBERT

BrainBERT is a reusable Transformer model for self-supervised representation learning on intracranial neural recordings (iEEG).

BrainBERT is a high-performance, subject-agnostic Transformer model that applies self-supervised learning, in the spirit of BERT in NLP, to intracranial recordings (iEEG) by pretraining on large, unannotated neural datasets. The core mechanism converts the raw neural signal into super-resolution spectrograms and trains with a masked modeling objective that reconstructs the masked activity. BrainBERT's contextualized embeddings substantially boost linear decoder performance across multiple tasks, such as sentence onset, pitch, and volume decoding, reaching an average AUC of 0.83 versus 0.63 for the best baseline. Crucially, the model generalizes: it is effective off-the-shelf for new subjects and novel electrode locations, greatly improving data efficiency for neuroscience and BCI applications.
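The masked spectrogram modeling objective described above can be sketched roughly as follows. This is a minimal illustration, not the repository's actual implementation: the class name, layer sizes, masking rate, and loss choice are all assumptions made for the example.

```python
# Sketch of BrainBERT-style masked spectrogram modeling (illustrative only).
import torch
import torch.nn as nn

class MaskedSpectrogramModel(nn.Module):
    def __init__(self, n_freq_bins=40, d_model=128, n_layers=2, n_heads=4):
        super().__init__()
        # Project each spectrogram time step (a vector of frequency bins)
        # into the Transformer's embedding space.
        self.embed = nn.Linear(n_freq_bins, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        # Reconstruction head maps contextual embeddings back to frequency bins.
        self.head = nn.Linear(d_model, n_freq_bins)

    def forward(self, spec, mask):
        # spec: (batch, time, freq); mask: (batch, time) bool, True = masked.
        x = spec.masked_fill(mask.unsqueeze(-1), 0.0)  # hide masked time steps
        h = self.encoder(self.embed(x))                # contextualized embeddings
        return self.head(h)                            # reconstructed spectrogram

spec = torch.randn(2, 50, 40)        # toy batch: 2 clips, 50 steps, 40 freq bins
mask = torch.rand(2, 50) < 0.15      # mask roughly 15% of time steps
mask[:, 0] = True                    # guarantee at least one masked step
model = MaskedSpectrogramModel()
recon = model(spec, mask)
# Self-supervised loss: reconstruct activity only at the masked positions.
loss = nn.functional.mse_loss(recon[mask], spec[mask])
```

After pretraining, the encoder's output `h` would serve as the frozen embedding that a lightweight linear decoder is trained on for downstream tasks such as sentence-onset or pitch decoding.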

https://github.com/czlwang/BrainBERT
1 project · 1 city

