Decoding the memorization of individual stimuli with direct human brain recordings.
Abstract
Through decades of research, neuroscientists and clinicians have identified an array of brain areas that each activate when a person views a certain category of stimuli. However, we lack a detailed understanding of how the brain represents individual stimuli within a category. Here we used direct human brain recordings and machine-learning algorithms to characterize the distributed patterns that distinguish specific cognitive states. Epilepsy patients with surgically implanted electrodes performed a working-memory task, and we applied machine-learning classifiers to predict the identity of each viewed stimulus. We found that the brain's representation of stimulus-specific information is distributed across neural activity at multiple frequencies, electrodes, and timepoints. Stimulus-specific neuronal activity was most prominent in the high-gamma (65-128 Hz) and theta/alpha (4-16 Hz) bands, but the properties of these signals differed significantly between individuals and for novel stimuli compared to common ones. Our findings inform the neural basis of memory and the development of brain-computer interfaces by showing that the brain distinguishes specific cognitive states through diverse spatiotemporal patterns of neuronal activity.