Faculty
Science/Schulich Medicine & Dentistry
Supervisor Name
J Bruce Morton
Keywords
Brain Entropy, Resting-state fMRI, Brain-States, Information Theory
Description
Brain entropy is a measure that has been increasingly studied in neuroscience over the past decade. It is based on Shannon entropy, a measure from information theory that quantifies the information capacity of a system from the probability distribution of its states. Brain entropy is therefore posited to reflect the information capacity of the brain, and it has been linked to various cognitive abilities and states. However, most studies of brain entropy compute the time-series entropy of each voxel independently, ignoring any patterns that emerge from the relations between voxels. Here, we measured the brain entropy of resting-state fMRI data from recurrent patterns of activation across the cortex and found that this state-based entropy was positively correlated with voxel-wise time-series entropy, suggesting that entropic voxels are indicative of large, complex repertoires of brain activation states. That said, we also found that time-series entropy measures are sensitive to frequency: the more low-frequency power a signal contains, the lower its measured entropy. This is potentially problematic for periodic signals, because all sinusoidal oscillations are equally predictable, so high-frequency oscillations should not be considered more entropic than low-frequency ones.
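
To make the two notions of entropy concrete, below is a minimal sketch in Python using synthetic data in place of real resting-state fMRI. The state-based measure assigns each time frame to one of k recurring activation patterns with k-means and takes the Shannon entropy H = -sum(p_i * log2(p_i)) of the state-occupancy probabilities; the voxel-wise measure is a simple sample-entropy implementation. The choice of k = 8 states, the use of k-means, and the sample-entropy settings (m = 2, r = 0.2 * SD) are illustrative assumptions, not the poster's actual pipeline. The last lines demonstrate the frequency sensitivity noted above: shifting power toward low frequencies (here, by smoothing broadband noise) lowers the measured time-series entropy.

    import numpy as np
    from sklearn.cluster import KMeans

    def sample_entropy(x, m=2, r=None):
        # Approximate sample entropy: negative log of the conditional probability
        # that templates matching for m points also match for m + 1 points.
        x = np.asarray(x, dtype=float)
        if r is None:
            r = 0.2 * np.std(x)                      # tolerance = 0.2 * SD (common default)
        n = len(x)

        def match_count(length):
            templates = np.array([x[i:i + length] for i in range(n - length)])
            # Chebyshev distance between every pair of templates
            dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
            return np.sum(dist <= r) - len(templates)   # exclude self-matches

        b = match_count(m)
        a = match_count(m + 1)
        return np.inf if a == 0 or b == 0 else -np.log(a / b)

    def state_entropy_bits(bold, k=8, seed=0):
        # Shannon entropy (bits) of brain-state occupancy: each time frame is
        # assigned to one of k recurring activation patterns via k-means.
        labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(bold)
        p = np.bincount(labels, minlength=k) / len(labels)
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    rng = np.random.default_rng(1)
    bold = rng.standard_normal((200, 500))           # 200 volumes x 500 voxels (synthetic)
    print("state-occupancy entropy (bits):", state_entropy_bits(bold))
    print("sample entropy of one voxel   :", sample_entropy(bold[:, 0]))

    # Frequency sensitivity: pushing power toward low frequencies lowers
    # measured time-series entropy for the same underlying noise process.
    white = rng.standard_normal(2000)
    slow = np.convolve(white, np.ones(10) / 10, mode="valid")   # moving-average low-pass
    print("sample entropy, broadband noise:", sample_entropy(white[:500]))
    print("sample entropy, low-pass noise :", sample_entropy(slow[:500]))

Running the sketch, the smoothed (low-frequency-dominated) series scores well below the broadband one, illustrating how spectral content alone can move a time-series entropy measure.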
Acknowledgements
Thanks to Dr. J Bruce Morton and the CDNL, Dr. Mark Daley, and the Western Institute for Neuroscience for their support.
Creative Commons License
This work is licensed under a Creative Commons Attribution-Noncommercial 4.0 License
Document Type
Poster
Entropic Voxels Indicate Large Brain-State Repertoires