Master of Science
Dr. Jessica Grahn
Dr. Adrian Owen
This study explored whether perceived and imagined musical stimuli could be accurately classified from EEG data. Successful EEG-based classification of what an individual is imagining could pave the way for novel communication techniques, such as brain-computer interfaces. We recorded EEG with a 64-channel BioSemi system while participants heard or imagined different musical stimuli. Using principal components analysis, we identified components common to both the perception and imagination conditions; however, the time courses of the components did not allow for stimulus classification. We then applied deep learning techniques using a convolutional neural network. This technique enabled us to classify perception of music with a statistically significant accuracy of 28.7%, but we were unable to classify imagination of music (accuracy = 7.41%). Future studies should aim to determine which characteristics of music drive perception classification rates, and to capitalize on those characteristics to raise imagination classification rates.
Sternin, Avital, "Classifying music perception and imagination using EEG" (2016). Electronic Thesis and Dissertation Repository. 3769.