Monday, January 29, 2018

How and Where Your Brain Responds to Music Reveals If You Are a Musician

How your brain responds to music listening can reveal whether you have received musical training, according to new Nordic research conducted in Finland (University of Jyväskylä and AMI Centre) and Denmark (Aarhus University). By applying computational music analysis and machine learning to brain imaging data collected during music listening, the researchers were able to predict with significant accuracy whether the listeners were musicians or not.

These results emphasize the striking impact of musical training on our neural responses to music: it is strong enough to distinguish musicians' brains from non-musicians' brains despite independent factors such as musical preference and familiarity. The research also revealed that the brain areas that best predict musicianship lie predominantly in the frontal and temporal regions of the right hemisphere. These findings are consistent with previous work on how the brain processes certain acoustic characteristics of music, as well as intonation in speech. The paper was published on January 15 in the journal Scientific Reports.

Figure: Brain areas best predicting musicianship. Red: left/right anterior cingulate gyrus; Green: right inferior frontal gyrus; Blue: right superior temporal gyrus; Gray: caudate nucleus, middle frontal gyrus, inferior frontal gyrus.

The study utilized functional magnetic resonance imaging (fMRI) brain data collected by Prof. Elvira Brattico’s team (previously at University of Helsinki and currently at Aarhus University) from 18 musicians and 18 non-musicians while they attentively listened to music of different genres. Computational algorithms were applied to extract musical features from the presented music.

"A novel feature of our approach was that, instead of relying on static representations of brain activity, we modelled how music is processed in the brain over time," explains Pasi Saari, post-doctoral researcher at the University of Jyväskylä and the main author of the study. "Taking the temporal dynamics into account was found to improve the results remarkably." As the last step of modelling, the researchers used machine learning to form a model that predicts musicianship from a combination of brain regions.

The machine learning model was able to predict the listeners' musicianship with 77% accuracy, a result that is on a par with similar studies on participant classification with, for example, clinical populations of brain-damaged patients. The areas where music processing best predicted musicianship resided mostly in the right hemisphere, and included areas previously found to be associated with engagement and attention, processing of musical conventions, and processing of music-related sound features (e.g. pitch and tonality).
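Classification of two small participant groups is typically evaluated by holding listeners out one at a time. The following numpy-only sketch shows the general idea on synthetic data with a simple nearest-centroid classifier; the study's actual features, model, and validation scheme are described in the paper, not here:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in data: one feature-coupling value per brain region for
# each listener, with the "musician" group (label 1) showing slightly
# stronger coupling on average. Group sizes mirror the study (18 + 18).
n_per_group, n_regions = 18, 6
musicians = rng.normal(loc=1.0, scale=1.0, size=(n_per_group, n_regions))
nonmusicians = rng.normal(loc=0.0, scale=1.0, size=(n_per_group, n_regions))
X = np.vstack([musicians, nonmusicians])
y = np.array([1] * n_per_group + [0] * n_per_group)

# Leave-one-out cross-validation: each listener is held out in turn, group
# centroids are computed from the rest, and the held-out listener is
# assigned to the nearer centroid.
correct = 0
for i in range(len(y)):
    train = np.ones(len(y), dtype=bool)
    train[i] = False
    centroid1 = X[train & (y == 1)].mean(axis=0)
    centroid0 = X[train & (y == 0)].mean(axis=0)
    pred = int(np.linalg.norm(X[i] - centroid1) < np.linalg.norm(X[i] - centroid0))
    correct += pred == y[i]

accuracy = correct / len(y)
print(f"leave-one-out accuracy: {accuracy:.2f}")
```

With 36 participants, each held-out prediction moves the accuracy by about 3 percentage points, which is why figures such as 77% are reported against a 50% chance level rather than as exact error rates.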

"These areas can be regarded as core structures in music processing which are most affected by intensive, lifelong musical training," states Iballa Burunat, a co-author of the study. In these areas, the processing of higher-level features such as tonality and pulse was the best predictor of musicianship, suggesting that musical training affects particularly the processing of these aspects of music.

"The novelty of our approach is the integration of computational acoustic feature extraction with functional neuroimaging measures, obtained in a realistic music-listening environment, and taking into account the dynamics of neural processing. It represents a significant contribution that complements recent brain-reading methods which decode participant information from brain activity in realistic conditions," concludes Petri Toiviainen, the senior author of the study. The research was funded by the Academy of Finland and Danish National Research Foundation.


Contacts and sources:
Petri Toiviainen
Department of Music, Art and Culture Studies, University of Jyväskylä, Finland

Elvira Brattico 
Center for Music in the Brain (MIB), Department of Clinical Medicine, Aarhus University & The Royal Academy of Music Aarhus/Aalborg (RAMA), Aarhus, Denmark

Citation: Saari, P., Burunat, I., Brattico, E., & Toiviainen, P. (2018). Decoding musical training from dynamic processing of musical features in the brain. Scientific Reports 8, Article number: 708. DOI: 10.1038/s41598-018-19177-5
