
New algorithm allows for potential 'brain-reading'

By Tim Sandle     Mar 11, 2018 in Science
Rio de Janeiro - A new algorithm paves the way for potential 'brain-reading': machine learning technology that can identify musical pieces from fMRI scans of the listener.
The new algorithm, which can identify pieces of music from a listener's functional magnetic resonance imaging data, comes from scientists based at the D'Or Institute for Research and Education (Rio de Janeiro, Brazil). The research is of wider importance because encoding models enable scientists to make explicit assessments of different theoretical models or representations in the brain. In turn, this can lead to developments in brain–machine communication.
A brain–computer interface is a direct communication pathway between an enhanced or wired brain and an external device. This type of emerging technology creates a mutual understanding between users and the surrounding systems, with potential uses in education, self-regulation, production, marketing and security, as well as in games and entertainment.
The scientists developed the method as a two-stage approach combining encoding and decoding. This involved mapping the brain responses elicited by listening to the music, according to the website Biotechniques, and then using the gathered information to identify novel musical pieces from functional magnetic resonance imaging data alone. Functional magnetic resonance imaging is a method of measuring brain activity by detecting changes associated with blood flow. The approach relies on the fact that cerebral blood flow and neuronal activation are coupled: when an area of the brain is in use, blood flow to that region also increases.
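As a rough illustration of this encoding-then-decoding idea, the sketch below fits a linear (ridge) regression from musical features to simulated voxel responses, then identifies a held-out piece by correlating its predicted response pattern against a measured scan. It is a minimal sketch with synthetic data and assumed variable names, not the authors' implementation.

```python
# Minimal sketch of an encoding/decoding identification pipeline.
# Synthetic data stands in for real musical features and fMRI responses.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_train, n_test, n_features, n_voxels = 32, 8, 20, 500

# Musical features (e.g. tonality, dynamics, rhythm, timbre descriptors)
# and the corresponding fMRI voxel responses for each piece.
X_train = rng.normal(size=(n_train, n_features))
true_W = rng.normal(size=(n_features, n_voxels))
Y_train = X_train @ true_W + 0.5 * rng.normal(size=(n_train, n_voxels))

# Stage 1 (encoding): learn a mapping from musical features to voxel responses.
encoder = Ridge(alpha=1.0).fit(X_train, Y_train)

# Stage 2 (decoding/identification): predict responses for candidate pieces
# and pick the one whose prediction best matches the measured scan.
X_test = rng.normal(size=(n_test, n_features))
Y_test = X_test @ true_W + 0.5 * rng.normal(size=(n_test, n_voxels))
Y_pred = encoder.predict(X_test)

measured = Y_test[3]  # fMRI response to an "unknown" piece
correlations = [np.corrcoef(measured, pred)[0, 1] for pred in Y_pred]
identified = int(np.argmax(correlations))
print(f"Identified piece {identified} (true piece was 3)")
```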
The research shows that, by learning the correct mapping of musical features to brain activity, researchers can predict and decode novel musical pieces. The model was built by analyzing six participants who listened to forty different pieces of music. From these sessions, the algorithm encoded the listeners' functional magnetic resonance imaging responses to individual pieces, drawing on musical features such as tonality, dynamics, rhythm and timbre.
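To give a concrete sense of what features like tonality, dynamics, rhythm and timbre look like computationally, the sketch below extracts rough stand-ins for them from an audio file using the librosa library. The file path and the specific descriptors are assumptions for illustration; the paper's exact feature set may differ.

```python
# Illustrative stand-ins for the musical features mentioned above,
# extracted with librosa. File path and feature choices are assumptions.
import numpy as np
import librosa

y, sr = librosa.load("piece.wav")  # hypothetical audio file

features = {
    # Tonality: average chroma energy across the 12 pitch classes.
    "tonality": librosa.feature.chroma_stft(y=y, sr=sr).mean(axis=1),
    # Dynamics: root-mean-square energy over time.
    "dynamics": librosa.feature.rms(y=y).mean(axis=1),
    # Rhythm: tempogram summarising local tempo/onset structure.
    "rhythm": librosa.feature.tempogram(y=y, sr=sr).mean(axis=1),
    # Timbre: mel-frequency cepstral coefficients.
    "timbre": librosa.feature.mfcc(y=y, sr=sr).mean(axis=1),
}

# Concatenate into one feature vector per piece, as an encoding
# model would consume it.
feature_vector = np.concatenate(list(features.values()))
print(feature_vector.shape)
```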
The research has been published in the journal Scientific Reports, with the associated paper titled "Identifying musical pieces from fMRI data using encoding and decoding models."