Op-Ed: Artificial Intelligence analyzes Beatle music progression

By Paul Wallis     Jul 25, 2014 in Science
Southfield - In what may be a major leap in analytics, researchers at Lawrence Technological University have correctly identified the Beatles' musical progression from their first album to their last, using Big Data technology. This is a real first, and it's pretty fascinating stuff.
Science Daily:
Assistant Professor Lior Shamir and graduate student Joe George had previously developed audio analysis technology to study the vocal communication of whales, and they expanded the algorithm to analyze the albums of the Beatles and other well-known bands such as Queen, U2, ABBA and Tears for Fears. The study, published in the August issue of the journal Pattern Recognition Letters, demonstrates scientifically that the structure of the Beatles music changes progressively from one album to the next.
The algorithm works by first converting each song to a spectrogram -- a visual representation of the audio content. That turns an audio analysis task into an image analysis problem, which is solved by applying comprehensive algorithms that turn each music spectrogram into a set of almost 3,000 numeric descriptors reflecting visual aspects such as textures, shapes and the statistical distribution of the pixels. Pattern recognition and statistical methods are then used to detect and quantify the similarities between different pieces of music.
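To make that pipeline concrete, here is a minimal sketch in Python of the spectrogram-to-descriptors idea. It assumes the librosa and numpy libraries are available; the handful of statistics below merely stand in for the nearly 3,000 descriptors the researchers actually extract, and the simple distance measure is an illustrative substitute for their pattern-recognition step, not the study's code.

```python
# Minimal sketch of the spectrogram-as-image pipeline described above.
# Assumptions (not from the study): librosa for audio handling, numpy for
# the descriptors; the five features below only stand in for the ~3,000
# numeric descriptors the researchers compute.
import numpy as np
import librosa

def song_descriptors(path):
    """Convert one song to a spectrogram, then to a small feature vector."""
    y, sr = librosa.load(path, mono=True)
    spec = np.abs(librosa.stft(y))            # spectrogram: audio -> "image"
    spec_db = librosa.amplitude_to_db(spec)   # pixel-like intensity values
    # Illustrative image-style statistics over the "pixels":
    return np.array([
        spec_db.mean(),                           # overall brightness
        spec_db.std(),                            # contrast
        np.median(spec_db),                       # distribution midpoint
        np.abs(np.diff(spec_db, axis=1)).mean(),  # texture across time
        np.abs(np.diff(spec_db, axis=0)).mean(),  # texture across frequency
    ])

def similarity(path_a, path_b):
    """Smaller distance = more similar songs, per the matching step."""
    return np.linalg.norm(song_descriptors(path_a) - song_descriptors(path_b))
```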
This algorithm obviously knows its stuff. It compared With the Beatles to Please Please Me and Abbey Road, and structured the time frame of the progression from there.
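As a hedged illustration of how such a timeline could be assembled from pairwise similarities (this is not the study's published method, just one plausible reading of it), one could average each album's per-song descriptors and rank the albums by distance from the debut, reusing song_descriptors from the sketch above:

```python
# Hypothetical ordering step: average per-song descriptors into an album
# vector, then rank albums by distance from the debut. Illustrative only.
import numpy as np

def album_vector(song_paths):
    """Average feature vector for one album (uses song_descriptors above)."""
    return np.mean([song_descriptors(p) for p in song_paths], axis=0)

def order_by_progression(albums):
    """albums: dict mapping album name -> list of song file paths.
    Returns album names sorted by distance from 'Please Please Me'."""
    vectors = {name: album_vector(paths) for name, paths in albums.items()}
    anchor = vectors["Please Please Me"]
    return sorted(vectors, key=lambda n: np.linalg.norm(vectors[n] - anchor))
```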
The algorithm also got a very tricky issue right:
"Let It Be" was the last album released by the Beatles, but the algorithm correctly identified those songs as having been recorded earlier than the songs on "Abbey Road."
Even Beatle fans don't always know that, although musicians usually get it right because of the huge changes and advances in recording that George Martin and the Beatles caused. Let It Be was a mishmash of cobbled-together sessions and a few leftovers. (Even then, they cut "Dig It" to a few seconds; the one real jam on the album might have given a different result.)
It’s obvious to us that the transition from the Fab Four to the later Beatles was a very clear change, musically as well as culturally, but this is the first time that it’s been demonstrated using analytic software. It’d be interesting to apply that process to composers like Mozart and Beethoven, and particularly the very fluid jazz masters like Miles Davis and Mingus.
Also note (terrible pun) that defining musical textures and shapes with pixels isn't exactly work for the fainthearted. The later Beatles threw out the rulebook and started creating some very new sounds, all distinct and, in analytical terms, difficult, sometimes one-off events. To identify the White Album, for example, the analysis would have to handle totally new guitar sounds, electronics and other elements which didn't exist at all on the earlier albums.
This sort of analysis isn't a gimme. If you were to analyze Jethro Tull, for example, you'd get a completely different profile to Pink Floyd or David Bowie, with totally different elements in the analytical pixel profiles.
Could you sell those profiles as art? Interesting thought.
I’ve included "Dig It," the original, which is a famous bootleg, and some other stuff to show the sheer range of sound, and sound quality, regarding textures, that the Beatles produced. It’s a huge progression, with a lot of digressions.
This opinion article was written by an independent writer. The opinions and views expressed herein are those of the author and are not necessarily intended to reflect those of DigitalJournal.com