Some music packs a punch the first time you hear it. “Daddy’s Gone”, by the Scottish indie-rock band Glasvegas, came on the radio while I was driving to work a few years ago. I was late for a meeting, having just screamed at my kids in a futile attempt to get them ready for school on time. The tale of a boy without a father in his life, with lyrics about the loss experienced by men who turn their backs on their families, made me realise how ungrateful I was for my own privileged position, and reminded me of the working-class morals of the Scottish town I had left when I was 18. I had to stop the car to cry.
The way we respond to songs is highly personal. My emotions were a result of patterns and connections present only in my own head. So the idea that an algorithm could predict a listener’s emotional reaction to a song might seem rather fanciful. Spotify, the music streaming service, thinks otherwise. Its developers claim to have created an algorithm that can tell the difference between a happy song and a sad one. It’s part of its strategy to create as personalised a listening experience as possible: keeping users on the platform by helping them to discover new favourites is what gives Spotify the edge over competitors including Apple Music and Tidal. The company was quick to realise that classifying music emotionally could result in more satisfying recommendations for its users. If you’re in a happy mood after work on a Friday, you probably don’t want to listen to a gloomy Scottish indie band.
Sorting 30m songs into different emotional categories is a tough task. It is practically impossible for humans to analyse each of these songs individually. Step forward Glenn McDonald, Spotify’s Data Alchemist (he was allowed to choose his own job title). “I’m not searching for abstract truths, I’m trying to find music people will respond to,” he tells me. Every recommendation Spotify gives uses the emotional classification system developed by McDonald and his colleagues at the Echo Nest, a startup that Spotify bought three years ago. The algorithm combines musical properties like volume, tempo and – most importantly – energy with “emotional valence”, a measure of how happy or sad a song makes you feel. Generally speaking, high-valence sounds make people feel positive emotions, while low-valence sounds are associated with negative emotions. To train the algorithm, human testers listen to pairs of songs and are asked to decide which song made them sadder, or which song sounded bouncier. These human judgments are fed into a computer to teach it how to analyse songs for itself.
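There are several standard ways to turn pairwise judgments like these into a per-song score; one is an Elo-style rating update, familiar from chess. The sketch below is purely illustrative – the song names, ratings and judgments are hypothetical, and this is not Spotify’s actual training method – but it shows how “which song is sadder?” answers can accumulate into a ranking.

```python
# Illustrative sketch: turn pairwise "which song is sadder?" judgments
# into per-song sadness scores via an Elo-style update.
# NOT Spotify's actual method; all songs and numbers are hypothetical.

def elo_update(winner, loser, ratings, k=32):
    """Move the 'winner' (the song judged sadder) up and the 'loser' down."""
    expected_win = 1 / (1 + 10 ** ((ratings[loser] - ratings[winner]) / 400))
    ratings[winner] += k * (1 - expected_win)
    ratings[loser] -= k * (1 - expected_win)

# Every song starts at a neutral rating.
ratings = {"Samson": 1000.0, "Happy": 1000.0, "Someone Like You": 1000.0}

# Hypothetical listener judgments: (sadder song, less sad song).
judgments = [
    ("Samson", "Happy"),
    ("Someone Like You", "Happy"),
    ("Someone Like You", "Samson"),
]

for sadder, less_sad in judgments:
    elo_update(sadder, less_sad, ratings)

# After these comparisons the two ballads outrank "Happy".
print(sorted(ratings, key=ratings.get, reverse=True))
# → ['Someone Like You', 'Samson', 'Happy']
```

The appeal of the pairwise setup is that people are far more reliable at comparing two songs than at assigning an absolute sadness number to one song in isolation.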
This chart shows how Spotify’s algorithm would classify a selection of songs, according to their emotional valence and energy. In the Sad quadrant, the area of the soundscape with both low emotional valence and low energy, we find Regina Spektor’s “Samson” and Adele’s “Someone Like You”. In the Happy quadrant we find “Jump (For My Love)” by the Pointer Sisters, Justin Timberlake’s “Can’t Stop the Feeling” and, perhaps unsurprisingly, Pharrell Williams’s hit “Happy”. I am taken aback to discover that Spotify views “Daddy’s Gone” as more “angry” than “sad”.
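The quadrant layout maps onto a simple two-axis rule. Assuming valence and energy are each normalised to a 0–1 scale with 0.5 as the midpoint of each axis (the thresholds and example values below are illustrative, and the article itself names only three quadrants – “Calm” for the fourth is a common but assumed label):

```python
# Sketch of the valence/energy quadrant classification described above.
# Assumes both measures are normalised to 0-1 with 0.5 as the midpoint;
# thresholds, labels and example values are illustrative assumptions.

def mood_quadrant(valence, energy):
    if valence >= 0.5:
        return "Happy" if energy >= 0.5 else "Calm"
    return "Angry" if energy >= 0.5 else "Sad"

# Hypothetical values consistent with the article's placements.
print(mood_quadrant(valence=0.9, energy=0.8))  # Happy  - e.g. "Happy"
print(mood_quadrant(valence=0.2, energy=0.2))  # Sad    - e.g. "Someone Like You"
print(mood_quadrant(valence=0.3, energy=0.8))  # Angry  - e.g. "Daddy's Gone"
```

The last line is the surprise in the chart: a song can score low on valence yet high on energy, which is how a track that feels devastating to one listener ends up filed under “angry” rather than “sad”.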
It throws up a notable anomaly: Dusty Springfield’s “You Don’t Have to Say You Love Me”, in the Happy quadrant, may sound upbeat but listen to the lyrics and it’s a sorry tale of desperation and unrequited love. Spotify doesn’t currently analyse lyrics, so it misses these subtleties. Charlie Thompson, an independent data analyst, decided to fix this by making his own algorithm. His “gloom index” uses data extracted from the music community Genius, which contains lyrics to more than 25m songs. An algorithm detects sad or angry words, like “hate”, “loneliness”, “kill” and “leave” – and combines this information with Spotify’s emotional valence measurement.
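The basic recipe – score the lyrics for sad words, then blend that with valence – can be sketched in a few lines. The word list, the equal weighting and the 0–100 scale below are illustrative assumptions, not Thompson’s exact formula:

```python
# Hedged sketch of a "gloom index": blend the density of sad/angry words
# in the lyrics with Spotify's valence score. Word list, weighting and
# the 0-100 scale are illustrative, not Thompson's actual formula.

SAD_WORDS = {"hate", "loneliness", "lonely", "kill", "leave", "cry", "gone"}

def gloom_index(lyrics, valence, lyric_weight=0.5):
    words = lyrics.lower().split()
    sad_density = sum(w.strip(".,!?") in SAD_WORDS for w in words) / max(len(words), 1)
    # Low valence and many sad words both push the index towards 100.
    return 100 * (lyric_weight * sad_density + (1 - lyric_weight) * (1 - valence))

# Hypothetical lyric snippet and valence value.
print(round(gloom_index("gone gone my daddy is gone and I cry", valence=0.3), 1))
```

A bag-of-words score like this is exactly why such an index struggles with sarcasm and double negatives: it counts words, but it cannot read how they combine.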
I was pleased to see that the algorithm had moved many of my own favourite sad songs – “The Drugs Don’t Work” by the Verve, “Back to Black” by Amy Winehouse and “Suicidal Thoughts” by the Notorious B.I.G. – firmly into the Sad quadrant. Songs with very direct messages, such as “Bleeding Out” by Imagine Dragons, are also accurately classified. However, it didn’t pick up on the sarcasm of “Happy Little Pill” or the double negatives of “Can’t Stop the Feeling”. Sadly, it, too, misunderstood Dusty. It’s hard for an algorithm to learn how combinations of words create emotion. McDonald admits Spotify also has yet to solve the lyrics problem. Instead, it is beginning to analyse how people talk about music on Facebook and other social media platforms. By logging which songs are discussed together, its algorithm will get better at tailoring recommendations to certain types of people.
While writing this article, I spent a couple of weeks listening on Spotify to the songs I found truly sad. When I used Discover Weekly, an automated playlist of songs Spotify thinks you will enjoy, the recommendations had become gloomier, but they didn’t have the same emotional effect as “Daddy’s Gone” and “Back to Black”. I expected McDonald to be disappointed when I said I often found myself skipping song after song, but he told me: “We can’t expect to capture how you personally attach to a song…We do very well at generating playlists for social occasions and the number of skipped songs is low. But if we are trying to suggest a new song to you as an individual then we’re satisfied if you like every tenth recommendation.”
The song that makes McDonald the most emotional is “Home is Where the Heart Is” by Sally Fingerett. “It is the song that has most often caused me to cry while listening to it.” It also happens to be right in the middle of the sad quadrant of his chart. I listened to it and, to start with, I found the lyrics a bit corny. But as it developed I felt tears coming to my eyes. Why? I’ve no idea. Maybe the algorithm knows something we don’t.