Music personalization algorithms
You know, those funky little dudes that make the world go “1” instead of “0”. The same guys that make your phone’s apps run the way they were programmed are basically the ones that guide space rockets along their planned approaches towards the inevitable discovery of alien life forms. Scary? “0”.
There are so many algorithms in our daily lives that we probably don’t even realize how bad things would be without them. Some make your daily commute to work smooth by controlling the traffic lights. Then there are those that fill your playlists with good music. They’re just helpful.
Let’s pop the hoods of the three biggest music streaming platforms and look at the algorithms behind them.
Have you ever stripped down an alarm clock to see how it’s made, and after removing the first few “thingies”, realized that there is no chance you could put it back together? (One of us did. He blamed his sister.) If you are curious like that, we have something for you.
Spotify’s “Discover Weekly” algorithm
Spotify’s “Discover Weekly” is a thing we’ve talked about some time ago. Today, we’ve got a Glitch project to explain how it all works. It’s an app that lets users “simulate” the inner workings of Spotify’s algorithm. It uses several categories: Acousticness, Danceability, Energy, Instrumentalness, Liveness, Speechiness, Tempo, and Valence. Each, with a value between “0” and “1”, gives the algorithm a way of describing a track and suggesting others that score similarly. The idea doesn’t work 100%, but it gets us all closer to how Spotify delivers its awesome playlists every week. While “Acousticness” is rather self-explanatory (the piece is recognized as either acoustic or not), “Valence” might not be (it describes how positive a track sounds). We have to explain “Instrumentalness” because of a joke we can’t let go. You see, the program tries to recognize whether the track contains vocals. If the outcome value is close to “1”, that means the piece is judged to be just music, no vocals in it.
However, the thing doesn’t recognize the difference between sounds like “ooh” and actual lyrics, so it might suggest one of Rihanna’s songs to you as non-vocal. “Speechiness” might land you with some cool new trip-hop pieces, only ruined by recorded speech samples that were supposed to “enrich” the music in an “original” way (just like millions of others). Seriously, why do people do that? “0” idea.
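The feature-vector idea above can be sketched in a few lines of Python. This is not Spotify’s actual code, just a toy nearest-neighbour recommender over the eight categories; the track names and scores are made up, and Tempo is treated here as already normalized to 0–1 for illustration.

```python
from math import sqrt

# The eight categories from the article, each assumed to be scored 0-1
# (Spotify's real tempo value is in BPM; we pretend it's normalized here).
FEATURES = ["acousticness", "danceability", "energy", "instrumentalness",
            "liveness", "speechiness", "tempo", "valence"]

def cosine_similarity(a, b):
    """Similarity between two feature dicts: 1.0 means same direction."""
    dot = sum(a[f] * b[f] for f in FEATURES)
    norm_a = sqrt(sum(a[f] ** 2 for f in FEATURES))
    norm_b = sqrt(sum(b[f] ** 2 for f in FEATURES))
    return dot / (norm_a * norm_b)

def suggest(seed_features, catalog, n=2):
    """Return titles of the n catalog tracks most similar to the seed."""
    ranked = sorted(catalog,
                    key=lambda t: cosine_similarity(seed_features, t["features"]),
                    reverse=True)
    return [t["title"] for t in ranked[:n]]
```

Feed it the features of a song you like and a catalog of candidate tracks, and the ones pointing in the same “direction” across those eight axes come out on top, which is roughly the intuition the Glitch app is demonstrating.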
Google Play Music – New Release Radio
This one’s clever. Google Play Music uses machine learning not just to get to know your music taste. It also learns about your activities and, like any other stalker, uses them to produce awkwardly detailed information about you that you might not have known yourself. In this case, however, it’s all cool. The machine just knows when to approach you with smooth jazz and when you might want some death metal. By checking in with other Google-connected services (a smartphone with Google Calendar, maybe?), it knows when you’re hitting the gym and that you’re binge-watching “Friends” instead of meeting with your own on Sunday. The app will simply provide you with music that fits your current mood and situation. Again, scary? Not really. That is, of course, if you’re not paranoid about constant surveillance. Are you?
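To make the context-aware idea concrete, here is a deliberately simple, rule-based sketch. Google’s real system is a proprietary machine-learning model, not a lookup table; the activities and genres below are invented purely to show the shape of mapping a detected activity to a playlist mood.

```python
# Hypothetical activity-to-mood table: these keys and genres are made up
# for illustration, not taken from any real Google API.
CONTEXT_TO_GENRE = {
    "gym": "death metal",
    "commute": "smooth jazz",
    "sunday_binge_watching": "easy listening",
}

def pick_genre(calendar_event):
    """Map a detected activity to a playlist mood, with a safe default."""
    return CONTEXT_TO_GENRE.get(calendar_event, "your usual mix")
```

The real system presumably learns these associations from your behaviour instead of hard-coding them, but the input/output contract is the same: context in, playlist mood out.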
Apple Music goes deeper with New Music Mix
Apple Music’s algorithm is a hybrid: it combines each user’s listening history with algorithmic suggestions based on both recent choices and the analyzed data. They say it’s been their plan since the beginning, but only with iOS 10 did it reach its full potential. We don’t know about you, but we have this strange urge to look up every second word spewed from Apple people’s mouths, just to double-check the info (their definition of “brand new” seems to be the same as our “the same, but pricier”). Say what you want about Apple, they know how to market their things. They also know how to make their products… well, maybe not “brand new”, but different. Better? “1”? “0”?
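A common way to build a hybrid like that is a weighted blend of a familiarity score (from listening history) and a discovery score (from the algorithm’s fresh suggestions). This is only our sketch of the general technique, not Apple’s actual recipe; the weight and the example scores are invented.

```python
def hybrid_score(history_affinity, discovery_score, history_weight=0.6):
    """Blend familiarity with novelty. Both inputs are assumed 0-1 scores;
    the 0.6 weight is a guess for illustration, not Apple's real number."""
    return history_weight * history_affinity + (1 - history_weight) * discovery_score

def rank_candidates(candidates):
    """candidates: list of (title, history_affinity, discovery_score) tuples,
    returned best-first by the blended score."""
    return sorted(candidates,
                  key=lambda c: hybrid_score(c[1], c[2]),
                  reverse=True)
```

Tilt the weight toward history and the mix plays it safe; tilt it toward discovery and you get more “New Music” in your New Music Mix. Presumably Apple tunes that trade-off per user.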
If we had to write the code ourselves, we would change the industry forever by introducing the “meh” parameter.