Hello fellow aggregators, and welcome to a celebration of teamwork with data. Rob Mitchum is one such aggregator, and for the past 5 years he has been putting together a spreadsheet per year of all the top 50 AOTY lists released by reputable publications. For this post I did something I don’t normally do, and that’s ask permission to use data, specifically that spreadsheet.
More information can be found on his page, but, essentially, Rob finds all the lists he deems worthy and hand-scores their ranks into a Google Sheet. From there, he calculates the average rank for each album (the AVG column), the average rank divided by the number of appearances (the WT-AVG column), and a consensus score (the CONS SCORE column), which is the average rank across all lists if every non-appearance is given a rank of 75 (an album is thus penalized for every publication ranking it doesn’t appear on). The albums are sorted by that final consensus score, and as of January 11, 2018 (when I downloaded the data), Kendrick’s Damn. stands at number 1.
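To make the three columns concrete, here is a minimal sketch of how they could be computed with pandas. The rank matrix below is a made-up toy example, not the actual spreadsheet data, and the penalty value of 75 is the one described above:

```python
import numpy as np
import pandas as pd

# Toy rank matrix: rows are albums, columns are publications,
# values are the rank each publication gave (NaN = did not appear).
ranks = pd.DataFrame(
    {"PFORK": [1, 4, np.nan], "NPR": [2, np.nan, 10], "SPIN": [1, 6, np.nan]},
    index=["damn.", "melodrama", "ctrl"],
)

avg = ranks.mean(axis=1)                  # AVG: mean of the ranks an album received
wt_avg = avg / ranks.notna().sum(axis=1)  # WT-AVG: AVG divided by number of appearances
cons = ranks.fillna(75).mean(axis=1)      # CONS SCORE: non-appearances count as rank 75

print(cons.sort_values())  # lower consensus score = better
```

With this toy data, damn. appears on all three lists near the top, so its consensus score stays low, while ctrl gets hit with two 75-rank penalties despite a decent NPR placement.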
This very site released its own set of end-of-year album lists, so I thought it would be cool to compare our lists to the individual publications and the consensus score rankings. To accomplish this, I grabbed the staff top 50 list, the user-voted top 50 list, and the rating-based, user-usage-adjusted top 50 from my year-end rankings post. It wasn’t entirely seamless: I had to change some album titles and band names to resolve discrepancies between the sputnikmusic database (as well as any artifacts from my own scraping of the database) and the AOTY spreadsheet. One more quirk: since there is effectively a loudness war over which publication can release its AOTY list first, I also had to scrape one album from late 2016 (Run the Jewels 3), since it appeared on a few lists for 2017.
(For posterity’s sake, here are the hyperlinked column headers of the AOTY spreadsheet corresponding to each publication’s top 50 album list website link (and the sputnik ones I added): ALTERN, CMPLX, CONSENSUS, CoS, CRACK, DRWND, FACT, GvsB, LINE, MOJO, NME, NOISEY, NPR, PASTE, PFORK, POPMAT, Q, QUIETUS, RSTONE, RTRADE, SKINNY, SPIN, SPUTRATINGS, SPUTUSERS, STAFF, STRGUM, TMT, UNCUT, UPROXX, VNLFCT, WIRE)
So right off the bat, given the consensus scoring, how would the consensus rankings have differed if the sputnik staff list, the user list, and the sputnik user-usage adjusted rankings were included?
new_rank | orig_rank | change | ALBUM |
---|---|---|---|
1 | 1 | 0 | kendrick lamar – damn. |
2 | 3 | +1 | lorde – melodrama |
3 | 2 | -1 | sza – ctrl |
4 | 4 | 0 | sampha – process |
5 | 5 | 0 | st. vincent – masseduction |
6 | 8 | +2 | vince staples – big fish theory |
7 | 6 | -1 | lcd soundsystem – american dream |
8 | 13 | +5 | mount eerie – a crow looked at me |
9 | 14 | +5 | the national – sleep well beast |
10 | 12 | +2 | the war on drugs – a deeper understanding |
11 | 7 | -4 | kelela – take me apart |
12 | 10 | -2 | perfume genius – no shape |
13 | 9 | -4 | jlin – black origami |
14 | 17 | +3 | king krule – the ooz |
15 | 20 | +5 | tyler, the creator – flower boy |
16 | 15 | -1 | fever ray – plunge |
17 | 11 | -6 | thundercat – drunk |
18 | 21 | +3 | slowdive – slowdive |
19 | 16 | -3 | jay-z – 4:44 |
20 | 18 | -2 | big thief – capacity |
21 | 19 | -2 | father john misty – pure comedy |
22 | 34 | +12 | julien baker – turn out the lights |
23 | 22 | -1 | kaitlyn aurelia smith – the kid |
24 | 27 | +3 | arca – arca |
24 | 23 | -1 | migos – culture |
26 | 32 | +6 | paramore – after laughter |
27 | 42 | +15 | fleet foxes – crack-up |
28 | 24 | -4 | julie byrne – not even happiness |
29 | 25 | -4 | bjork – utopia |
30 | 26 | -4 | courtney barnett & kurt vile – lotta sea lice |
31 | 28 | -3 | moses sumney – aromanticism |
32 | 29 | -3 | jane weaver – modern kosmology |
33 | 30 | -3 | hurray for the riff raff – the navigator |
33 | 30 | -3 | j hus – common sense |
35 | 33 | -2 | richard dawson – peasant |
36 | 35 | -1 | stormzy – gang signs & prayer |
37 | 36 | -1 | kelly lee owens – kelly lee owens |
38 | 43 | +5 | queens of the stone age – villains |
39 | 37 | -2 | sheer mag – need to feel your love |
40 | 38 | -2 | jason isbell & the 400 unit – the nashville sound |
41 | 39 | -2 | japanese breakfast – soft sounds from another planet |
42 | 40 | -2 | lana del rey – lust for life |
43 | 52 | +9 | run the jewels – run the jewels 3 |
44 | 41 | -3 | priests – nothing feels natural |
45 | 44 | -1 | aldous harding – party |
45 | 79 | +34 | brockhampton – saturation ii |
47 | 50 | +3 | alvvays – antisocialites |
47 | 45 | -2 | future – hndrxx |
49 | 46 | -3 | (sandy) alex g – rocket |
50 | 47 | -3 | drake – more life |
Adding the three sputnik-based rankings would have shot Fleet Foxes and Julien Baker up by more than 10 ranks, put The National and Mount Eerie into the top 10, and even brought Run the Jewels and Brockhampton into the top 50 (assuming RTJ even counts).
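The rank-change table above can be produced mechanically: recompute the consensus score with the extra columns joined in, rank both ways, and take the difference. A minimal sketch with invented toy data (the album names and ranks are illustrative, not the real spreadsheet values):

```python
import numpy as np
import pandas as pd

# Toy original rank matrix and the sputnik columns to add (illustrative data).
orig = pd.DataFrame(
    {"PFORK": [1, 2, np.nan], "NPR": [2, 1, 3]},
    index=["album a", "album b", "album c"],
)
sput = pd.DataFrame(
    {"STAFF": [np.nan, np.nan, 1], "SPUTUSERS": [3, np.nan, 1]},
    index=orig.index,
)

def cons_rank(df):
    # Rank albums by consensus score (non-appearance = rank 75); lower is better.
    return df.fillna(75).mean(axis=1).rank(method="min").astype(int)

orig_rank = cons_rank(orig)
new_rank = cons_rank(orig.join(sput))
change = orig_rank - new_rank  # positive = album moved up after adding the sputnik lists
```

In this toy example, "album c" is strongly favored by the added sputnik columns, so it jumps ahead of the albums that previously led the consensus.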
Another question I had regarding this data was how similar each publication’s rankings were to the others’. The following is a plot of the matrix of similarities between each of the publications’ lists, as measured by cosine similarity (the similarity metric I used for the musical neighbors post).
In this plot, the stronger the blue, the more similar the publications’ lists are. Note that the top left blocks are greyed out because they are redundant with the comparisons in the bottom half of the plot. The plot is pretty, but also busy, so what are the top 20 most similar lists in a much more palatable format?
List1 | List2 | Cosine_sim |
---|---|---|
PFORK | CONSENSUS | 0.499 |
SKINNY | CONSENSUS | 0.426 |
SPIN | PFORK | 0.394 |
SPUTRATINGS | SPUTUSERS | 0.393 |
Q | NME | 0.371 |
SKINNY | POPMAT | 0.352 |
POPMAT | CONSENSUS | 0.351 |
NME | CONSENSUS | 0.338 |
SPIN | CONSENSUS | 0.316 |
UPROXX | CONSENSUS | 0.315 |
STRGUM | CONSENSUS | 0.304 |
POPMAT | PFORK | 0.302 |
STRGUM | CoS | 0.298 |
UPROXX | PFORK | 0.287 |
WIRE | FACT | 0.285 |
CRACK | CONSENSUS | 0.279 |
SPIN | POPMAT | 0.271 |
FACT | CRACK | 0.269 |
UPROXX | NME | 0.264 |
POPMAT | LINE | 0.261 |
The consensus ranking (CONSENSUS) comes up quite a bit in this top 20, which makes sense given that it is aggregated from all the other lists found on the AOTY spreadsheet. The Pitchfork (PFORK) list is the publication most similar to the consensus score, and Pitchfork and Spin Magazine (SPIN) are the most similar pair of publications. Unsurprisingly, the sputnik user list (SPUTUSERS) is most similar to the sputnik user-usage adjusted rankings (SPUTRATINGS), and (though not represented above) the sputnik staff list (STAFF) is most similar to the SPUTUSERS and SPUTRATINGS lists, respectively, followed by The Skinny (SKINNY) and the Consequence of Sound (CoS) lists.
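For the curious, here is one plausible way to compute these pairwise similarities: treat each list as a vector over the union of albums, scoring a rank r as 51 - r (so a #1 placement is worth 50 points) and an absence as 0, then take cosine similarity between column vectors. The scoring scheme and the toy data below are assumptions for illustration, not necessarily what the post actually used:

```python
import numpy as np
import pandas as pd

# Toy rank matrix: each column is one publication's list (NaN = album absent).
ranks = pd.DataFrame(
    {"PFORK": [1, 2, np.nan], "SPIN": [2, 1, np.nan], "WIRE": [np.nan, np.nan, 1]},
    index=["a", "b", "c"],
)
# Convert ranks to scores: rank 1 -> 50, rank 50 -> 1, absent -> 0.
scores = (51 - ranks).fillna(0)

def cosine_sim(u, v):
    # Cosine of the angle between two score vectors.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Full pairwise similarity matrix between publications.
sim = pd.DataFrame(
    {c1: {c2: cosine_sim(scores[c1].values, scores[c2].values) for c2 in scores}
     for c1 in scores}
)
```

Two lists that rank the same albums near the same positions point in nearly the same direction and score close to 1, while lists with no albums in common score 0, which is why the values in the table above top out well below 1.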
Code for this post, including data, can be found here. I may do another post on this data, and/or the data of previous years. Comment if you have any ideas.
01.16.18