Hello voters, and welcome to a post about a long-running music poll. The Pazz & Jop Poll is a year-end Album of the Year list scored by votes tabulated from an assortment of elite music critics. In fact, the most elite music critic (outside of PSH’s character in Almost Famous, I guess), Robert Christgau, created it and has voted in every instance since its inception — even after he was fired from the Village Voice.
About its scoring methodology: without realizing it, I essentially copied the Pazz & Jop scoring system for our very own end-of-the-year user list. Each voter gets 100 points to allot across ten albums; the most an album can be given is 30 points and the minimum is 5 points (which is where P&J diverges from the system I implemented for the user vote, since our minimum was 1 point). For the best-song portion, each voter picks 10 songs, and the count of mentions decides the ranking (much like our Staff year-end song list was scored… or was it?). So, voters of Sputnik’s year-end poll, you voted just like all the titans of music criticism you looked up to your whole lives. But how similarly to professional music critics did you guys/gals vote?
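For the curious, the album-ballot rules above boil down to a simple validity check. Here’s a minimal sketch in Python (the function name and defaults are mine, not code from either poll; the 5-point minimum is the P&J version of the rules):

```python
def valid_pj_ballot(points, n_albums=10, total=100, min_pts=5, max_pts=30):
    """Check a ballot against the P&J rules: ten albums, 100 points total,
    each album allotted between 5 and 30 points."""
    return (
        len(points) == n_albums
        and sum(points) == total
        and all(min_pts <= p <= max_pts for p in points)
    )

print(valid_pj_ballot([10] * 10))                             # True
print(valid_pj_ballot([30, 25, 10, 10, 10, 5, 5, 1, 2, 2]))   # False: 1- and 2-point votes
```

For the Sputnik variant, you’d just pass `min_pts=1`.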
From 2008 to 2016, Glenn McDonald oversaw the scoring and other analytics for the P&J poll. On his site, he hosts each voter’s ballot, the total and individual scores for each album, artist, and song, and an assortment of other analytics derived from the poll data. The derived metrics are described in the site’s sidebar; one of my favorites is metalism, the percent of a voter’s points that went to metal albums (we need an electronica-ism metric so those votes can feel nice and properly excluded, too). He also wrote companion pieces about the data for each year’s poll (here, for instance). (He’s not a bad creative writer himself, either.) Unfortunately, when I contacted McDonald about this post, he informed me that he is not hosting the P&J data for this year and possibly beyond. Damn you, Village Voice, and your 20 readers for depriving us of DATA!
For this post, I scraped the score data for all the voters of the 2016 list to compare to our own 2017 end-of-year list. I used the 2016 data because the 2017 data, as mentioned, is unavailable. I wrote code to scrape Glenn’s website, but, at his request, I’m not going to publish it (though it was finely written, something I can claim freely since it will never see the light of day).
The 2016 P&J data comprises 532 voters who completed full ballots (10 albums and 100 points assigned). The Sputnik poll data comprises 178 voters who completed full ballots (10 albums, 100 points assigned, cleaned by Jom). For the following analysis, I removed voters who broke the rules of the polls, including two Sputnik voters who just barely snuck past Jom’s watchful, sensitive eyes. The script for this analysis and CSVs of the data are available on GitHub here. Note that while I did initially scrape voter names and the albums they voted for (because I totally can and am a card-carrying BBB Big Baller as a result), I dropped them from the P&J data I published on GitHub as a concession to Glenn, since he wasn’t so down with my scraping endeavor.
So for starters, how similar were the point allotments by Sputnik voters versus the P&J voters? Well, pretty similar.
Both P&J voters and Sputnik voters love the base 5 values: 10-point allotments are the most common, 5-point allotments a close second, with large drop-offs to the 15-, 20-, 25-, and 30-point allotments. P&J voters were, however, much more constrained in what points they could assign, which may be why they favored 10-point allotments so much more than the Sputnik voters did (nearly 50% of all P&J allotments were 10 points, versus ~25% for the Sputnik voters).
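If you want to reproduce that comparison from the raw ballots, the distribution is just a tally over every individual allotment. A sketch (with toy ballots standing in for the real data):

```python
from collections import Counter

def allotment_distribution(ballots):
    """Share of all individual point allotments at each point value,
    pooled across every ballot."""
    counts = Counter(p for ballot in ballots for p in ballot)
    total = sum(counts.values())
    return {value: n / total for value, n in sorted(counts.items())}

# Toy ballots, not the real poll data: two flat all-10 ballots and one ranked one.
ballots = [[10] * 10, [10] * 10, [30, 15, 10, 10, 10, 5, 5, 5, 5, 5]]
dist = allotment_distribution(ballots)
print(dist[10])  # share of allotments that were exactly 10 points
```

Run that per poll and you get the histogram-ready numbers behind the “nearly 50% vs. ~25%” comparison above.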
A pattern I noticed when tabulating our own poll is that people often fell into one of two camps with their scores: all 10’s, or all base 5 numbers. Math makes our heads hurt, and assigning points to albums to judge them objectively is obviously silly (but please continue to do it in the future), so a common tactic seemed to be either not bothering to finely judge how much one album deserves over another (giving all 10’s), or just giving points that are easy to add up since this is all arbitrary anyways (base 5’s). So how many voters in each poll fell into one of those two categories (all10s and base5, respectively)? While we’re at it, how many people tried to be creative and assign an actual rank order to their albums by giving a unique point value to each (all_unique), and how many used no base 5 points at all (no_base5)?
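Those four categories are easy to pin down in code. A sketch of the kind of classifier I mean (category names follow the labels above; this is illustrative, not the actual analysis script):

```python
def classify_ballot(points):
    """Tag a 10-album ballot with the categories described above."""
    tags = []
    if all(p == 10 for p in points):
        tags.append("all10s")
    if all(p % 5 == 0 for p in points):
        tags.append("base5")        # every allotment is a multiple of 5
    if len(set(points)) == len(points):
        tags.append("all_unique")   # a strict rank order: no ties at all
    if not any(p % 5 == 0 for p in points):
        tags.append("no_base5")
    return tags

print(classify_ballot([10] * 10))                            # ['all10s', 'base5']
print(classify_ballot([16, 14, 13, 12, 11, 9, 8, 7, 6, 4]))  # ['all_unique', 'no_base5']
```

Note that the categories overlap: an all-10’s ballot is also a base 5 ballot (10 is a multiple of 5), so you have to decide whether such ballots count in both buckets or only the stricter one.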
The P&J voters had substantially more “all 10” ballots than the Sputnik voters, but about the same relative amount of everything else. Did the crushing knowledge that hundreds of people were going to vote dissuade the P&J voters from creativity? Did they think that the act of assigning an album a 16, for instance, would dissolve into the ether as an unimportant rounding error? Were they jaded by years of filling out ballots only to see their votes for all those jazz fusion albums lead to last-place finishes anyways? Were they protesting their slow march towards death by spending as little time as possible on their ballots? Glenn himself was not one for giving boring 10’s. Grand Poobah Bobby Christgau wasn’t a boring scorer either.