Sputnikmusic



Hello fellow football fans, and welcome to a post that will assign numbers to your footy feelings. On the eve of the 2018 World Cup, I noticed that no one had made a “World Cup Thread”-type list, so I decided to start one. At some point, I realized that I could leverage the comments people were making into some sweet, sweet content. Specifically, I sought to measure the sentiment of each comment (positive or negative), which I could then summarize by World Cup team and by user.

Recently, I was googling sentiment analysis and came upon this post. The post describes, and includes code for, a model that uses the words of tweets to predict the sentiment of each tweet (a sentiment of 1 being positive, 0 negative). The post is from about a year ago and uses tweets as training data rather than sputnik comments, so it may not exactly match the vocabulary and sincerity level of our own sputnik soccer commenters. Regardless, I fit the sentiment model from the code in that post, scraped the comments from the World Cup 2018 list following the conclusion of the group stages, and then applied the model to each comment. The model assigns each comment a value between 0 and 1, 1 being positive and 0 being negative. Most comments lie somewhere in the middle: ~75% of comments fall between .25 and .75, and ~92% between .1 and .9.
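The linked post's exact model is more involved than this, but the core word-based idea can be sketched with a tiny Naive Bayes classifier. Everything below is illustrative: the training comments are made up (the real model trains on a large labeled tweet corpus), and the linked post may use different features and a different classifier.

```python
import math
from collections import Counter

# Toy labeled data (made up; the linked post trains on labeled tweets).
train = [
    ("what a great match brilliant goal", 1),   # 1 = positive
    ("love this team amazing performance", 1),
    ("terrible defending awful result", 0),     # 0 = negative
    ("boring game dreadful finishing", 0),
]

# Word counts per class, for a Naive Bayes model.
counts = {0: Counter(), 1: Counter()}
for text, label in train:
    counts[label].update(text.split())

totals = {c: sum(counts[c].values()) for c in counts}
vocab = set(counts[0]) | set(counts[1])

def sentiment(text):
    """Return a 0-to-1 sentiment score: P(positive | words of text)."""
    logp = {c: math.log(0.5) for c in (0, 1)}  # uniform class prior
    for word in text.split():
        for c in (0, 1):
            # Laplace smoothing keeps unseen words from zeroing the score.
            p = (counts[c][word] + 1) / (totals[c] + len(vocab))
            logp[c] += math.log(p)
    return 1 / (1 + math.exp(logp[0] - logp[1]))

print(round(sentiment("brilliant goal great team"), 2))
```

Comments mixing positive and negative words land near the middle of the 0-to-1 range, which is the same bunching between .25 and .75 described above.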

So, after classifying every comment with that model, I searched the…


Hello fellow metalheads and welcome to a post that will put the division symbol in m/. Metal music is not unlike a cult or gang, and any self-respecting gang has its own hand sign. For metal, it has been the metal horns. When you point at someone, you point three fingers back at yourself; but when you throw metal horns at someone, you point two fingers to the ground, two to the sky, and your thumb to the side, all while flashing them with the symbol much of Europe uses for being cuckolded and the hand sign of about a dozen colleges. When metal fans post online, that symbol looks like this: m/. Or this: \m/. Or, according to wikipedia, this: \../ or /../.

The humble m/ can be taken as an endorsement of the quality or “metalness” of a piece of metal music; how often an m/ is associated with something is as good a proxy as any for this quality. So what if some brave soul counted these m/’s and disseminated this information to the world?
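Counting the variants listed above is a one-regex job. Here's a sketch (the comments are invented, and the pattern and helper name are mine, not from any site code):

```python
import re

# Match the horn variants mentioned above: m/, \m/, \../, /../
# The lookbehind keeps "them/" or the m/ inside \m/ from counting twice.
HORNS = re.compile(r"(?<![\w\\])\\?m/|[\\/]\.\./")

def count_horns(text):
    """Count occurrences of m/ and its variants in a comment."""
    return len(HORNS.findall(text))

comments = [
    "absolute classic m/",
    r"\m/ \m/ riffs for days",
    "no horns here",
]
print(sum(count_horns(c) for c in comments))  # → 3
```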

For this post, I went back to the album data from my Top 250 Users app, and scraped comment data from the flagged reviews of 593 albums: those that had any metal genre tag, an average rating above 4.0, and more than 200 ratings. (All the data used to select albums was collected around November of 2016, so this sample is missing albums that have come out since, or that might have reached those criteria by now.)…

Hello fellow once-a-year-birthday-havers and welcome to a post that will answer how old people on this website are (or at least those that willingly posted their ages on a sputnik list). So, I surveyed you all to give me more ideas for Statnik, and many of you suggested things I had already done. No worries, the statnik/macman76 canon is long, and it bends toward being sometimes overly detailed and boring and, thus, easy to forget. One thing many of you did not suggest was letting me analyze data you had already collected.

In stepped our hero, Dewinged. He stalked my request list, waiting for the right time to spring on me… that he had asked the sputnik userbase what their birthdays were. So, I scraped the data from his two lists and will definitively answer once and for all what the average age for those 129 users is. And what the median is. And the range. And the standard deviation.

(Data cleaning note: Dewinged seemed unsure of some of the birthdays, but I used whatever was listed; I threw out one entry because it involved googling a date, and added the 15th as the day for another entry that only listed a month and year.)

measure   value
mean      24.75 years
median    23.59 years
range     33.86 years
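All four measures (including the standard deviation promised above) are one-liners with python's standard library. The ages below are made up for illustration; they are not the 129 birthdays from Dewinged's lists.

```python
import statistics

# Hypothetical ages in years, stand-ins for the scraped birthday data.
ages = [19.2, 21.5, 23.4, 23.8, 26.1, 31.0, 44.7]

mean = statistics.mean(ages)
median = statistics.median(ages)
age_range = max(ages) - min(ages)
stdev = statistics.stdev(ages)  # sample standard deviation

print(f"mean {mean:.2f}, median {median:.2f}, "
      f"range {age_range:.2f}, sd {stdev:.2f}")
```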


Hello, fellow numbers in a weighted average, and welcome to an investigation of how much weight this site really has. It may come as a shock to you, or perhaps you’ve never even considered it, but metacritic collects the ratings from the staff reviews of this very website to make their average scores. Specifically, they take the rating from each review and scale it to 100 by multiplying it by 20 (i.e. a 4.6 review becomes a 92 on metacritic). Then, for each album with more than 4 scores, they calculate a weighted average. Weights for each publication are assigned “based on their quality and overall stature”, and these weights are not revealed to the public. They include this question in their FAQ:


Absolutely not.

I’m fond of weighted averages. For instance, my user-usage adjusted means are weighted averages. They are simple both mathematically and conceptually. “Imagine if you got 2 votes and everyone else had 1.” Boom. Weighted average. “Imagine if you got a vote proportional to your wealth.” Boom. Politics around the globe (and a weighted average). The problem with weighted averages (relative to, say, statistical models) is that assigning weights is an arbitrary exercise. For my user-usage adjusted average ratings, I assign a weight of 3 to the count of a user’s reviews, 2 to lists, and 1 to comments.
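That 3/2/1 weighting can be sketched in a few lines (I normally do this sort of thing in R, but a python sketch works just as well; the ratings and activity counts are invented):

```python
# A user's weight is 3 * reviews + 2 * lists + 1 * comments, per the
# scheme described above. The users and their numbers are made up.
users = [
    # (rating, reviews, lists, comments)
    (4.5, 10, 4, 200),
    (3.0,  0, 1,  15),
    (5.0,  2, 0,  40),
]

def usage_weight(reviews, lists, comments):
    return 3 * reviews + 2 * lists + 1 * comments

total_weight = sum(usage_weight(r, l, c) for _, r, l, c in users)
weighted_mean = sum(
    rating * usage_weight(r, l, c) for rating, r, l, c in users
) / total_weight

plain_mean = sum(rating for rating, *_ in users) / len(users)
print(f"plain mean {plain_mean:.2f}, weighted mean {weighted_mean:.2f}")
```

Here the heavily active 4.5 voter pulls the weighted mean above the plain mean.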


Hello voters, and welcome to a post about a long running music poll. The Pazz & Jop Poll is a year end Album of the Year list scored by votes tabulated from an assortment of elite music critics. In fact, the most elite music critic (outside of PSH’s character in Almost Famous, I guess), Robert Christgau, created it and has voted in it for every instance since its inception — even after he was fired from the Village Voice.

About its scoring methodology… without realizing it, I essentially copied the Pazz & Jop scoring system for our very own end of the year user list. Each voter gets 100 points to allot to ten albums; the most an album can be given is 30 points and the least is 5 points (which is where P&J diverges from the scoring system I implemented for the user vote, since our minimum was 1 point). For the best song portion, each voter picks 10 songs, and the count of mentions decides the ranking (much like our Staff year end song list was scored… or was it?). So, voters of sputnik’s year end poll, you voted just like all the titans of music criticism you looked up to your whole lives, but how similarly to professional music critics did you guys/gals vote?
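The point rules above amount to a simple ballot check. A sketch (a hypothetical helper, not P&J's or sputnik's actual tooling):

```python
def valid_ballot(points, min_pts=5, max_pts=30, total=100, n_albums=10):
    """Check a Pazz & Jop-style album ballot: ten albums, 100 points
    total, each album given between min_pts and max_pts. Sputnik's user
    vote used the same rules with min_pts=1."""
    return (
        len(points) == n_albums
        and sum(points) == total
        and all(min_pts <= p <= max_pts for p in points)
    )

print(valid_ballot([30, 15, 10, 10, 10, 5, 5, 5, 5, 5]))  # → True
```

A ballot like `[30, 30, 30, 1, 1, 1, 1, 2, 2, 2]` passes under sputnik's `min_pts=1` rules but fails under P&J's.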

From 2008 to 2016, Glenn McDonald oversaw the scoring and other analytics for the P&J poll. On his site, he has all the data of each…


Hello fellow aggregators, and welcome to a celebration of teamwork with data. Rob Mitchum is one such aggregator, and for the past 5 years he has been putting together a yearly spreadsheet of all the top 50 AOTY lists released by reputable publications. For this post I did something I don’t normally do, and that’s ask permission to use data: specifically, that spreadsheet.

More information can be found on his page, but, essentially, Rob finds all the lists he deems worthy and hand-enters their ranks into a google sheet. From there, he calculates the average rank for each album (the AVG column), the average rank divided by the number of appearances (the WT-AVG column), and a consensus score (the CONS SCORE column), which is the average rank across all the lists with every non-appearance counted as a rank of 75 (an album is thus penalized for every publication’s ranking it doesn’t appear on). The albums are sorted by that final consensus score, and as of January 11, 2018 (when I downloaded the data) Kendrick’s Damn. stood at number 1.
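The consensus score described above reduces to a short function. A sketch with made-up ranks (Rob's sheet computes this across every list he tracks):

```python
def consensus_score(ranks, n_lists, penalty_rank=75):
    """Average rank across n_lists lists, counting penalty_rank for each
    list the album doesn't appear on (per the CONS SCORE description)."""
    missing = n_lists - len(ranks)
    return (sum(ranks) + penalty_rank * missing) / n_lists

# An album ranked 1, 3, and 2 on three of five tracked lists:
print(consensus_score([1, 3, 2], n_lists=5))  # (1+3+2 + 75*2) / 5 = 31.2
```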

This very site right here released its own set of end of year album lists, so I thought it would be cool to compare our own lists to the individual publications’ rankings and the consensus score rankings. To accomplish this, I grabbed the staff top 50 list, the user-voted top 50 list, and the rating-based user-usage adjusted top 50 from my year-end rankings post. It wasn’t entirely seamless; I had to…


Hello budding data viz lovers, and welcome to a post that will put the division in Joy Division. When it’s time to explain patterns and numbers to laymen, it’s usually done with the help of a graphic of some sort. Why? Because numbers are abstract, and visual aids are useful tools to make the abstract concrete. But perhaps, too, it has another quality.

A great genre of argument is the “this seemingly boring and mundane activity is actually an art and full of wonders”. It’s a hit because it’s always true. We humans will assign meaning to anything we spend any amount of time doing. Anything and everything has some great novel/film/etc. about it and an r/ page full of memes. So data and statistics are no different. There’s r/DataIsBeautiful, some Neal Stephenson novels, and Moneyball. Without rethinking baseball management, scouting, and talent evaluation via the aid of analytics, how else would Brad Pitt have reconnected with his daughter (or gained the pioneer/icon status that has let him keep his job for so long despite little success)?

Data visualization bridges the gap from something that is obviously artistic, making visual images, to something that is only an art if you explain it, working with data. Thus, data visualization is an art in and of itself, and it’s something that is treated with care and respect in order to enrich the mundane into something full of meaning and import. And since people that work in data visualization can be…


Hello lovers of album art, and welcome to a quick post dedicated to distorting it. Using a model (deepdream) by our favorite overlords, google, and code from a really great tutorial, I have generated some (what I guess I will call) “surrealed” images. Enjoy.

Generally I work in R for the stuff I do here, but in this case I used another, even more popular computing language: python. Python is a much more general-purpose language than R (it currently powers reddit as well as a whole host of other websites and projects). Like R, it is free; unlike R, google wrote their “deepdream” code in python and built their model to be used from python.

(Install instructions: To install python, I recommend downloading Anaconda with this tutorial as a guide. It installs python, as well as many popular packages, and gives you an easy way to install more. Unfortunately, it is a fairly large download (>300 MB). After installing, you will need a few additional packages. You can install most, and probably all, of them if you open a “command prompt” and type this in the command line:

conda install tensorflow beautifulsoup4 pillow requests numpy matplotlib

Many of those will already be part of Anaconda, but run the line just in case. (Note that io and urllib ship with python’s standard library, so they don’t need installing.) Once Anaconda is installed, you can run the IDE packaged with it, spyder, and run the code from my script, found here.)

My relationship with python is much like…


Hello staff/non-staff users and welcome to the first of these posts that will analyze not how sputnik users rate albums, but what they write about them. Reviews are the focus of this site. User-written reviews and their prominence are what distinguish this website from other music database websites. Somehow/someway, the reviews on here by the most deft among us were deemed prestigious enough to sometimes be included in the Reception/Release sections of album pages on Wikipedia and aggregated by metacritic. The promise of sputnik, and that of the country of its weary servers, is that if you write enough reviews and they are good enough, you can be a god among men/(some)women and have a spiffy tag next to your name.

Text analysis has been in the news a lot recently, so I thought I would throw my very amateur hat into the ring. To start, I wrote code to scrape the review text from every user listed on the staff page, including contributors, staff writers, and emeritus users. Metadata from the review pages, including the number of views and comments, the date of the review, and the ratings of the album, were also scraped. If you would like to do it yourself, go into R, copy the following into the console, and hit enter to install the necessary packages:

install.packages(c("tidyverse", "XML", "rvest", "tidytext", "wordcloud", "stringr", "babynames"))

Code to do the scrape is here. If you want to scrape only certain users, change the “do_staff <- TRUE” line to “do_staff <-


Hello users, and welcome to a blogpost detailing a tool to help you rig the sputnik ratings in your favor. Probably a third of users’ comments are related to how much they dislike the average ratings of albums (verified fact, obviously). Some albums’ average ratings are too high or too low, there are not enough 5’s or 1’s, or the count of ratings is too low. Your favorite album may have missed the year end chart; meanwhile, that album that you (and, really, only you) hated was near the top.

Numbers are often used to set incentives in our modern/global/capitalistic society. We do or don’t gain admission to the colleges we want to go to based on our test scores and grades; we get fired or retain our jobs based on benchmarks set by our bosses; and we make tens of millions of dollars or more by breaking the home run record (and, a few seasons later, follow that up with 120+ intentional walks). But numbers used for evaluation are, generally speaking, adjacent to what they intend to measure. It’s not exactly the case that getting a good grade is the same as usefully retaining course material, or that closing a lot of sales indicates one is worthy of employment. (But yes, if you hit 73 homers, you are good at baseball.)

“When a measure becomes a target, it ceases to be a good measure.” (Goodhart’s Law). Sometimes, when it suits us, we humans tend to manipulate a measure for our own gain –…

Hello everybody, and welcome to the 2017 charts… sort of. I scraped every album listed in the genre pages of the 2017 charts on August 10, 2017, and calculated the average ratings, the user-usage weighted averages, and their rankings. The user-usage weighted averages are described here (idea for the algorithm here); they are weighted averages of the ratings in which users with more sputnik comments/ratings/lists/reviews get more weight, plus an adjustment so that the count of ratings raises or lowers an album’s ranking (specifically, the lower bound of a 95% credible interval: if two albums have an equivalent average rating, the one with the higher count of ratings is ranked higher). Only albums with 20 or more ratings were included, and the top 200 albums are below.
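The exact credible-interval formula isn't spelled out here, so as a stand-in, here is a normal-approximation lower bound on the mean rating, mean - 1.96 * sd / sqrt(n). It's an assumption chosen only to show the ranking behavior (equal averages, more ratings, higher lower bound), not the actual UserUsage calculation:

```python
import math
import statistics

def lower_bound(ratings, z=1.96):
    """Approximate lower 95% bound on the mean rating; a stand-in for
    the Bayesian credible interval used in the real rankings."""
    n = len(ratings)
    sd = statistics.stdev(ratings)  # sample standard deviation
    return statistics.mean(ratings) - z * sd / math.sqrt(n)

few = [4.0, 4.5, 3.5, 4.5, 3.5]        # mean 4.0, n = 5
many = [4.0, 4.5, 3.5, 4.5, 3.5] * 6   # same mean 4.0, n = 30

print(lower_bound(few) < lower_bound(many))  # → True
```

Two albums with identical averages get different lower bounds: the one rated 30 times ranks above the one rated 5 times.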

(I updated the rankings to include the users with the most weight; the top 5 users are listed in order of most weight, and the percentage next to each user is the share of the vote they have in the UserUsageMean. For context, divide 1 by the number of ratings of each album, and that’s the weight you had before. For Converge, for instance, just by voting you would have ~0.45% of the vote; but our boy, The Ageless Wonder Rob Lowe, has ~6 times more weight than he had before. That’s LITERALLY very cool.)

*Updated as of 9/1/2017

Rank: 1 (Previous: NA), UserUsageRank: 25, Der Weg Einer Freiheit – Finisterre
Number of Ratings: 24, Mean: 4.30, UserUsageMean:…


Hello comrades, and welcome to sputnikmusic, the music vertical for the Russian propaganda news site sputniknews.com. Today we will cover a long-lost feature of sputnik: musical neighbors. First, anybody that visits this site loves music. Sometimes users come onto this site to celebrate their favorite artists and the albums of theirs that they love. Sometimes they come to trash someone else’s favorite artists and albums, but also because they love music. It takes a special love to spend one’s time and mental effort listening to music one knows one won’t like, and a special love to come up with a long string of sick one-liner put-downs. We share this love with each other, as fellow chums, in all its forms; but it is true that our love of music can align more strongly with an exclusive subset of individual users: our musical neighbors.

From the list I did a while ago on this topic, I gather that musical neighbors was a fun, well-regarded feature of the site, even though no one knew what the hell it was or how the hell it worked. (I recall being told that almost everybody was connected to one particular user.) It’s obvious why it was so popular: it made us feel connected to others and bigger than ourselves. It made us feel connected to the whole of sputnikmusic, the music vertical for the Russian propaganda news site sputniknews.com. With this post, I will return that shared love back…


Hello world, and welcome to a first-of-its-kind staff blog, one written by someone with no reviews and pedestrian/almost non-existent music taste. I joined the site when I was trying to find something to fall deeply into, and I thought being the only person I knew that liked Led Zeppelin meant that I could become a SERIOUS music listener. Of course, I failed and, besides a real weak stream of bands I like, I don’t listen to much music. As a result, I stopped regularly visiting the site after maybe 6 months of being a regular commenting member. Nevertheless, I returned to the site because I found a different interest.

Now, I’m not an expert in this kind of stuff. I didn’t actually get a degree in the kind of thing that would make one said expert. More than anything, I’m a diligent and creative googler, which is the level at which you can fake expertise. I like data. Since I decided I liked data, I have done SERIOUS data guy stuff. I started to keep track of my stats in video games like COD: BO and Rocket League, and I analyzed this here website. (That’s it. There’s not really a third thing. I tried making a tool to help me with a fantasy football draft once, but people were drafting so fast it was probably more costly than useful.)

So, as best I could tell, these blogs will go something like this: I’ll write some kind of description of…



Site Copyright 2005-2023 Sputnikmusic.com