Tuesday, January 25, 2011

Metacritic Megafails

A blog on why the critical review aggregator Metacritic fails the music world, and why it needs to step up to the plate.


It can be difficult to make a conscious decision whether or not you want to buy an album. (The key word here is BUY, so this blog is obviously not targeted at those who love to take whatever they can get their greasy little paws on and assign their own price of free.) One of the biggest tools to aid in this challenge is a review. Publications pay writers to give their two cents about a specific album. But a single review is subjective and narrow -- it may not cater to your specific tastes and reasons for liking an album. Just because some guy at Rolling Stone savages an album by your favorite band doesn't necessarily mean you're going to hate it. If only there were a place that collected reviews from the masses so you could see what dozens of people had to say about an album...

That's where review aggregators come into play. They take reviews from a wide range of sources and compile them into a standardized score. If the majority of critics have nothing but lovely things to say, the aggregator will reflect this, and the same is true if all the critics bash an album. Video games have GameRankings and films have Rotten Tomatoes, but even though music predates these other forms of entertainment, Metacritic is the first aggregator to include albums in its repertoire.

So yay, everything should be sunshine and kittens now that someone is paying attention to music, right? Way ass wrong, actually. We're almost better off with NO album review aggregator.

Metacritic does not cover every album released in a year, and the ones it does include are rarely represented in a fair light, especially in the case of indie music and heavy metal. The people who run the site hand-select which albums it will feature, which generally means every album from a platinum-selling artist plus a bunch of other seemingly random picks. Once an album is selected, Metacritic will gather about 12 reviews on average to determine how well it was received. How on earth are 12 reviews a representative sample of what the critical masses think about a specific album? What's worse is that Metacritic deems a measly four reviews to be enough. So if it happens to find four of the most positive reviews available, an album that would only look average in a more neutral light gets crowned king of the music world. And what about publications that don't give a rating, just a long wordy review? Well, Metacritic assigns its own value based on the "general impression".
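To make the sample-size complaint concrete, here's a quick sketch (every score below is invented for illustration, not real Metacritic data) of how four cherry-picked raves stack up against a dozen mixed reviews of the same hypothetical album:

```python
# All scores below are invented for illustration; none are real Metacritic data.

def metascore(reviews):
    """Plain unweighted average, rounded to a whole number like a Metascore."""
    return round(sum(reviews) / len(reviews))

# Four cherry-picked raves for a hypothetical album...
four_raves = [91, 88, 85, 83]

# ...versus a dozen reviews of the same album, good and bad alike.
twelve_mixed = [91, 88, 85, 83, 75, 72, 70, 68, 65, 60, 55, 50]

print(metascore(four_raves))    # 87 -- lands in "universal acclaim" territory
print(metascore(twelve_mixed))  # 72 -- merely "generally favorable"
```

Same album, wildly different verdict, purely because of which reviews got collected.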

Now let's talk Metacritic errors. At the end of each year, Metacritic compiles the 40 albums with the highest Metascore (excluding reissues and live albums). The list for 2010 can be seen here. The metal world should probably be a little shocked to see Kylesa's Spiral Shadow hit number 14 on this list. But how can this be? There were PLENTY of metal albums that were far better received critically in 2010 than Kylesa. Underoath's Disambiguation, for example, comes to mind. According to Metacritic, it scored three points higher. HOWEVER, it seems the site only starts caring about the number of reviews it found when the year-end list rolls around: only albums with seven or more reviews may be included in the best-of list, and Underoath had six. They couldn't have found one more review just to include it in the list? Hell, even a negative review would have kept it in the top 40. Or would it...

An aggregator is designed to take the average score across all the reviews. So the score for the Underoath album should be (94 + 90 + 80 + 80 + 80 + 80) / 6 = 84, yet Metacritic slapped on a score of 88. What the hell? Well, as it turns out, Metascores are weighted, meaning some publications' reviews count for more or less than their face value. Kinda like that one college class you SWEAR you failed, yet you got a B in the end. Which publications are given more weight? Well, they won't say specifically, but it's pretty much the ones they either like best or consider to have "prestige". Presumably this would include music industry behemoth Rolling Stone and the like.
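For the curious, here's roughly what that weighting trick looks like in code. The weights below are pure guesses on my part (Metacritic keeps the real ones secret), but they show how bumping up a couple of "prestige" outlets drags an honest 84 up to an 88:

```python
# The six scores for the Underoath album, paired with made-up weights.
# Metacritic does not publish its weights; these are guesses for illustration.
reviews = [
    (94, 4.0),  # pretend this came from a "prestige" publication
    (90, 4.0),  # ditto
    (80, 1.0),
    (80, 1.0),
    (80, 1.0),
    (80, 1.0),
]

plain_average = sum(score for score, _ in reviews) / len(reviews)
weighted_average = (
    sum(score * weight for score, weight in reviews)
    / sum(weight for _, weight in reviews)
)

print(round(plain_average))     # 84
print(round(weighted_average))  # 88
```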

In their defense, they do make a very valid point on their website. They note that hundreds of times more albums are released in a given year than books, video games, or movies. They estimate about 30,000 albums come out in a given year, and in 2010 they only covered 657 of them (counting only albums that were eligible for the best-of list: 7+ reviews, no reissues or compilations). How can one site be expected to collect reviews for all of these albums? My big issue, however, isn't with the quantity of albums released, but with the quality that goes into aggregating them. This should be a highly objective, dispassionate task that requires nothing more than entering data into a computer. Simple as that. But when you subjectively select which albums to include, subjectively select which review publications to include, subjectively decide which publications carry more weight, and don't take the time to seek out a fair number of reviews for each album regardless of its status, you ruin everything you set out to accomplish.
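And just to show how little machinery the dispassionate version would need, here's a toy sketch (the albums, outlets, and scores are placeholders I made up) that does nothing but take whatever scored reviews exist and average them, with no hand-picking and no secret weights:

```python
from collections import defaultdict

# Placeholder rows of (album, publication, score) -- all invented.
# In the dispassionate version, every scored review that exists gets a row.
rows = [
    ("Album A", "Publication X", 82),
    ("Album A", "Publication Y", 74),
    ("Album B", "Publication X", 90),
    ("Album B", "Publication Z", 68),
]

scores = defaultdict(list)
for album, _publication, score in rows:
    scores[album].append(score)

# One unweighted average per album, every album included regardless of status.
for album, vals in sorted(scores.items()):
    print(album, round(sum(vals) / len(vals)))
```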

Sure, 30,000 albums is a lot. But certainly there must be some objective way to narrow that list down, AND certainly there must be more publications out there to select from. Correction: actually, it looks like Metacritic now has a pool of almost 100 review publications for albums. Last time I checked there were only like 40ish. So maybe they really are stepping up to the plate after all?

Maybe Metacritic's failures and shortcomings will inspire someone to create a new review aggregator that solely focuses on albums!
