Tuesday, March 31, 2009

Dysconnect, March 2009: Decoding Recoding [reviewing reviewing]

A couple of months ago, music reviewing reached crisis point. I wouldn’t blame you for having missed it – it barely made a sound. Or nobody was listening. Or everyone had their headphones on. And there was so much background noise.

What happened? While the desks of music editors the world over were (are!) overflowing with promo CDs nobody can be bothered listening to, people at home are deleting music… only slightly slower than they’re downloading it. Meanwhile, websites continue to publish so-called reviews: sometimes carefully written, mostly barely skimmed, and, almost invariably now, with only the most tenuous relation between the rating and the reality.

A now infamous example of this is the 9.6 that Pitchfork doled out for Animal Collective’s new album. During a backyard ping-pong match the other week, I asked members of three of Melbourne’s indie bands – the kind of bands who, shall we say, might feasibly get an eight-point-something-or-other on said site – what they thought about the rating. Standing incredulous atop a damp underpile of envy was Mr Disbelief: how, indeed, is it possible for anyone to get a 9.6? Somehow, they agreed, 9.6 was much more than even the ‘five stars’ a top-rated film might receive. The new AC album was good, but… 9.6 had something impossible about it, something parodic. ‘Cept no-one was taking the piss. Pitchfork really meant it… they even thought it was defensible. No, more than that: they thought it was a 9.6.

The reasons for the seemingly inexorable climb toward the banality of near-perfect scores are doubtless more complex than the following, which nonetheless needs re-stating: a) music writers glamouring/clamouring to be the first to call ‘Album of the Year’ (by mid-February, if possible); b) music writers attempting to accumulate social capital by upping the impact factor/hit count for themselves and the site/blog they’re writing for by churning out an attention-grabbing review; c) the kind of aesthetic circle-jerking that transpires when the music in question is only reviewed by people whose cool is intimately bound to loving or hating the recording under review; d) the fact that reviewing is usually the ‘job’ (if, indeed, anyone is getting paid to do it) given to entry-level wannabes who, after struggling and failing to write a measured review that retains some nuance and ambivalence, opt for odes and boost the fuck out of whatever it is; e) the fact that writing a compelling review that nails the recording is actually really, really, really difficult. Especially if the conclusions reached in the course of writing mean that the rating ends up having to be three stars.

Accelerating, undermining or otherwise affecting all the above factors is the datasea: reviewers, drowning in mostly worthless promos, are expected to make watertight judgement calls about recordings they’ve only had time to listen to a few times (because Tuesday is already a week ago, and now there are already nine other new albums vying for your attention). The effect on the receiving end of the datasea (with its daily tsunamis microwaving their way toward a desktop near you) is that most people don’t take the time to read reviews carefully, negating the rewards of investing careful consideration into nuance. So what’s a reviewer to do? Tip, boost, high-five, move on, repeat.

I’m one of those old-fashioned people who believe that a culture of carefully considered, trustworthy reviews is something worth retaining, and I don’t think this has anything to do with the kind of vulgar ‘gatekeepers of high culture’ arguments I’ve heard advanced against the current situation. This is not least of all because there is no High Culture anymore: tastes and audiences have fragmented and multiplied to the nth degree, while there is less than ever before (in terms of cost and distribution) keeping you from accessing and enjoying your preferred peculiarity.

What to do then? If you accept that there is something desperately wrong with the state of music reviews, and you believe that there is a place for key sites and writers to filter and evaluate new releases, then consider the following…

1) Descriptive Previews: these short pieces work on the basis of impression, are primarily illustrative, and presume that the audience hasn’t heard the recording. They are only intended to provide a clear idea about what the recording actually sounds like (and not just to a first-year lit. major), and make no attempt to rate or otherwise evaluate the music in question. Descriptive previewing is a matter of decoding the recording.

2) Critical Reviews: these longer pieces make evaluative judgments about subjective importance or merit. They should place the recording in the broader context in which it was released, and address the general critical reception of the recording by its audience. One of the key functions of the critical review would be corrective, by up- or downgrading the impressions made in earlier previews, correcting ambient hype levels (now that the tipster apparatus has peddled its fixed-gear bicycle to the next cool thing) and rescuing poorly received, unheard or generally ignored releases from obscurity. Critical reviewing is a matter of recoding the recording.

In a way, the above suggestion has got the same whiff of futility as the ‘legislating against capitalism’ rhetoric you see Western politicians engaging in at the moment in order to appear clear, strong, and decisive in the midst of successive spasms of economic crisis that no-one has control over… but like politicians, we tend to stridently overestimate our ability to control things while timidly underestimating our ability to influence things. Being grand, saying a sovereign no, making definitive statements and issuing ultimatums… all completely useless. But they’re all actually much easier than slowing down, tuning in, and trying to give a fuck about what is being said about what has been heard. If only we took the time to write carefully, to read carefully and to foster this culture of warm engagement (as opposed to the dominant culture of cool disengagement), we’d all be making an infinitesimal difference, and that adds up. But not to a 9.6. Maybe a 4? Or a B-? Let me spend some time with it, and I’ll get back to you with something more precise.



  1. it's funny that you mention this now, because i was wondering where all the fives went on RA; i counted only one 5 this year (single or album), though plenty of 4.5s.

    this is why i think numbers or grades on reviews should be done away with completely. whenever i have tried to review something, any number or grade is completely arbitrary. it's the text which is important; the number will always be wrong. i like the reviews on LWE because you have to read it to know what the reviewer is thinking.

    that said, i lauded the new animal collective as one of the best albums i had ever heard at the time of its release, so i shouldn't be talking (it's a damn fine album, but that's a bit much).

  2. @ Minimill:

    Yeah, agreed on A Collective's album... BUT...

    as for RA, I'm partisan and a 'stakeholder' (as they say) so I'll refrain from commenting on that, suffice it to say I think Todd Burns is an outstanding music editor and understands the field he's working in very well, better than most.

    ...for me, Sean Cooper's old reviews of electronica on All Music Guide (the old one, before it started sucking a fuck) were always great, and so reliable. They were great critical reviews; they gave clear, accurate descriptions as well as critical evaluations of the material. I really appreciated that, and I miss it.

    ...also agreed on LWE: my issue is that I'm often trying to filter out EPs, but this is a bad habit. I should be open to everything, but these days I usually have to 'recognise' it a few times before I pay any heed. So many panflash classics...

  3. Stimulating, well-written article PC. I can't help pointing out the 'cool disengagement' that so much music inspires in me though, despite all my attempts at 'warm engagement'. Joris Voorn's Balance mix being a prime example...

    I think the most useful thing a reviewer can do is describe the emotional impressions an album/ track made on him/ her, and why. 'Contextual description' if such a phrase makes sense, and just do away with these ridiculous percentage/ out-of-five rating systems.

    @ minimill:

    Yes, it would appear there's definitely been an executive decision to tone down the 5/5 reviews over at RA - I just see the 4.5's as the old 5's. It's still not terribly useful until I at least read the review.

  4. @ Joe: I interviewed Voorn a few months ago and asked him point blank how he can justify asking consumers to fork out 30+ dollars for a mixed CD given the ubiquity of high quality podcasts, and he said that it was a reflection of the enormous amount of time he put in to making it - six months or so.

    ...but what if people aren't going to listen to it 'for six months', but rather just as they would another podcast...?

    ...maybe we should change the way we record? Maybe recordings should go back to being recordings of live appearances, each time different... isn't there an Ani DiFranco song about that (egad)?

  5. I totally agree here. Sometimes I also have the feeling while reading the reviews, especially on RA, that reviewers don't know what they are talking about.
    It seems that most of them give good marks to tracks/albums that are danceable and bad ones (under 3,5) to tracks/albums that sound old school or analog, or that are not the peak time killer tracks...

    As an example:

    This review is a joke and it's just one in a million.

    But come on, who buys a record here because a reviewer said: "this is great" or "this is bad"?

  6. PC, I agree with your assessment of the Pitchfork review of Animal Collective, but I think I've seen worse. Take today's Skull Disco review... 6.9. Yea ok fine, reviewer thinks the Skull Disco approach is producing diminishing returns and some of the remixes aren't to his/her (Jess?) taste... that's pretty valid even if I disagree. Heck, maybe with the 6.9 he's just trying to sneak a juvenile reference to sexual positions in there (like I just did) or an allusion to the astrological symbol for Cancer for karmic purposes.

    But at least with the AC review you could compare that 9.6 to Pitchfork reviews of other Animal Collective albums. For all the fuss the reviewer makes about past Skull Disco releases, that 6.9 exists in a fucking void: this is the first Skull Disco release they've reviewed! To finally review a label's release just as they are calling it quits only to dismiss it compared to the label's past work is absurd enough. To then numerically imply that it is 2.7 ptchfrkHZ worse than Animal Collective... it doesn't sit well with me.

  7. this is a long one, feel free to ignore it

    1) i just want to point out that while you're right about a 9.6 seeming unattainable, pitchfork has given out tens before, most recently to histoire de melody nelson (a reissue, though), but also to radiohead (both kid a and ok computer, in fact), wilco, neutral milk hotel and many others (also plenty of undeserving 10s, such as boards of canada).

    anyways, i'm trying to say that i hear you about music reviews, but i think that it's not the number 9.6 that should be worrisome but the fact that it's there; the fact that it's the lasting impression of the review. i think you mentioned it, but the number really only serves as a talking point: it's much easier to say "hey, can you believe it? the new ac got a 9.6 on pitchfork!" than it is for one to summarize a review themselves and then talk about that.

    therefore, i like your idea about reviews. in place of the grade should be a short sentence or two describing what the album simply sounds like and who may like it. then you can write a dissertation of your thoughts about the album and whether it's any good.

    2) about mixes:
    i would hate for the idea of carefully crafted mix cds to disappear. i listened to koze's "all people is my friends" the other day and you can tell he thought about what he was doing there. live recordings have a completely different feel, not bad, just different. i appreciate the time djs take to make their mix cds different from live sets, and i appreciate the intricacy they subject themselves to (though on a sidenote, i still dont know why koze does all that scratching on 'APIMF'). this is why i'm always buying mix cds; live sets are just different.

    3) soundboy punishments was reviewed by pitchfork and got a 7.2, though i agree that considering the label's quality and importance, this is a bit ridiculous. the idea that skull disco stopped due to the label's output sounding the same is nonsense. 'over here' and 'death is not final' are completely next level, especially the t++ remix. the last few skull disco releases changed dubstep and skull disco's sound completely.

    rant over

  8. @ minimill:

    I agree. Yet RA found it outside their remit to include Shackleton's game-changing 'Suicide Note' in their end-of-year polls, while a track as derivative and downright banal as Mole's 'Baby You're The One' made it in. Haven't taken them seriously since.

  9. @ minimill: Yeah, the scratching on APIMF is a bit of a head-scratcher (ahem) for me too ... although I think Koze had his start as a hip-hop DJ, which might explain it.

    I just read the Skull Disco review over at Pitchfork ... diminishing returns? WTF?

  10. Nice article, up there with other work from the same site, 8.6/10.

  11. The following deconstruction of that AC/P4K review, the Hipster Runoff response to said review and the critical circle-jerk that followed is worth reading within the context of your post:

  12. record reviews are a bit of a problem for me as well. my initial writing for RA was to do record reviews. i did one (J Fine on Fxhe) and then got a bit of the way through my second before i realised that i hated it and quit. i just can't think about music the way a "music reviewer" does.

    what we do at ISM is a bit different from just about any other place. instead of trying to cover EVERYTHING out there in our genres/styles of choice regardless of whether it is good or bad, we use our critical faculties as deejays to buy the same records we would normally and then just talk about them a bit. not relying on promos and the like is most definitely a big plus for what we do. essentially we have set it up so that you know what each reviewer's individual taste is thanks to their mixes and the fact that they only cover music they like. and if you like their taste, we give enough info for people to hopefully go check it out for themselves.

    essentially we are each a filter. that is what i am looking for when i look for reviews of anything from music to film to books. if someone is consistent in their taste, that's all i need. they can just give a yay or nay and i will go check it out. a 9.6 from a Pitchfork writer is meaningless to me. to be honest, there's nothing a Pitchfork writer could tell me about dance music that would be of any use to me. their monthly joints written by "experts" in the genres are about the only things i ever read on there.

    of course, when something that would ordinarily fall within our taste comes out and stinks up the joint, we're going to mention that, too. we are 100% behind being critical of bullshit. but we're not gonna review shit that is outside of our tastes in order to be "objective" about it or something. our opinions are valuable precisely because we are not objective. my talking about some generic Poker Flat record or some dubstep record is worthless to everyone. unless, of course, it is somehow outstanding and deserves to be played in one of my sets!

    i feel like our style falls somewhere in the middle of the 2 you point out. we do provide some small amount of sound description, but also try to give it a context since the sound itself is not really the defining part of the record. i think that is a good way to do it without sitting down and analyzing a record to death. i have never found anything like that to actually increase my enjoyment of any music. all my enjoyment comes from what happens as the music moves from my ears to my soul.

  13. RA's editorial team have a penchant for marking down the scores the original reviewer gave.

    There are plenty of well documented and in some cases public examples of this, plus if you read the comments on some of the reviews on the site you'll find more..

  14. haha just realised that Pitchfork doesn't even do the monthly techno column and shit any more. shows how much i pay attention to that site ;)

  15. Gotta say I agree with quite a lot of what you say, PC. My own personal problem with pitchfork boils down to how it sometimes seems to take the slightly glib attitude toward reviewing that lifestyle mags like xlr8r trade in. Not that there's anything wrong with that. It's just that I'd rather read a review that makes me want to listen to an album, rather than one that makes me want to buy the t-shirt.

    As regards RA: the max score is a 4.5 now, which seems a little more genteel to me. My understanding is that the editorial staff are also now assigning the scores based on what's been said in the review. I reckon that the scores are a useful tool for those who just want to see whether to check a record out, rather than hear about it more deeply through a reviewer's commentary.

    @pipecock: you make an excellent point when you say that reviewers shouldn't take on records outside of their taste. For example, I have no real interest in jazz. I recognize what's great about it, and I really wish it pushed my buttons, but it just doesn't, unfortunately. For this reason, I'm not going to go out and review a jazz record. To do so would be useless at best. Pitchfork taking on Skull Disco sounds exactly like that to me. Talking about 'diminishing returns' gets the wrong end of the stick so completely, it's clear the reviewer simply didn't understand the music very well.

    To me, dance music is rewarding in an important way: it speaks towards emotions that no previous music could. For most of its audience, however, I think edm is a more superficial thing than that. I can't find fault with that. Here's where I would tend to disagree a little with your essay Pete: I am very grateful to hear what the likes of the ssgs have to say. I don't think it's a crisis, however, if the Nathan Barleys are having their fun, too.

  16. who honestly reads pitchfork reviews anymore? they're a lifestyle magazine at this point.

    there are good places to keep up with newly released music on a regular basis. forcedexposure is probably the most diligent and expansive resource to check out what new records are distributed stateside. granted, the reviews quote one sheets and shit, but they do lots of cross-referencing. in general, it's a great database.

  17. I find the record reviews on Boomkat.com well written and informative. You can tell that the reviewer truly does have a passion for, and knowledge of, the genre and LISTENS before putting pen to paper.

    There have been plenty of times I've flicked through a sample which I was willing to pass on, then something about the review rang true for me which in turn made me appreciate the record more... Like I say, otherwise I would've passed on it.

  18. I told my students (architectural history) to look at this post for help with their essay (the bit where you outline the two different reviewing types).

    I think that bit is really well written.

  19. Firstly, thanks everyone for a great discussion - always a pleasure to read everyone's 'warm engagement'. Spread that warmth.

    @ Pitchfork: I think we'd be surprised just how many people read pitchfork, and the extent to which their reading 'frames' (like wayfarers, only cooler) the recording for its audience. The fact that this audience is increasingly a lifestyle category dominated by aesthetic considerations doesn't negate that to me... I think they're emblematic of the shift, and their popularity shows how they're both shaping the path and following it.

    ...I write from Melbourne though, which is indie rock city to the hilt... I doubt it has nearly as much traction in other places.

    BUT/BUT/BUT: the fact that Pitchfork engages with Skull Disco and finds it wanting, but doesn't feel any need to reflect on its own conditions of judgement (hey, maybe I don't like it 'cos I don't get it...) is also symptomatic.

    but/and at the same time, it seems important that people listen and critique out of their box.

    @ Pipecock on this tip: I *wish* ISM *would* engage outside its own cherished assumptions and the sovereignty of authentic (mostly black) innovators (which all comes back to this weird Platonic primordialism about people banging drums in caves) who have to be cherished and protected against (mostly white) imitators... it would be great if they would suspend the ambient prejudice against (for example) dubstep and bassline in order to cover it... but we all have our prejudices... (and it's not like SSGs cover either of those genres much, so mea culpa)

    @ Boomkat, Forced Exposure: it's interesting that both these sites are there to flog records though. The fact that some people have said that the best critical reviews come from sites that are openly about trying to persuade you to buy something... isn't that cause for alarm?

    ~ I mean no offence to the people who write the blurbs for either site, they're well written ~

    Does this not show, however, an inability to distinguish (or care) about the distinction between editorial and advertising content?


    I worry that tipsterism amounts to lifestyle advertorial: 'check this out'; 'this is cool'; 'essential purchase'; 'must have' etc, etc.

    Finally, the broader point I wanna get back to is the poverty of cool as a way of engaging with the world.

    It is a nasty, cold, stiffening posture, and I'm sick of it.

    ...the dominance of cool is everywhere: witness even BBC's Top Gear and their car rating system.

    'It's a very good car (record)... but is it cool?'

  20. The funniest thing about the Pitchfork reviews is that if you read the columns they're all about the "subjective" aspect of music, but they insist each review is graded down to the decimal point. (Occasionally two decimal points, but usually only when the reviewer wants to make a sophomoric joke.)
    The 9.6 is definitely inflated - I loved the album, but 9.6 is really the kind of rating that indicates an instant classic or something. Compare it to the ratings they gave to "Album of the Year" winners - _Person Pitch_ by Panda Bear got a 9.4, Deerhunter's _Microcastle / Weird Era Cont._ got a 9.2.

  21. i don't know why you would want us to do the exact same thing nearly every other techno blog is doing. that's not how we roll. if one of our writers was feeling dubstep, they would write about it. i don't think it's a coincidence that the people i chose to write for ISM are not feeling it.

    there's really no limit to what we cover to be honest, unless David Vunk's ridiculous italo selection is about protecting black music (which is something that we do, don't get me wrong!). we don't try to even cover 1/100th of all techno and house out there, we only give exposure to what we're feeling. nothing is forced.

  22. why would you need to read the blurb on boomkat when you can simply click play and make your own mind up?

  23. @ anonymous: Good question.... why have reviews if you can listen? I ask this as a genuine question...

  24. Q: why would you need to read the blurb on boomkat when you can simply click play and make your own mind up?


    Because not everyone has so much time that they can sit through sample after sample of new music, nor the means, e.g. if you’re at work. An unbiased review written by someone whose opinion you value can be very helpful or even enjoyable. It’s not like you’re going to base your opinion solely on what was written. You can listen then make your own judgement. Quite simple really.

    Granted it could be argued that it's biased because they want you to ultimately buy the music but for the most part I find this not to be the case.

  25. @ Del: maybe it's that... maybe it's also that reading a review gives you much more framing.

    ...by this I mean: if you use an mp3 player, put it on shuffle for a day and try not to look at the tracks you can't quite identify. It's very hard to 'just listen' to music without identifying it.

    ...perhaps the review isn't quite, or isn't just 'all about the music'.

    Maybe this feeds into what I was saying about decoding?

  26. Yes, I think it's exactly what you mean when speaking about decoding.

    A little information about the music you're listening to can be helpful. It can also put the recording into context which again for me can be important in understanding what the artist was trying to accomplish.

  27. "e) the fact that writing a compelling review that nails the recording is actually really, really, really difficult. Especially if the conclusions reached in the course of writing mean that the rating ends up having to be three stars."

    the normal distribution dictates that the overwhelming majority of records will be in this broad middle bracket, yeah? most records fit "some high points, some flat spots, your mileage may vary".

    it's annoying for a reviewer that after having listened five or ten times it's still not categorically good or bad, but that's the nature of the beast. 2.5 stars, bump it up slightly for charity's sake, 3 stars. NEXT!

  28. one thought on a contributing factor to the upwards creep of reviews at pitchfork, and maybe some other sites, offered without any research:

    1. reviewer gets a new album, gets excited about it partly because it's new.
    2. reviewer checks archives to see what the last album by that band got, or some other comparable item.
    3. reviewer thinks, "well that got an X. I like this one more, so it must get at least an X+.5"
    4. reviewer publishes review.
    5. time goes on, initial enthusiasm dies down.

    ..and loop back around for the next album, which receives the positive glow of newness, and therefore has to be better than an X+.5 because it's *so* much better than the last one, which got an X+.5.

    in short, when you're holding the shiny new thing in your hands, you forget how shiny the old thing used to be.


  29. This comment has been removed by the author.

  30. Anonymous is totally right. The recency bias is always going to create more of a buzz for a reviewer on discovering a new band or album.

    Perhaps then the best approach would be to take on the albums, give them time to bed in, review much later on.

    This takes us back to the reviewer reviewing material he/she would buy though, because I know you sure as heck wouldn't get me giving Tinchy Stryder the benefit of a "bed in" period.

    As an aside, thank you for this post and all these comments. You've given a gal the chance, in a world where her friends don't understand the importance of a good review (or music for that matter), to enjoy the pleasure of a proper virtual/internal debate.

  31. You've hit some nail on its head when you conclude with this statement: "If only we took the time to write carefully, to read carefully and to foster this culture of warm engagement (as opposed to the dominant culture of cool disengagement), we’d all be making an infinitesimal difference, and that adds up."

    HOWEVER, all this to'ing and fro'ing over sources like P4k and other "gatekeepers" of taste is not getting at the power from below that you call for in order to engage with music lovers. Isn't that the real point? Do we, as folks who write about music want to intimately engage with folks who want to hear about music or not? If engagement is what we desire then let us make a forum wherein that is possible.

    Enter the blog. And here is why blogs succeed and why music blogging increasingly becomes a vital source for information and for taste-making. The authority that was once vested in famous music editors is now much diluted by the proliferation of less famous but influential bloggers. Many will listen to something with a kinder ear if a blogger they think is cool promotes it.

    Now, I've just contradicted myself haven't I? Blogs are this intimate site of engagement yet a source of "cool disengagement" at the same time. Well, they don't have to be if bloggers take the time to write personal, introspective, engaging reviews of either your types 1 or 2. Let's just stop with these ridiculous four line descriptions and magic formulas (this band is like band x plus band y with a dash of band z all on steroids and jumping through a hoop of flames), i.e., useless bullshit.

  32. "...if bloggers take the time to write personal, introspective, engaging reviews..."

    a great blog which sometimes does this is the delightful teleosteopathy:


  33. @ a Tart: As for reviewing, I think it's extremely important to be descriptive. If you're writing as someone with no music theory to people with no music theory, sometimes the band x + band y + hoop 'on acid' thing can work, but yeah, usually it's just lazy.

    ...my view is that all reviews are interpretations. It's not a lab report, and this pretense toward 'objectivity' is rubbish...

    ...having said that though, as partisan as each reviewer should openly be, sledging for sledging's sake is unproductive. Even if/when you don't like something, you need to concede enough space to demonstrate that as personal and that the failings are in the context of other qualities...

    ...a lot of this can be achieved by being descriptive, by giving the benefit of the doubt, and conceding your peculiarity.

    ...but if it's a descriptive review and you get to the end and still have no idea what the recording sounds like, this is a failure of reviewing, to me.


Say something constructive, bitte. Or if you're gonna take a swipe, at least sharpen your nails.
