The Ratings Conundrum

Some time back, the Oregonian's main movie reviewer was a guy named Ted Mahar. His reviews were generally positive--and this was a problem. We took to nicknaming him "three-star Mahar" because he never really panned a movie or gushed over one. Everything was, in the manner of the children of Lake Wobegon, above average. And therefore his reviews were of little use.

Not too long ago, a friend of mine castigated me for turning into a three-star Mahar. It's true--most of my ratings fall in the range of A- to C+ (here's what they mean). I plead helplessness, though: most beers here fall in that range, too. Obviously, few beers qualify for the "world class" standard ("a superlative example of the style or an exceptionally original beer"). But few fall below the standard of a C+ beer, either ("a well-made beer that is a fairly common example of its style or a near miss on originality").

I am, obviously, a homer, and so you have to suspect me of putting my thumb on the scale. But scan through the aggregate scores on BeerAdvocate and RateBeer--they're no better. The causes may differ, but the upshot's the same: only in very rare exceptions do breweries put out either ill-conceived recipes or beer with off-flavors. We have a problem of compression.

I'm almost to the point of abandoning ratings altogether and just letting the descriptions stand on their own. (I've actually done that more and more with recent reviews.) It's easy enough to change the scale so there are more gradations between "common" and "world class," but that doesn't exactly resolve the problem. Worse, it exaggerates the effect of personal preference. I'd be willing to entertain some wholly novel style of rating, though, if only I could think of one. Any suggestions? How do you make sense of subtle differences between beers?