This Piece About Wine Tasting Being Bullshit Is Bullshit


The primary focus of mankind has long revolved around two functions: 1) how to distinguish oneself from others in order to gain access to a more appealing group of potential sexual partners, and 2) how to get those potential sexual partners drunk enough to bone. Not surprisingly, wine has been at the center of both goals for centuries. Another sort of brand differentiation has evolved in more recent years to aid in the former pursuit: the contrarian blog post. This one today from io9, titled “Wine tasting is bullshit. Here’s why,” ties all three together in a tidy package.

“There are no two ways about it: the bullshit is strong with wine. Wine tasting. Wine rating. Wine reviews. Wine descriptions. They’re all related. And they’re all egregious offenders, from a bullshit standpoint.”

The writer goes on to detail a series of studies and tests from the past decade or so, leaning on a “recent” (2009) one conducted by statistician and winemaker Robert Hodgson, that show the inconsistency at the heart of wine reviews. One of Hodgson’s studies found that the numerical scores given to wines by expert tasters varied significantly enough to prove, as the post extrapolates, that bullshittery is afoot. The other examples run through a sort of greatest hits of the anti-wine-snob pantheon, from the famed one in which tasters couldn’t tell the difference between a red and a white in a blind tasting, to the way outside factors like the setting, the time of day, and the comfort level of the reviewer alter the outcome of the rating. GOTCHA WINE NERDS.

The problem here is that there isn’t a single serious-minded person who cares about the quality of alcoholic beverages who wouldn’t readily admit as much. Is it really news that critics are human beings and not data-crunching robots? This plays out in any number of other fields as well. I don’t want to shock anyone out of their stupor of critical purity here, but movie, music, and art critics are often influenced by their humanity as well. Does anyone actually think there is a measurable statistical significance between, say, a record getting an 8.4 and a 7.5 on Pitchfork?

1) No, no one thinks this.

2) This blog post is bullshit.

A blog post being bullshit isn’t news, of course. But anti-wine pieces like this, and they pop up with regularity, do more than champion a populist anti-wine-snob stance. They actually make an argument for mediocrity. It’s a reductive, reactionary approach to the idea of quality itself. No one believes that a numerical review from a wine publication actually falls within a Platonic margin of error of +/- 5 points. It’s just a convenient shorthand we’ve all agreed to use to begin a consumer conversation about a product. Exposing the “lie” at the heart of wine reviewing like this always carries with it the suggestion that there is no such thing as a meaningful discussion to be had about quality at all. It’s an argument for blandness. People unconcerned with quality can point to stories like this and say, “See, I told you that wine business was all fake,” then proceed to order the most readily available, mass-market swill on hand, content in the knowledge that those fancy snoots are sniffing their own wine farts.


There is such a thing as quality in wine, and food, and spirits; it just isn’t easily reduced to a numeric ranking. Anyone who tells you otherwise is lying or stupid. Their taste gets a 45%.
