Marcus and I had an email exchange about Wikipedia while I was at the NLM Long Range Planning Panel meeting last week, sparked by his coming across the Nature article comparing Wikipedia and Britannica. He noted some of the changes Wikipedia will be making to address the legitimate criticisms that have been leveled at it -- specifically, that it will introduce a process for tagging an article as "stable" once it has reached a certain "quality threshold," that threshold to be determined by having users rate article quality.
There's an interesting epistemological issue here, and it is much broader than just Wikipedia -- what does it mean for us to know something? What, in fact, is "the truth"? The self-correcting principle that Wikipedia relies on could mean: give enough monkeys typewriters and they will eventually arrive at the truth; or, the truth is simply whatever enough monkeys eventually agree on. These are very different things.
The principle of "authority" in traditional reference works, now so derided by Wikipedia fans, rests on the notion that there is an actual objective truth to be discovered, and that it is best explicated by someone who has made a special study of the subject and is willing to put their name to it, so that their findings can be tested and challenged by other experts. (There is no implication that the "expert" might not get it wrong, and, of course, much of the meat of academic discourse is made up of the disagreements of experts.) But underlying the "hive-mind" approach of Wikipedia is the notion that the truth is whatever enough people agree on. Is this the wisdom of the hive-mind or the tyranny of the mob-mind?
Consider Wikipedia's discussion of "open access." While it gets many of the facts right, the tone of the entire article assumes that open access is an unmitigated positive. There is much on the advantages of open access, with very little discussion of opposing views (for example, the HighWire Press and DC Principles approaches, and the criticisms associated with them, are not even mentioned). While I personally agree with much of the article, it is ultimately a work of advocacy, not an objective, balanced presentation of the issues.
So what does that say about "truth"? If the majority of the Wikipedia editors (who've bothered to look at the article) agree with the tone, does that make it true? I suspect that under a user-rating scheme, the majority of Wikipedia readers would be inclined to rate the open access article highly, because it supports their own biases (supporters of traditional publishing practices are unlikely to be avid Wikipedia users). So would it be tagged as "stable," i.e., the definitive word?
In the long run, I completely agree that wiki technology offers the promise of better reference works, updated more quickly, with greater opportunity for more voices to be heard. But we need to give more thought to what it means to say that something is true and reliable. The history of lynching in America, to give just one particularly horrifying example, should give us pause whenever we are tempted to rely on the wisdom of crowds.