Technology Review: Wikipedia and the Meaning of Truth

My colleague David Stanley links to this article.

Many people, especially academic experts, have argued that
Wikipedia’s articles can’t be trusted, because they are written and
edited by volunteers who have never been vetted. Nevertheless, studies
have found that the articles are remarkably accurate. The reason is
that Wikipedia’s community of more than seven million registered users
has organically evolved a set of policies and procedures for removing
untruths. This also explains Wikipedia’s explosive growth: if the stuff
in Wikipedia didn’t seem “true enough” to most readers, they wouldn’t
keep coming back to the website.

These policies have become the social contract for Wikipedia's army
of apparently insomniac volunteers. Thanks to those volunteers, incorrect
information generally disappears quite quickly.

So how do the Wikipedians decide what’s true and what’s not? On what is their epistemology based?

Unlike the laws of mathematics or science, wikitruth isn't based on
principles such as consistency or observability. It's not even based
on common sense or firsthand experience. Wikipedia has evolved a
radically different set of epistemological standards: standards that
aren't especially surprising given that the site is rooted in a
Web-based community, but that should concern those of us who are
interested in traditional notions of truth and accuracy. On Wikipedia,
objective truth isn't all that important, actually. What makes a fact
or statement fit for inclusion is that it appeared in some other
publication, ideally one that is in English and available free
online. "The threshold for inclusion in Wikipedia is verifiability, not
truth," states Wikipedia's official policy on the subject.

—Simon L. Garfinkel, Technology Review