Wiki variations

In the beginning there was Wikipedia, and it was brilliant. Somehow, making a set of pages that anyone could edit worked. The result was not cacophony but the greatest public collection of knowledge that the world has ever known. And that’s pretty much where we’ve left things, which is a great shame, because there’s so much more to be explored here.

A set of revision-controlled, hyperlinked topic pages is a stupidly useful form. It seems too simple to improve. What we can experiment with is how the pages are produced — which really seems like a far more interesting problem anyway. We can also look at novel ways to use a wiki. Here’s a brain dump of all the different directions I can imagine pushing the classic concept.

Who can edit? Just because Wikipedia is open to all doesn’t mean that all wikis must be. Actually, not even Wikipedia is open to everyone; admins can “protect” pages, restricting editing in various ways temporarily or permanently, or in extreme cases ban users entirely. But the presumption is openness. There are other wikis that start the other way around, such as news organizations’ “topic pages,” which are only editable by staff. This control often results in a much more consistent product and may also serve to minimize errors, though I’ve never been able to find a quantitative comparison with professional journalism’s error rate. But the cost of being closed is that no one else can contribute. And sure enough, on most topics I find Wikipedia to be more comprehensive and up-to-date. Compare NYT vs Wikipedia on global warming.

Between entirely closed and entirely open there is a huge unexplored design space. The Washington Post’s WhoRunsGov, a directory of American government personnel, was an example of what I’m going to call a “moderated wiki.” Anyone could submit an edit, but the changes had to be approved by staff before going up. WhoRunsGov is no longer up, so perhaps it was not considered a success, but I don’t know anything about why it was shut down.

There are lots of other in-between possibilities. We could have a post-moderated wiki where changes are visible immediately but checked later, or employ any of the various reputation systems that are commonly used in community moderation; the basic idea is that proven editors have greater privilege and control. I can also imagine a system where all content is written by a small closed group, perhaps the staff of some organization, but the community votes on what articles need to be updated, and submits suggestions, links, etc. The staff then updates the pages according to the community priority. Openfile.ca embodies certain aspects of this.
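To make this concrete, here is a toy Python sketch of one point in that design space: edits from unproven authors wait in a queue for approval, while editors who have accumulated enough reputation publish immediately. All the names here (ModeratedWiki, AUTO_PUBLISH_REPUTATION, and so on) are my own invention, not any real system’s API.

```python
from dataclasses import dataclass

# Hypothetical threshold: editors at or above this reputation publish
# immediately (post-moderation); everyone else waits for staff review.
AUTO_PUBLISH_REPUTATION = 50

@dataclass
class Edit:
    page: str
    author: str
    new_text: str

class ModeratedWiki:
    def __init__(self):
        self.pages = {}       # page title -> live text
        self.queue = []       # edits awaiting staff review
        self.reputation = {}  # author -> score

    def submit(self, edit: Edit):
        rep = self.reputation.get(edit.author, 0)
        if rep >= AUTO_PUBLISH_REPUTATION:
            self._publish(edit)      # trusted editor: goes live at once
        else:
            self.queue.append(edit)  # unproven editor: held for approval

    def approve(self, edit: Edit):
        self.queue.remove(edit)
        self._publish(edit)
        # approved work earns reputation toward auto-publish rights
        self.reputation[edit.author] = self.reputation.get(edit.author, 0) + 1

    def _publish(self, edit: Edit):
        self.pages[edit.page] = edit.new_text
```

The single threshold is the simplest possible reputation rule; real community moderation systems layer on things like score decay, review of reviewers, and per-page trust levels.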

Another simple variation: I have not yet seen a publicly visible wiki that is editable by everyone within a large organization (as opposed to a few sanctioned authors). Organizations and communities already have elaborate structures for deciding who is “in” and who is “out,” and this could translate very naturally into editing rights.

Specialized Wikis. It’s going to be extraordinarily hard to produce a better general reference work than Wikipedia, with its millions of articles in dozens of languages and tens of thousands of editors. But your organization or community might know far more about finance, or green roofs, or global media law, or… Each topic potentially has its own community and its own dynamics that could lend themselves to different types of editing schemes.

For that matter, Wikipedia’s content is freely re-usable under its Creative Commons CC-BY-SA license. It would be perfectly permissible to build a wiki interface that displayed specialized pages where they exist, and Wikipedia content where they do not. Essentially, this is the choice to take editorial control of a certain small set of pages, while retaining the broad utility of a general reference.
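A fallback interface like that is simple to sketch. Here’s a rough Python version that serves a locally controlled page when one exists and otherwise pulls a summary from Wikipedia’s public REST API. The local_pages store is hypothetical, and a real implementation would need caching and proper CC-BY-SA attribution.

```python
import json
import urllib.parse
import urllib.request

def get_page(title, local_pages):
    """Return the locally edited page if one exists, else fall back
    to Wikipedia's REST summary endpoint."""
    if title in local_pages:
        return local_pages[title]  # editorially controlled version
    url = ("https://en.wikipedia.org/api/rest_v1/page/summary/"
           + urllib.parse.quote(title.replace(" ", "_")))
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    # CC-BY-SA text; a real UI must attribute and link back to Wikipedia
    return data.get("extract", "")
```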

Combine wiki and news content. For most people, the news isn’t really comprehensible without detailed background information. And vice versa: after reading a wiki article, I’m probably far more interested in the most recent news on that topic. It seems natural to build a user interface that combines a wiki page with a news stream on that topic, and several news organizations have tried this. But I haven’t found an example that really sings. For me, this is largely because they don’t leverage the broader world of available content. Where is the Wikipedia/Google News mashup?

The revision history of a page, the list of every edit over time, is also a form of recorded news. James Bridle’s 12-volume edit history of “The Iraq War” makes this point beautifully. His work is paper performance art, but the concept has a natural online interpretation: a wiki that automatically highlights the sentences that have changed since the reader last visited that page. Rather than asking readers to construct the whole story from the updates, we would be showing them where the updates fit into the whole story. At least one experimental news site has tried this.
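Here is a minimal sketch of how that highlighting could work, using Python’s standard difflib to compare the revision the reader last saw against the current one. The naive split on periods stands in for a real sentence tokenizer.

```python
import difflib

def changed_sentences(old_revision, new_revision):
    """Return sentences in the current revision that were added or
    changed since the revision the reader last saw."""
    old = [s.strip() for s in old_revision.split(".") if s.strip()]
    new = [s.strip() for s in new_revision.split(".") if s.strip()]
    matcher = difflib.SequenceMatcher(a=old, b=new)
    changed = []
    for op, _, _, j1, j2 in matcher.get_opcodes():
        if op in ("replace", "insert"):
            changed.extend(new[j1:j2])  # these get highlighted in the UI
    return changed
```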

Authorship tracking. Although it is possible in principle to use the revision history on any Wikipedia article to determine who wrote what, both the culture and the user interface discourage this. This is not the only option. The U.S. intelligence community has Intellipedia, which logs authorship:

It’s the Wikipedia on a classified network, with one very important difference: it’s not anonymous. We want people to establish a reputation. If you’re really good, we want people to know you’re good. If you’re making contributions, we want that known. If you’re an idiot, we want that known too.

This also works the other way around: the reputation of the author translates into credibility of the text. I’m not clear on exactly how Intellipedia’s attribution system works; perhaps it simply requires authenticated user logins, or maybe it includes UI features such as appending a user name to each contributed paragraph. One could also imagine systems that constructed a list of bylines based on who wrote how much in the current article. The “blame” function of software version control systems is a technical precedent for automatically tracking individual contributions in a collaboratively edited file.
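For illustration, here is a rough Python sketch of a blame-style algorithm for a wiki: walk the revisions in order, and attribute each surviving line of the final text to the editor who introduced it. This is my own simplification, not Intellipedia’s actual mechanism.

```python
import difflib

def blame(revisions):
    """Given a list of (author, text) revisions in chronological order,
    attribute each line of the final text to the revision that
    introduced it, in the spirit of version control's "blame" view."""
    lines, authors = [], []
    for author, text in revisions:
        new_lines = text.splitlines()
        matcher = difflib.SequenceMatcher(a=lines, b=new_lines)
        new_authors = []
        for op, i1, i2, j1, j2 in matcher.get_opcodes():
            if op == "equal":
                # surviving text keeps its original author
                new_authors.extend(authors[i1:i2])
            else:
                # inserted or rewritten lines belong to this editor
                new_authors.extend([author] * (j2 - j1))
        lines, authors = new_lines, new_authors
    return list(zip(authors, lines))
```

A byline list then falls out almost for free: collections.Counter(author for author, _ in blame(revisions)) gives each editor’s share of the current text.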

Sourcing and attribution. Wikipedia has three core content policies: neutral point of view (NPOV), no original research, and verifiability. Together, these policies describe what counts as “truth” in the Wikipedia world. NPOV is roughly equivalent to the classic notion of journalistic “objectivity,” no original research says that Wikipedia can never be a primary source, and verifiability says that all statements of fact must be cited (and defines, loosely, what counts as a reputable source for citations).

The citation system used to enforce verifiability has its roots in age-old scholarship practices, while the no original research policy was originally drafted to exclude kooks with fringe theories. Together they have another extremely important effect: they offload the burden of credibility. Without these policies, the credibility of information on Wikipedia would have to lean far more heavily on the reputation of its authors; difficult to establish, since neither authorship nor authors are well tracked. By depending on the credibility of outside sources, Wikipedia was able to bootstrap from existing systems for authoritative knowledge, while maintaining the flexibility to incorporate any reasonable source.

There’s no reason that an already-credible organization couldn’t choose differently. Scientific journals, news organizations, government agencies, and the like routinely act as the original publisher of crucial information, and it seems a small step to say that they could put that information in a wiki. The wiki would be credible to the extent that the organization is considered a credible author, which means that authorial tracking would also be required; perhaps certain “source” pages could be designated read-only, or all edits could be moderated, or there could be fine-grained attribution of text. The key point is that the user interface clearly distinguishes text that has been authoritatively vetted from text that has not.
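As a sketch of what that distinction might look like in the data model, here is a hypothetical Python fragment where each paragraph carries a vetted flag that the rendering layer surfaces. The marker strings are placeholders for whatever visual treatment a real UI would use.

```python
from dataclasses import dataclass

@dataclass
class Paragraph:
    text: str
    author: str
    vetted: bool  # True once the organization has signed off on this text

def render(paragraphs):
    """Render a page so vetted and unvetted text are visibly distinct.
    A real UI would use styling; plain-text markers stand in here."""
    out = []
    for p in paragraphs:
        marker = "[verified]" if p.vetted else "[community draft]"
        out.append(f"{marker} {p.text} ({p.author})")
    return "\n".join(out)
```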

Shared text. We need shared texts because we need shared understandings of the world. Without them, collective action becomes impossible and we all suffer. Wikipedia is an ambitious project to create a global knowledge system that is more or less acceptable to all people. The neutral point of view policy is important here, but the wide-open nature of Wikipedia is perhaps more essential to this vision. By definition, a consensus article is something that everyone is happy with; if an article ever reaches a state where it is agreeable to all factions, there is no motivation for anyone to edit it further. That this happens for so many pages, even on contentious topics, is remarkable. The mechanics behind this are actually fairly extensive, including an elaborate tiered system of volunteer dispute resolution that usually stabilizes edit wars.

There are variations here too. We could explore other methods of dispute resolution, or we could get more sophisticated about Wikipedia’s policy of representing multiple points of view. We could try to map the viewpoints of different authors directly, or we could have multiple versions of a page, each open to a different faction, and then compare the resulting texts to better understand where the differences lie. As always, there is no reason to imagine that “completely open” is the only option; but some openness seems essential.

And this cuts to the heart of what is unique about the wiki form. Open texts have a special legitimacy precisely because they are fragile: they can only exist when all who have an interest in the outcome manage to work together to create and preserve them. Wikipedia shows that this is possible in many more cases than we thought, but it is hardly the final word.