When news broke on May 8 about the arrest of a half-dozen young Muslim men for supposedly planning to attack Fort Dix, alongside the usual range of reactions — disbelief, paranoia, outrage, indifference, prurience — a newer one was added: the desire to consecrate the event’s significance by creating a Wikipedia page about it. The first one to the punch was a longtime Wikipedia contributor known as CltFn, who at about 7 that morning created what’s called a stub — little more than a placeholder, often just one sentence in length, which other contributors may then build upon — under the heading “Fort Dix Terror Plot.” A while later, another Wikipedia user named Gracenotes took an interest as well. Over the next several hours, in constant cyberconversation with an ever-growing pack of other self-appointed editors, Gracenotes — whose real name is Matthew Gruen — expanded and corrected this stub 59 times, ultimately shaping it into a respectable, balanced and even footnoted 50-line account of that day’s major development in the war on terror. By the time he was done, “2007 Fort Dix Attack Plot” was featured on Wikipedia’s front page. Finally, around midnight, Gruen left a note on the site saying, “Off to bed,” and the next morning he went back to his junior year of high school.

Photo (Net Neutral): Matthew Gruen, 16, is one of thousands of Wikipedians devoted to policing Wikipedia's neutrality. Credit: Danielle Levitt

Wikipedia, as nearly everyone knows by now, is a six-year-old global online encyclopedia in 250 languages that can be added to or edited by anyone. (“Wiki,” a programming term long in use both as noun and adjective, derives from the Hawaiian word meaning “quick.”) Wikipedia’s goal is to make the sum of human knowledge available to everyone on the planet at no cost. Depending on your lights, it is either one of the noblest experiments of the Internet age or a nightmare embodiment of relativism and the withering of intellectual standards.

Love it or hate it, though, its success is past denying — 6.8 million registered users worldwide, at last count, and 1.8 million separate articles in the English-language Wikipedia alone — and that success has borne an interesting side effect. Just as the Internet has accelerated most incarnations of what we mean by the word “information,” so it has sped up what we mean when we employ the very term “encyclopedia.” For centuries, an encyclopedia was synonymous with a fixed, archival idea about the retrievability of information from the past. But Wikipedia’s notion of the past has enlarged to include things that haven’t even stopped happening yet. Increasingly, it has become a go-to source not just for reference material but for real-time breaking news — to the point where, following the mass murder at Virginia Tech, one newspaper in Virginia praised Wikipedia as a crucial source of detailed information.

So indistinct has the line between past and present become that Wikipedia has inadvertently all but strangled one of its sister projects, the three-year-old Wikinews — one of several Wikimedia Foundation offshoots (Wikibooks, Wikiquote, Wiktionary) founded on the principle of collaboratively produced content available free. Wikinews, though nominally covering not just major stories but news of all sorts, has sunk into a kind of torpor; lately it generates just 8 to 10 articles a day on a grab bag of topics that happen to capture the interest of its fewer than 26,000 users worldwide, from bird flu to the Miss Universe pageant to Vanuatu’s ban on cookie imports from neighboring Fiji. On bigger stories there’s just no point in competing with the ruthless purview of the encyclopedia, which now accounts for a staggering one out of every 200 page views on the entire Internet.

The tricky thing is, the process by which Wikipedia usually, eventually gets things right — the notion that mistakes in a given entry, whether intentional or unintentional, will ultimately be caught and repaired as a function of the project’s massive, egalitarian oversight — doesn’t seem as if it would work when people are looking for information about events unfolding in real time. How on earth can anyone be trusted to get the story right when any version of the story is only as accurate, or even as serious, as the last anonymous person to log on and rewrite it?

Nothing is easier than taking shots at Wikipedia, and its many mistakes (most often instances of deliberate vandalism) are schadenfreude’s most renewable resource. But given the chaotic way in which it works, the truly remarkable thing about Wikipedia as a news site is that it works as well as it does. And what makes it work is a relatively small group of hard-core devotees who will, the moment big news breaks, drop whatever they’re doing to take custody of the project and ensure its, for lack of a better term, quality control. Though Wikiculture cringes at the word “authority,” in a system where a small group of people has the ability to lock out the input of a much larger one, it’s pure semantics to call that small group’s authority by any other name. Still, the only way to install yourself in that position of authority on Wikipedia is to care about it enough. So who are the members of this all-volunteer cadre, and why should it matter so much to them whether Wikipedia is any good at all?

Jimmy Wales, the founder and watchmaker-god of Wikipedia, isn’t completely sure who they are either, but that’s fully in keeping with the radically decentralized culture of the project itself. (Wikipedia last did a survey of its own users in 2003, and it has no plans for another one.) He did bristle when I suggested that they tended to be in their early 20s or even younger — editing encyclopedia entries, he said, is “not a young person’s hobby” — but after we traded stories for a while, he admitted that his own evidence was as anecdotal as my own.

“I’m always traveling and meeting Wikipedians everywhere,” he said, “but there’s probably some bias in the sample that I get, in that the very youngest may not be able to go out and meet us on a Thursday night for a beer because they have school the next day. So I may get a bit of a skew.”

Wales is a soft-spoken man with the unnerving focus of a person who sees something nobody else in the room sees. The word that comes up most often in his conversation is “interesting,” and that’s as good a key as any to understanding the mind-set of someone who has put in motion a public project so vast that he no longer has any real power over it. Not that such power is anything he covets. In his former life he made a killing as an options trader in Chicago, but his manner is much more monastic than that might suggest. We met at his rather shockingly modest hotel just south of Times Square, and while I could not source this statement well enough to satisfy an ardent Wikipedian, I would feel comfortable wagering that he is the only person on Time magazine’s list of the World’s 100 Most Influential People to have recently passed a night in that particular fleabag.

Wales’s own modesty sets the tone for the whole enterprise; still, Wikipedia does feature — though many users will deny it zealously — a kind of rudimentary institutional hierarchy. Among the 4.6 million registered English-language users are about 1,200 administrators, whose “admin” status carries a few extra technical powers, most notably the power to block other users from the site, either temporarily or permanently. Those nominated for adminship must answer an initial series of five questions, after which other users have seven days to register their approval or disapproval. Above the admin level are the cheekily named “bureaucrats,” who are empowered to appoint the admins and will do so if they deem a user consensus has been reached (the magic number is somewhere around 70 percent approval). There is also a level above the bureaucrats, called stewards, of whom there are only about 30, appointed by the seven-person Wikimedia Foundation board of directors. The higher up you go in this chain of authority, the humbler the language its occupants use to describe their status: they compare themselves frequently to janitors or, more tellingly, to monks. There is an unmistakably religious tone to this embrace of humility, this image of themselves as mere instruments of the needs and will of the greater community. (The encyclopedia’s guiding principles are known as the Five Pillars.) The level of devotion to this ideal can get a little cultlike: one admin insisted to me that the vote by which he was elevated was not a vote at all but a “community consensus,” though he allowed that the means by which this consensus was reached did have “votelike tendencies.”

Though Wales is right that there are plenty of devoted Wikipedians out there who are upward of 25 years old, most of those who do the hard-core editing on a breaking news story seem to be at the younger end of the spectrum. Part of the reason for that may be that high-school and college students are much more likely than older folks to have six or eight hours at a stretch to devote to something on the spur of the moment. But there is also something uniquely empowering — for better or for worse — about Wikipedia, in that there is no real organizational ladder to climb: since everyone contributes behind screen names (which may or may not match their real ones), questions of age, appearance, experience and so forth don’t color the discussion. The only way to achieve a degree of authority in the world of Wikipedia is to show sufficient devotion to it, and that can happen in relatively short order. Gracenotes, for instance, was considered for admin status in part for his work on the Fort Dix story, and in part as a simple consequence of the fact that he will often, after his homework is done and his church responsibilities are fulfilled, spend six hours or more a night cleaning up errors in the encyclopedia. An amateur programmer and calculus buff who lives near Poughkeepsie, N.Y., he became seriously involved with Wikipedia just about eight months ago, after his parents ordered him out of a different online community of which they did not approve.

When you’re talking about Gracenotes and Wikipedians like him — people who, though they work very hard, generally do so without leaving their bedrooms — what does “news” even mean? The presentational difference is that Wikipedia’s version of events comes in the form of one constantly rewritten, constantly updated summary article, rather than a chronological series of articles, each reflecting new developments, as newspapers and even most news sites do. But much more significant than that, no Wikipedia article contains any attempt at actual reporting — in fact, original research is forbidden.

The rule, according to Wales, is “not out-of-context absolute” — if he, or some other trusted Wikipedia user, happened to be present at some catastrophic event and took a picture of it, that picture wouldn’t necessarily be removed from the site — but in general, he explained, “it’d be too easy to be hoaxed. And anyway, an encyclopedia is really not where you should go for that. Britannica doesn’t publish original research. An encyclopedia is the condensation of received wisdom.”

For real-time received wisdom, there’s pretty much one place to go in today’s world, and that’s Google. Thus Gruen fleshed out the Fort Dix story entirely by searching sources on Google and its offshoot, Google News. During the editing frenzy on May 8, he told me, “There was one dispute where somebody thought we should be using the word ‘alleged’ a lot more than we were, because it was, like, how do you know they were really planning on doing it? But I was kind of against too much use of ‘alleged,’ because, well, I don’t know, I just kind of felt that the F.B.I. was a pretty reliable source.” At which point thousands of dead journalism professors turned over in their graves.

But even when Wikipedia’s function is journalistic, its aim is not; rather than report the news, the goal is to act as a kind of phenomenally fast, bias-free digest of what others have already reported elsewhere. On a big news day, Wikipedia functions like a massive, cooperative blog — except that where most blogs’ function is to sieve news accounts through the filter of strong opinion, Wikipedia’s goal is the opposite: it strives to filter all the opinion out of it. With 10 or 20 or 50 pairs of eyes on every available news account, if one fact, or one loaded word — “terror,” say — appears in one of those accounts but not the others, Wikipedia’s own version will almost always screen it out. Not exactly investigative journalism, but it doesn’t pretend to be; it relies on others for that.

Natalie Martin, a 23-year-old history major at Antioch College in Ohio, was granted admin status last winter after contributing to the site for about nine months. She thought at one point in her life that she wanted to be a journalist, she said, “but then I decided that my only real interest in newspapers is fixing all the comma mistakes.” Martin works at the circulation desk of a local library — a job that often leaves her attention less than fully engaged, in which case she logs onto Wikipedia and looks for errors. Her usual M.O. is to check the “recent changes” page, a running log of the most recent edits made anywhere on the site, no matter how large or small. It gives you some sense of the project’s scale to learn that the roughly 250 most recent changes to the English-language Wikipedia were made in the last 60 seconds.
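
For readers who want to see that machinery for themselves, the “recent changes” log Martin watches is also exposed through MediaWiki’s public query API. The short Python sketch below is purely illustrative, not anything Martin or Wikipedia prescribes; it pulls the latest edits using the standard recentchanges parameters, and everything beyond those documented parameter names is an assumption made for the sake of example.

```python
import requests

# Illustrative sketch only: fetch the latest entries from the English-language
# Wikipedia's "recent changes" log via the standard MediaWiki query API.
API_URL = "https://en.wikipedia.org/w/api.php"

params = {
    "action": "query",
    "list": "recentchanges",
    "rcprop": "title|user|timestamp|comment",  # fields to return for each edit
    "rclimit": 50,                             # how many recent edits to fetch
    "format": "json",
}

response = requests.get(API_URL, params=params, timeout=10)
response.raise_for_status()

# Print one line per edit, newest first -- roughly what the on-site log shows.
for change in response.json()["query"]["recentchanges"]:
    print(change["timestamp"], change["title"], "by", change.get("user", "(hidden)"))
```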

On a normal day, she told me, much of her work involves finding and instantly reverting vandalism, usually profanity from bored schoolchildren. Messing with a Wikipedia page requires no hacking skills whatsoever; thus vandalism is endemic there. Though the admins are loath to give vandals special attention in any form, the fact is that there are some who earn their grudging admiration, if only for their sense of humor. Stephen Colbert, in his fake-newsman persona at least, has been a regular tormentor of the site, urging his viewers to change a given fact en masse; when the words “Colbert Alert” appear on the admins’ chat forum, 20 or more of them will rush to the ramparts of a targeted page. And users with a mind-boggling amount of time on their hands can sometimes raise their defacement to conceptual levels — for instance, Willy on Wheels, a legendary Wikipedia user whose vandalism consisted of adding the words “on wheels” to the headlines of literally thousands of entries. Uncountable hours were spent deleting those two graffitilike words from all over the site. He or she even spawned numerous copycats, and to this day, anyone whose user name contains the phrase “on wheels” runs the risk of having his or her editorial privileges summarily revoked.

On April 16, though, Martin’s routine vandal-catching and grammar-policing was altered by news of the shootings at Virginia Tech. By the time she got to the site, a few hours after the shooting itself, the brand-new “Virginia Tech Massacre” page had already been contributed to or vandalized hundreds of times, but she took control and has personally made more than 200 edits to the story. “It happened rather quickly,” she said, “and there were maybe a dozen people that were paying very close attention: information would break, and we would talk about how it should be phrased, how the gun-politics stuff should be phrased in a way that’s neutral and doesn’t use some of the loaded terminology that each side tends to use. I was not online when the name of the gunman was released, but I imagine somebody went and added it to the page within 30 seconds. Because that seems to be what happens.”

She and that dozen or so others decided to use one of their technical privileges as admins to “semiprotect” the page, meaning they have locked out any would-be contributions from anonymous users or users with registered accounts less than four days old. (The more seldom invoked “full protection” prevents anyone but admins from editing a page.) They made the decision, Martin said, because the level of vandalism was “just ridiculous. Sometimes it’s not malicious, it’s people who want to put in their opinion, or they put in ‘Go Virginia Tech!’ or something like that, which is sweet but really inappropriate in this particular venue. People liked to append the word ‘evil’ in front of the name of the perpetrator, and that’s again, like, O.K., sure, I don’t know or care if he’s evil, really, I don’t even know what that means, but it doesn’t belong in an encyclopedia.”

Martin was self-deprecating about her reasons for devoting so much time to Wikipedia — spending your leisure time hunting down strangers’ spelling errors does, she said, “feed interesting character traits” — but when pressed she offered a rationale that might seem disingenuous if it weren’t echoed by so many of her colleagues: pride of ownership. Virginia Tech, she said, “was one of the more visited pages for a few days, and I know that for myself and probably for some other people there was a desire to put our best face out to the world. A lot of people did go to the page just to look and see, and it was important to me that they see something very good, very professional and put together, not covered in vandalism, not excessively long, not excessively editorialized, just the best that we could do with the limited information that we had.”

Martin’s main comrade on the day of the Virginia Tech shootings was a fellow admin she has never met: “I don’t know his real name,” she says, “but his handle would be Swatjester.” Swatjester turns out to be Dan Rosenthal of Palm Beach Gardens, Fla.; Rosenthal, who is 24, graduated in April from Florida State University, a little later than planned, because the National Guard unit of which he was a member was deployed for the initial invasion of Iraq and stayed there for a year. He is now the national legislative director of a war-veterans’ organization, doing everything from answering questions about pending legislation to helping suicidal vets find mental-health treatment. In the fall he will start law school at American University in Washington. On top of all that, he is such a devoted Wikipedian — editing and resolving disputes on the site eight hours a day or more, with a watch list that has ballooned to 4,000 articles — that he recently made a pop-in visit to Wikimedia’s modest headquarters in St. Petersburg and left with an unpaid internship in legal affairs. His parents, he told me, are only marginally happier with this pursuit than they were back when he spent hours each day playing video games.

I visited Rosenthal’s student apartment in Tallahassee, where he was living with his three roommates; the place overflowed with musical instruments and video games and unwashed dishes, just like any student habitat, and maybe for that reason, Rosenthal will often take his laptop to a local cigar bar and fulfill his admin duties there while having a Scotch. On the afternoon we met, though, we settled for a quiet sushi restaurant in a local strip mall with a wireless connection. “I’m trying to teach myself to eat more slowly,” he said with a smile. “It’s an Army thing. I’m used to having 20 seconds to eat, and if you talk that means you’re finished.”

You might think, having experienced the Iraq war himself, he would be tempted to correct mistakes or look for N.P.O.V. (“neutral point of view”) violations on Wikipedia’s many war-related stories, but he’s far too much of a true believer in the project to allow himself to do it. “In the beginning,” he told me, “when I didn’t quite understand all the rules, I got a little bit involved with it, but as I started to learn the system better, I just don’t edit that kind of thing anymore. It sets a bad precedent.”

He logged on and took me through an average editing session; unsurprisingly, everything went by at about the speed of light, but a few things about Wikisociety, as he called it in a moment of professed weakness (“I hate putting ‘Wiki’ in front of everything”), did become clear. For one thing, it’s a myth that any entry, no matter how frivolous, can find a place on Wikipedia — or, rather, the myth is that anything that goes on there stays on there. The presence of (in one admin’s embarrassed phrase) “five million Pokémon articles” notwithstanding, a great many entries are deemed unworthy even of Wikipedia’s catholic attention and are deleted within days, hours or even minutes. There are elaborate criteria for deletion or, in extreme cases, immediate deletion (“patent nonsense,” for instance, is grounds for the latter). But another extraordinary aspect of Wikipedia is its almost fanatical transparency: every change made to every article, no matter how small, is preserved and easily accessible forever. Exceptions can be made in rare cases, as when a vandal adds to a page somebody’s home phone number or — as in a recent controversy that left the more diligent admins exhausted — open-source buccaneers start randomly inserting the legally controversial HD DVD decryption code into articles all over the site.
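
That transparency, too, is visible through the same public API: every saved revision of a page can be listed on demand. The sketch below, again merely illustrative and in no way the admins’ own tooling, asks for the last few revisions of a single article; the page title is only an example, and any title would do.

```python
import requests

# Illustrative sketch only: list recent saved revisions of one article via the
# standard MediaWiki query API (prop=revisions), showing that a page's full
# edit history remains publicly retrievable.
API_URL = "https://en.wikipedia.org/w/api.php"

params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Virginia Tech shooting",      # example title only; any page works
    "rvprop": "ids|timestamp|user|comment",  # fields to return per revision
    "rvlimit": 10,                           # how many revisions to list
    "format": "json",
}

response = requests.get(API_URL, params=params, timeout=10)
response.raise_for_status()

# The API keys results by internal page ID; walk each page and print revisions.
for page in response.json()["query"]["pages"].values():
    for rev in page.get("revisions", []):
        print(rev["revid"], rev["timestamp"], rev.get("user", "(hidden)"),
              rev.get("comment", ""))
```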

By the time we finished our lunch, 96 new pages had been nominated for deletion already that day — the backlog can grow to as many as 800 — and Rosenthal delivered the death blow to a few others while we sat there. “Take this guy,” he said. “ ‘Self-proclaimed write-in candidate for U.S. president.’ This article’s gonna get deleted. There’s no hope for it.”

There is a rough philosophical divide among admins between “deletionists” and “inclusionists,” and Rosenthal, even though he has never actually met anyone from Wikipedia outside the office staff in St. Petersburg, is on the site so often that he has a pretty good idea who’s who. He associates certain user names with certain political biases, and he recalls an online dustup with someone called Slimvirgin over whether the Animal Liberation Front was a terrorist organization. Personalities can become so pronounced in these debates that some even achieve fame of a sort on snarky Wikipedia anti-fansites like Encyclopedia Dramatica, where Slimvirgin has been thoroughly pilloried. “It’s disgusting on one level,” Rosenthal said, “but it’s also funny how the encyclopedia has gotten to be more about the community behind it. And like any community, it has its drama. For people that don’t understand it and don’t have an inclination to get involved in it, it’s pretty daunting.”

Wikipedia’s morphing into a news source, Rosenthal said, “is an inevitable step. Because the software is absolutely perfectly suited to that. And the rules, I’m sure unintentionally, are perfectly suited to it, with the emphasis on verifying and the neutral point of view.” As for Wikinews, he offered, with brutal kindness, that it was a good place for news that “doesn’t make Wikipedia’s radar.” Even Wales admitted, with something of a wince, that on any substantial news story, Wikinews is pretty well consigned to redundancy by its more successful sibling. “It’s something that the Wikinewsies have at times, I’ve felt, been a little prickly about,” Wales said. “But it’s just the sheer volume of people, also the sheer audience. Wikipedia derives a certain degree of authority and trust in the mind of the reader by avoiding original research and citing sources. And for whatever flaws there are in the model — you pick up the newspaper once a week and you’ll see some horrible thing from Wikipedia — if you read it generally throughout, you’ll find it’s pretty good; and you’ll recognize that there is a process here that does a pretty good job of getting at something important.”

The massive energy generated by that sense of the project’s importance is certainly alive in Rosenthal. “For me,” he said, “it really comes down to the advancement of human knowledge. I see heaps of praise being put on Google and Yahoo and Microsoft for being different, and they’re all just search engines. It’s nothing different. But this is something that’s never been attempted before. It’s just so unconventional, and it works. And it’s completely for the betterment of humanity. There’s no downside to it. I think that’s incredible. And I don’t want to see people take this thing that is 100 percent beneficial by itself and turn it into something negative by putting on whatever vandalism they like, or libeling someone, or whatever the effect is.”

For something alternately heralded and feared as a harbinger of some brave new world, Wikipedia has a lot of old-fashioned trappings; in fact, within its borders it generates its own special brand of kitsch. Users get news about the site via a mocked-up newspaper called The Wikipedia Signpost; the series of high-tech discussion forums on which the admins communicate is called the Village Pump, and a forum for Wikinews is dubbed the Water Cooler. And users salute their peers, on a purely informal basis, by awarding one another special citations called Barnstars — a term derived from the decorative good-luck charms used to commemorate successful barn-raisings two or three centuries ago. “For being kind and simple in the face of my very stupid mistake,” reads one of the citations on Martin’s own user page, “I hereby award you this Random Acts of Kindness Barnstar :).” It’s enough to set your teeth on edge.

But the kitsch is also a key to something, because in the end what’s most encouraging about Wikipedia isn’t all that’s new about it but all that isn’t. It is in some ways a remarkably old-school enterprise; for one thing, it is centered almost entirely on the carefully written word.

“The classic question I get at conferences,” Wales said, “is, ‘Do you think Wikipedia will remain text, or will it be more and more video in the future?’ I think it’s pretty hard to beat written words. Especially for collaboration, because words are the most fluid medium for shaping and reshaping and collaboratively negotiating something. It’s kind of hard to do with video, and I don’t think that’s just a technical barrier.”

And then there is the notion of the neutral point of view. It’s easy to forget how far out of fashion that idea has fallen, particularly in the Wild West milieu of the Internet. The N.P.O.V. is one of Wikipedia’s Five Pillars. When asked why that neutrality is something whose value they’ve internalized so deeply, some of the admins I talked to used a rather neutral word themselves: information freed from opinion, they said, is “useful,” where information burdened by it is not. But it doesn’t take much digging to see that the question has a moral component as well.

There was, of course, already a Jerry Falwell article on Wikipedia, but the day of his death in May saw a predictable spike in traffic. When I first logged on, I didn’t have to scroll far before coming across an obvious bit of teenage vandalism, concerning an unprintable cause of death that the writer evidently felt would, if true, have meted out a certain poetic justice. That bit of editorializing, a matter of fewer than a dozen words in all, was gone from the page in two minutes.

“I’m actually surprised it took that long,” Sean Barrett, a Los Angeles-based I.T. consultant and Wikipedia admin, told me. “I went to the Falwell page myself as soon as I heard that he was dead; high-profile things like that, breaking news, we’ve learned to be proactive. I’m sure hundreds of administrators put Jerry Falwell on their watch list.”

But it wasn’t just the longtime admins who were hashing out the complexities of how to give this polarizing figure his neutral due. A furious dialogue went on all day on the discussion page that shadows that (and every) Wikipedia entry; one comment from a user named Shreveport Paranormal read, in part: “Despite my personal dislike of him, the man did just pass away. . . . This is a place that is supposed to give accurate information. . . . The only way we can keep to the purpose of Wikipedia is to remain unbiased.” Not that extraordinary a sentiment, perhaps, until you take into account that Shreveport Paranormal (according to his user profile) is a teenager, and Roman Catholic, and gay. And that he had been a Wikipedia user, at that point, for two days.

Wikipedia may not exactly be a font of truth, but it does go against the current of what has happened to the notion of truth. The easy global dissemination of, well, everything has generated a D.I.Y. culture of proud subjectivity, a culture that has spread even to relatively traditional forms like television — as in the ascent of advocates like Lou Dobbs or Bill O’Reilly, whose appeal lies precisely in their subjectivity even as they name-check “neutrality” to cover all sorts of journalistic sins. But the Wikipedians, most of them born in the information age, have tasked themselves with weeding that subjectivity not just out of one another’s discourse but also out of their own. They may not be able to do any actual reporting from their bedrooms or dorm rooms or hotel rooms, but they can police bias, and they do it with a passion that’s no less impressive for its occasional excess of piety. Who taught them this? It’s a mystery; but they are teaching it to one another.