Wikipedia has hundreds of millions of users worldwide. (Nelson Ching/BLOOMBERG)
Noam Cohen is the author of “The Know-It-Alls: The Rise of Silicon Valley as a Political Powerhouse and Social Wrecking Ball.”

Conspiracy videos are top draws on YouTube, whether the subject is the moon landing (humans never set foot on it) or the Parkland school massacre survivors (they’re “crisis actors”). Last month, Susan Wojcicki, YouTube’s chief executive, proposed an antidote to these poison pills: Wikipedia. For each fanciful conspiracy peddled on YouTube, she said, there soon would be a link to a corresponding article from Wikipedia.

Is this the same Wikipedia once ridiculed as the encyclopedia “anyone can edit” and later feared as a prime example of how the Internet was empowering a tyranny of the masses to overwhelm genuine expertise? Today’s Wikipedia is much more likely to be venerated as evidence of “the wisdom of crowds,” a less-than-shocking concept now that we have become accustomed to seeking out and relying on the contributions of relative strangers, whether that’s Amazon product reviews, Instagram photos, breaking-news updates from Twitter or Facebook eyewitnesses.

YouTube’s reliance on Wikipedia to set the record straight builds on the thinking of another fact-challenged platform, the Facebook social network, which announced last year that Wikipedia would help its users root out “fake news.” The Facebook plan, which began in the United States this past week, cleverly works on a meta level — Wikipedia won’t be used to rebut false news accounts, as YouTube envisions, but to give background information about the organizations spreading the news.

Clearly, Wikipedia has achieved a new trusted status. In a critical way, it swapped places with Google, YouTube’s parent, and now serves as the “good cop” of the Internet. An era overrun by false narratives fueled by cynical businesses and platforms seeking the greatest audience now asks the masses to come to the rescue.

Since Wikipedia was founded in 2001, only a few years after Google, the two organizations have been intertwined. When the creators of Wikipedia introduced their novel system for producing an online encyclopedia from scratch — especially the boldly ingenious idea of allowing anyone to create and edit the articles published on the site — many chose to look away. The one institution all in from the start was Google, which immediately recognized and respected the influence that Wikipedia was having on the Web.

This was Google’s singular insight as a new search engine. Instead of following the lead of other search engines, which would scan the Web for a term and count how often it appeared on a site to assess its relevance, Google relied on an algorithm trained to listen to the Web. If a site was widely linked to by other sites, the Google algorithm deemed it more relevant to a search and ranked it higher on its results than a site that was respected offline but largely ignored online. Wikipedia flourished while traditional, well-respected print publications, not least Encyclopedia Britannica, languished.

“We believe that we are, happily, in a positive feedback loop with Google,” Wikipedia co-founder Larry Sanger bragged on Jan. 16, 2002, when the site was exactly a year and a day old. Google sends traffic to the site, he explained, which increases the number of contributors, who write more articles for Google to link to. “All the while, no doubt in part due to links to our articles from Google, an increasing number of other websites link to Wikipedia, increasing the standing of Wikipedia pages in Google results.”

Back then, Wikipedia wasn’t the nonprofit organization it is today. It was part of Bomis, led by former commodities trader Jimmy Wales, one of a raft of Web 2.0 companies committed to harvesting the material produced by their users so they could surround it with advertising. Aided by Google, Wikipedia was growing spectacularly. A year in, Sanger announced that there were 20,000 articles. A fortune couldn’t be far behind.

Google, by contrast, was founded by two Stanford PhD students who opposed advertising on search engines — it was a form of corruption, in their view. In early 2002, the company was still resistant to accepting ads. The year before had been the first in which it turned a profit — a paltry $7 million.

But then the two organizations switched places. By 2004, Wikipedia swore off advertising completely after its community of volunteers threatened to take their contributions and create a separate site, something they were free to do because of the terms of the software used to create the articles. The project was converted into a nonprofit organization. Google, meanwhile, under pressure from investors to stop operating as a research lab, devised a plan for advertising that was immensely lucrative and managed, initially, not to intrude on search results.

Over the years, Wikipedia has been enormously helpful to Google because, in the words of Wales, “We make the Internet not suck.” The search engine believed in Wikipedia well before its articles were worthy of such belief, but as the encyclopedia has become increasingly reliable and trusted, Google has tied itself ever tighter to the billions of facts and figures published there. It extracts Wikipedia’s information for use on its own results pages, which is why it made a certain amount of sense when Wojcicki announced that YouTube would be doing something similar.

That Google, a company with a market capitalization of three-quarters of a trillion dollars, is enlisting a volunteer-created nonprofit organization as a bulwark for truth on its wildly popular video-sharing service is yet more evidence of how the Internet has become a hostage to the priorities of profit-obsessed, hyper-individualistic Silicon Valley companies. It’s almost as if YouTube has thrown up its hands at the notion of creating a civil, truth-respecting community, either because doing so is antithetical to its mission to maximize profit by maximizing use, or because it would require taking constructive steps rather than reacting to its vague policy against videos that contain “harmful or dangerous content, hateful content, threats.”

When Tim Berners-Lee conceived the Web, he imagined that it would look a lot like Wikipedia; that is, “a system in which sharing what you know or thought should be as easy as learning what someone else knew.” But the Berners-Lee vision of collaboration and self-publishing was left at the side of the road almost immediately with the viral popularity of the Mosaic browser, which was much easier to use and displayed images seamlessly. Collaborating to create new pages wouldn’t be integral to the Web experience, Berners-Lee wrote in his memoir; instead, the Web would grow to become “a medium in which a few published and most browsed.”

Although it is hard to argue today that the Internet lacks for self-expression, what with self-publishing tools such as Twitter, Facebook and, yes, YouTube at the ready, it still betrays its roots as a passive, non-collaborative medium. What you create with those easy-to-use publishing tools is automatically licensed for use by for-profit companies, which retain a copy, and the emphasis is on personal expression, not collaboration. There is no YouTube community, but rather a Wild West where harassment and fever-dream conspiracies use up much of the oxygen. (The woman who shot three people at YouTube’s headquarters before killing herself on Tuesday was a prolific producer of videos, including ones that accused YouTube of a conspiracy to censor her work and deny her advertising revenue.)

Wikipedia, with its millions of articles created by hundreds of thousands of editors, is the exception. In the past 15 years, Wikipedia has built a system of collaboration and governance that, although hardly perfect, has been robust enough to endure these polarized times. It’s easy to make fun of Wikipedia for its incessant arguments about the nationality of Frédéric Chopin or what to call the Sea of Japan. And its priorities can seem upside down, with more attention given to “The Simpsons” than Jane Austen. Women and minorities are underrepresented. There are cliques that can be hostile to new people interested in contributing.

But fundamentally, from my perspective as an occasional Wikipedia contributor and frequent Wikipedia reader, the project gets the big questions right. (Of the roughly dozen articles I created, none have been deleted; some remain the stubs I created and are read by fewer than 20 people a month; others have been vastly improved and are read by hundreds of people a month.)

Google isn’t wrong to trust it. The medical pages are closely supervised to ensure that incorrect information doesn’t sneak in. Thorny political arguments are aired out exhaustively on talk pages, and Wikipedia has the advantage of enlisting dedicated editors living far from the region consumed by particular debates to come up with compromises. Principles include assuming good faith on the part of other Wikipedia editors, and adhering to a higher standard for articles about living people. Using bots, Wikipedia also quickly removes vandalism. The idea is to make trolling so futile that the trolls leave.

In a cruel twist, Google’s enthusiasm for Wikipedia poses one of the biggest threats to its continued vitality. By extracting detailed summaries of Wikipedia articles that show up on search result pages, Google is depriving Wikipedia of its own potential visitors — the positive feedback loop of 2002 is being severed so that Google can sell more screen time. The Wikimedia Foundation, which administers the project from San Francisco, wasn’t given a heads-up about YouTube’s plan. “It’s not polite to treat Wikipedia like an endlessly renewable resource with infinite free labor,” said Phoebe Ayers, an MIT librarian and a longtime Wikipedia editor. “What’s the impact?”

Sad to say, the only challenge bigger for an idealistic noncommercial project than being dismissed and ignored by the rapacious tech monopolies may be being noticed and respected by them.
