
Facebook and Twitter ‘should use volunteer moderators’ says Wikipedia founder

Jimmy Wales says social media companies should follow the digital encyclopedia’s approach

Jimmy Wales said: ‘One of the solutions that these companies should be looking at is how do we put more decision-making power into the hands of the thoughtful people in our community.’ Photograph: Arnd Wiegmann/Reuters

First published on Thu 23 Sep 2021 19.00 BST

Facebook and Twitter should adopt Wikipedia’s approach to battling online abuse and misinformation by deploying thousands of volunteer moderators to monitor controversial posts, according to the digital encyclopedia’s founder.

Jimmy Wales said the scale of the problem facing social media companies was underlined when he had to personally ask Twitter’s chief executive, Jack Dorsey, to deal with a particularly vicious online troll, after the company’s initial response was to do nothing.

“If you are failing me, you are definitely failing a teenager who is being abused by someone [online],” he said.

Wikipedia, which launched in 2001, uses volunteer editors to oversee its entries, including a specialist medical group that moderates its health and medicine entries.

Speaking to MPs and peers on the joint committee on the draft online safety bill, which will place a duty of care on social media companies to protect users from harmful content, Wales said Wikipedia had up to 5,000 “extremely active” volunteer editors out of a total of about 80,000 active editors. Wikipedia receives around 2bn visits a month and its pages are edited every 1.9 seconds.

He said Wikipedia’s approach could work for social media platforms. Facebook and Twitter have been urged to better protect users from online abuse, with Twitter in particular facing criticism over racist posts directed at England footballers during this summer’s European football championship.

“I do think that one of the solutions that these companies should be looking at is how do we put more decision-making power into the hands of the thoughtful people in our community,” Wales said. “So if we recognise there is a problem with troll groups, well, who’s out there on the frontline fighting them … If it’s just the company and the company’s algorithms, well you are losing that battle, clearly.”

Wales admitted that giving “trusted users” the power to block and moderate fellow users posed problems, but said social media companies should consider devolving power over their platforms as a solution to abuse.

Meanwhile, Twitter on Thursday updated users on its plans to address abuse on the platform. The company said it was exploring “new ways” to filter unwanted words and emojis from replies, and would soon test a “heads up” feature that warns people when they are about to join a heated online conversation.

Twitter is also planning a feature that allows users to remove themselves quietly from a thread in which they are tagged.

Wales also said he was concerned that the bill focused too much on content and not on the algorithms at social media companies that turn a hateful user into someone with “5,000 or 10,000 followers”. “There is a concern about this bill’s focus on content rather than algorithms,” he said.

However, the Department for Culture, Media and Sport has argued that the draft bill already provides the communications regulator, Ofcom, with sufficient powers to scrutinise algorithms.

Facebook and Twitter declined to comment, but both companies have historically defended their approach to moderation in the face of criticism. Facebook employs 40,000 people in its global safety and security team, which includes moderators, while Twitter’s trust and safety council draws on advice from experts and NGOs.