Gaming the system: How moderation tools can backfire

Personally, I blame WarGames.

That 1983 movie firmly planted the idea of the curious hacker into the world consciousness at a time when personal computers were still new. Think back to Matthew Broderick, the poor guy, innocently mucking about in military computers. Because they were there. Almost causing Global Thermonuclear War.

When I got my first email account in 1991, I did the same thing in the green and black terminals of UC Santa Cruz. I typed in random commands, poking around the edges, looking for secrets. I never started a war, but I did manage to have playful anonymous conversations with random people until my interest in the game petered out.

The point is, mischievous curiosity is an element of any online interaction, even in this era, when computers are less mysterious than they used to be.


In Design for Community, I spent a lot of time talking about Slashdot's model of community moderation. It's a marvel of geek ingenuity, one which is usually very successful. Users can rate each other's contributions, earn "karma points" (the number that measures a user's contributions to the site), and filter each other into oblivion.

Since the book was written, Slashdot has added still more inventive features. A simple but effective reputation management system is in operation now, allowing members to list each other as a friend or foe. The ratings are even public, so you can see whose list you're on. And even better, you can apply a filter to your lists, rating all your friends' comments up, and your foes' down. It's now possible to make it so you never have to see a particular user's posts again: just list them as a foe, and set all foe posts to "-5." In a few clicks, they'll be off your radar forever.
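To make the mechanics concrete, here's a rough sketch in Python of how a friend/foe modifier like that might work. The names and numbers are mine, for illustration only, not Slashdot's actual code:

    # Hypothetical friend/foe modifier, not Slashdot's actual code.
    FRIEND_BONUS = 1     # nudge friends' comments up
    FOE_PENALTY = -5     # push foes' comments below any sensible filter

    def effective_score(moderated_score, author, friends, foes):
        """Return the score I see for a comment, after my personal lists are applied."""
        if author in foes:
            return moderated_score + FOE_PENALTY
        if author in friends:
            return moderated_score + FRIEND_BONUS
        return moderated_score

    # Even a comment moderated up to 3 vanishes behind a filter set at 1
    # once its author is on my foes list:
    # effective_score(3, "troll42", friends=set(), foes={"troll42"})  ->  -2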

Problem is, all this groovy functionality adds several layers of new interface elements. Every filter, rating, and setting means adding another dropdown, checkbox, or submit button. It's easy to see a future, not very far away, when the site grows so interface-heavy it will scare off all but the most determined new users. While that might not slow down the rabid Slashdotters, it would certainly impede a new site with a fragile audience.

Worse, sometimes all the widgets backfire altogether, encouraging the very behavior they're designed to avert. Sometimes all the rules have a dangerous side-effect: they create a game.


Here's how it works. Every Slashdot member has a karma number. When that member does something that benefits the site (submitting a story that is accepted, posting a comment that gets moderated up, moderating other comments), the user's karma level increases. When that user does something that the site administrators want to discourage (posting something that gets moderated down, making moderations that are deemed unfair), the user's karma level decreases.
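The bookkeeping is simple enough to sketch. Something like this, with the point values invented for illustration rather than taken from Slashcode:

    # Illustrative karma bookkeeping; the point values are invented, not Slashcode's.
    KARMA_EVENTS = {
        "story_accepted": +3,       # submitted a story the editors ran
        "comment_moderated_up": +1,
        "comment_moderated_down": -1,
        "unfair_moderation": -1,    # meta-moderators disagreed with your call
    }

    def apply_event(karma, event):
        """Return a member's new karma after one of the events above."""
        return karma + KARMA_EVENTS.get(event, 0)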

What do karma points matter? Not much, really. A user with a high karma level is more likely to be asked to moderate, but that's no prize. Members with a high karma level may also begin to post comments that start out with a high rating, which might mean that their comments will be seen more. But, in reality, if you post something thoughtful, chances are it will be moderated up anyway. Like I said, that part of the system works well.

So why do people care about their karma level at all? Easy. Slashdot asked them to.


When Slashdot's moderation system was originally introduced, karma levels were invisible. Everybody had one, and it influenced how the site interacted with you, but you could never know exactly how many karma points you had. Of course, once the Slashdotters heard about this mysterious karma number, everyone clamored to know what it was. It was just human nature. Soon the karma levels were disclosed to each user, with a caveat: only you could know how many points you had. You couldn't find out anyone else's.

But that was enough. Once you give a person a few points, they start to keep score. And that's when it becomes a game.

Soon members were actively trying to get more points. It wasn't even that hard to do: post something negative about Microsoft or positive about open source and watch the karma points roll in! A new term was quickly coined for the people who did this: "karma whore."

The Slashcode-powered site Plastic (discussed in Chapter 3) took this game to its logical extreme last year: a karma contest. They gave away an Amazon gift certificate to the member who earned the most karma points in a week. The competition was fierce, with some members gaining more than 100 karma points in seven days.


At this point you may be thinking: "So? People are participating! Isn't that the idea?" And you're right, gaming the system can increase participation dramatically. But it's imperative to look at why people participate, as well as how much they do.

Let's talk about social capital. That's the buzzword for what we gain when we participate in community spaces. Usually, when I participate in a community, I earn something intangible, yet valuable. So what happens when I also earn something tangible? How does that change the process?

Like they say in the lawyer shows: It goes to motive. When I post to a community area, and get nothing more than social capital for doing so, you can usually assume that my motivation is primarily altruistic. I'm participating because I want to help the conversation along with a fact I know, or a viewpoint that was underrepresented, or just to propel the conversation forward. In short, why would I lie (or pander or obfuscate, etc.) when I have nothing to gain?

Sure, there can be darker motives – ego gratification, role playing, sadistic trolling, mommy didn't love me boo-hoo – but all of those are part of human nature and can't be helped. My question is: Why add another one when you don't have to?

When you introduce a new reward, even one that seems as meaningless as an abstract number, you create a new motivation for participation. That motivator corrupts the process, especially when it's money.

When people shoot for the easy wins to earn karma points, it narrows the field of discussion and discourages minority views from being expressed. Members will self-censor, afraid of saying something that will be moderated down. (While the Slashdot ratings are supposed to be about quality, we're all subjective creatures and there's no doubt that unpopular viewpoints are likely to be rated down. If you don't believe me, try defending Microsoft there.) Members now have an incentive to post fast and first, instead of taking their time and composing a thoughtful comment. All of which has a negative impact on the quality of conversation on the site.

When the noise levels go up, so do all the filters. Set yours at three or four and you'll be safe from the trolling. But you can bet the trolls surf the site without a filter, and in their view, they're everywhere. The overall effect is to create a sub-community that willfully and knowingly violates the rules of the site without any repercussions.
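The filter itself is trivial, which is part of the problem. A minimal sketch, assuming each comment carries its moderated score:

    # The whole filter, in miniature: comments is a list of (score, text) pairs.
    def visible_comments(comments, threshold):
        """Keep only the comments at or above the reader's chosen threshold."""
        return [text for score, text in comments if score >= threshold]

    thread = [(-1, "first post!!"), (4, "a thoughtful reply"), (0, "flamebait")]
    visible_comments(thread, 3)    # -> ["a thoughtful reply"]
    visible_comments(thread, -1)   # -> all three: the trolls' view of the site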

I assert that any time you manufacture a reason for people to participate in a community area, you corrupt the usual altruistic motivation for participation, in turn corrupting the content itself. Instead of trading in social capital, members begin to trade in a fake currency with no real bearing on the content of the community. When that happens, you invite your members to treat the community like a game.

If that's what you're going for, then I encourage you to game away! But if your goal is thoughtful, positive conversations, beware of adopting the qualities of a game.


The other problem with the Slashdot moderation model is endemic to the rating/filtering system itself. The ability for members to filter out nonsense posts has an interesting side-effect: it lessens the negative public reaction. When you remove this social feedback loop, you actually increase the troublemaker's ability to post nonsense with impunity. The result is a filtering system that encourages the very behavior it was designed to prevent.

It's arguable whether this is a problem at all. Most people will not be aware of the increase in nonsense, since it'll be filtered out of their view. But for those who want to view everything, it's a real problem. It allows a behavior to flourish that's counter to the site's mission. It creates a sub-community that's able to hide in the light, with a total disregard for the rules of the site. This results in more and more users who are only there to make trouble, poking around the corners to find a hole in the system. Inevitably, they will.

This is all your fault, Matthew Broderick.

For an alternate approach, consider MetaFilter (discussed in Chapter 2). With thousands of members and millions of page views a day, it's certainly an incredibly active community. And yet the site has no ratings or filters. Hell, the pages don't even paginate! If a thread hits 300 comments, then that's exactly what you get: One page with 300 comments, posted first to last.

What do users do with no reputation management system? They handle it the old-fashioned way: socially. Post a link to MetaFilter that's been posted before and be prepared for a schoolyard taunt. Make a dumb comment in a thread and the next post is going to tell you so. And there's no filtering any of it away – the posts just collect, one after another, for all to see.

MetaFilter's creator, Matt Haughey, jokes that the site works on a system of "public shaming." And he's right – make a mistake once and you're unlikely to do it again. Instead of your errant post getting silently filtered away, it's up to the community to reinforce its own norms. It can be a brutal process, but not any more brutal than the accusations of karma whoring on Slashdot, and it comes with much less interface baggage.


For another take on the same problem, look at Slashdot's kid sister, Kuro5hin (K5 for short). K5 was built from scratch by Rusty Foster, out of a desire to make a Slashdot that works better. Instead of karma, K5 has mojo, which is used to determine "trusted users" who get special privileges. But unlike Slashdot, K5ers never know their mojo level, though they do know when they become trusted. You can rate posts, as on Slashdot, but not filter them, so everything that gets posted stays visible, as on MetaFilter.
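I don't know Rusty's actual formula, but the shape of the idea is easy to sketch: average the ratings on a member's recent comments into a hidden number, and only ever reveal the yes-or-no result. The threshold here is invented:

    # A rough guess at the shape of a hidden-reputation check, not K5's real formula.
    TRUSTED_THRESHOLD = 3.5   # invented number

    def is_trusted(recent_comment_ratings):
        """Mojo stays hidden; the member only ever learns the yes-or-no answer."""
        if not recent_comment_ratings:
            return False
        mojo = sum(recent_comment_ratings) / len(recent_comment_ratings)
        return mojo >= TRUSTED_THRESHOLD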

None of these systems are perfect, of course. MetaFilter's approach means that the system does little to reinforce the site's rules. It's left to the community to self-police, and that's a lot to expect from a group of people, especially if you haven't really reached critical mass. Not every community is going to be willing or able to manage it. And Slashdot's system may backfire, but it's also largely successful. Take a spin through the site with a filter set at zero and see how long you can last. The combination of moderator ratings and member filters usually separates the wheat from the chaff pretty well.

Still, it's important to remember this essential truth: Any complicated moderation system that makes its algorithms public is eventually going to fall victim to gaming. So my advice is, if you're going to use a community moderation system, make it as invisible as possible. No karma numbers, no contests, no bribes. Rely on social capital and quality content to get your community talking, and develop a system that helps you moderate without a lot of fanfare. The bottom line is, if you take away the scores, it's hard to play the game.

And like we learned at the end of WarGames: The only way to win the game is not to play at all.

