Streamers Face Bigger Problems Than We Know, and Their Platforms Need to Help

How platforms like Twitch and YouTube can better protect streamers and their mods.

Like a lot of streamers, Ceddy relies on his mod team to protect him and his viewers from online harassment: trolls, haters and detractors. Ceddy, who goes by his first name online, described how commenters on his Twitch streams remind him, almost daily, that he’s LGBTQIA+, and not in a kind way. As a prominent Malaysian creator, he and his mods also see frequent racist comments alongside the homophobic ones, and his team does its best to stamp out the bigoted language. But that task largely falls to Ceddy and his mods, rather than the platform they work on. The work can be emotionally draining, and because his mods are critical to blocking toxic chat and spend so much time with him, Ceddy thinks of his six-person team as family. He often takes them out for dinner or gives them free games.

“It’s the least I can do,” Ceddy said.

Ceddy, who has just under 18,000 Twitch followers, says those perks are what he can offer his mods in lieu of an actual salary, and he’s far from the only creator who can’t pay his team a proper wage. According to interviews with five streamers and a survey of another dozen people in the industry, the vast majority of moderators, the users who delete comments from streams, drop helpful links and keep chats active, are not paid. Some mods are also under the age of 18, skirting the edge of child labor rules. So, in an industry that keeps growing, especially after a pandemic year that drove record numbers of both viewers and streamers, who should pay these mods, and who should be held accountable when they face potentially traumatizing work day after day?

Streamers or Platforms: Where Can Change Be Made?

Mods work with and are chosen directly by streamers, but it’s hard to put the onus of answering those questions entirely on streamers, many of whom can’t stream full time and barely make ends meet despite streaming long, odd hours. Instead, some streamers and moderators say platforms like Twitch, YouTube and Discord could provide more resources, such as human mod teams for special events, better API support and in-platform mental health education, to better protect marginalized creators and their teams. The streamers we spoke to acknowledge that extra support, human or bot, would take a lot of work and fundamental platform changes. When asked for comment, Twitch pointed to online mental health resources and its terms of service. But as the streaming industry matures and makes new careers possible, forward-thinking ideas are needed to address creators’ daily problems and keep those problems from escalating.

“Offering free tools that can help moderators in ways streamers may not have considered could be beneficial to communities across the board,” said Jeff Brutlag, an LGBTQ+ streamer and writer with 9,000 Twitch followers.

On gaming channels, the moderator selection process usually happens in a couple of different ways. Some creators take applications; others work with mods who have been part of their community for a while and are considered trusted friends. KittyPlays, who has just over 1 million Twitch followers and more than 650,000 YouTube subscribers, told IGN the latter is her approach, and that she also makes long-term community members moderators as a reward. She does not pay all of her mods, but does compensate those who also edit videos for her.

“Mods are typically active community members who have been part of my community for a long time and built trust with me,” KittyPlays said in an email statement. “I also mod community members who have been subbed for 69 months as a reward. It’s wild that I have people in my community who have hit that benchmark.”

Of the dozen moderators and streamers who filled out an anonymous survey, only two said they are paid as mods or pay their moderators. Other streamers follow compensation structures like KittyPlays’, paying only moderators who also edit videos. Those surveyed represent a small slice of the 9.7 million monthly unique creators on Twitch and the 40 million active YouTube gaming channels, but they show how these creators run the gamut in audience size and still need mod help. The creators and moderators interviewed had audiences ranging from four Twitch followers to more than one million.
"I would need to be making a ton of money to pay someone to moderate for me."

Brutlag said they talk to other streamers in their circle and aren’t aware of any who pay their moderators. They’ve heard of streamers who consistently draw thousands of viewers per stream and can pay their moderators, but the general consensus among the creators they speak with is that it’s not normally possible.

Tanya DePass, better known as Cypheroftyr, a streamer with around 17,500 Twitch followers, did offer benchmarks and educated guesses, based on her streaming history, as to how many subscribers it would take to sustain a paid mod team. She says it would likely take around 1,500 to 2,000 subscribers to make ends meet and stream full time, and around 3,500 paying Twitch subscribers to pay a mod team a livable wage of $15 to $20 per hour. DePass noted that because she doesn’t have children or pets, other streamers’ situations could be completely different and might require a vastly larger subscriber base to cover living costs and healthcare. She also lives in the Midwest, where the cost of living can be lower than in cities like New York or Los Angeles.
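For a rough sense of that math, here’s our own back-of-envelope sketch, not DePass’s calculation, assuming Twitch’s $4.99 base subscription and the standard revenue split that leaves streamers roughly half:

```python
# Back-of-envelope sketch of DePass's estimate (our assumptions, not her figures).
BASE_SUB_PRICE = 4.99   # Twitch's base subscription tier, per month
STREAMER_SHARE = 0.50   # assumed standard 50/50 revenue split

def monthly_sub_revenue(subs: int) -> float:
    """Gross monthly payout from paid subscriptions alone."""
    return subs * BASE_SUB_PRICE * STREAMER_SHARE

revenue = monthly_sub_revenue(3500)   # ~$8,732/month at DePass's benchmark
mod_hours = revenue / 17.50           # hours of mod pay at the $15-$20/hr midpoint
print(f"${revenue:,.0f}/month covers about {mod_hours:,.0f} paid mod hours")
# Roughly 500 mod hours per month, before the streamer pays themselves,
# taxes, or healthcare - which is why the threshold remains a guess.
```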

The lack of industry benchmarks, and the guesswork that results, is unfortunately a running theme among the creators IGN spoke with. It’s hard to say exactly where the threshold for being able to compensate hard-working moderators begins, and it likely depends on other factors in streamers’ lives. So much of it ends up as guesswork because the platforms themselves offer no clear plan or path for turning streaming into a full-time business. Almost every survey respondent said they’d need far more money and paying subscribers to afford to pay moderators.

“I would need to be making a ton of money to pay someone to moderate for me,” one respondent said anonymously.

Appropriately Aged Moderators

Of those unpaid mod teams, some members are minors. Some creators, like Ceddy and Brutlag, say they don’t work with mods under 18 because of mature language or content. Of the streamers and moderators surveyed, two said they either are under 18 or have worked with mods who are. Of the two other moderators interviewed for this story, one is a minor.

As with KittyPlays and Ceddy’s teams, many moderators join communities because they enjoy the work and time spent with the followers of a content creator. And no matter what, dealing with hateful and troubling comments is unfortunately a common occurrence. Justin Moore, an LGBTQ+ streamer with 20,000 followers on Twitch, says he’s so used to extreme hate he’s desensitized to it. This is also part of the reason he prefers to work with moderators over the age of 18.

“You get to a point where you have a thick skin, but what concerns me is the safety of my community,” Moore said.
"You get to a point where you have a thick skin, but what concerns me is the safety of my community."

In addition to horrific comments, Ceddy says it’s also common for viewers to express struggles with mental health that feel extreme, even announcing suicidal ideations in live chats. Ceddy offered one example in which a viewer shared a plan to take their own life, including a description of how they would do it. Moore and Ceddy both direct viewers to mental health resources because they aren’t trained professionals, but that doesn’t negate the impact on moderators, viewers or creators who’ve already seen the content.

When the work of dealing with extreme hate and mental health crises falls to moderators who may be under 18, the stakes rise. Psychotherapist Haley Neidich, LCSW, explained that a moderator’s mental health must come first. She said that while the pandemic and online schooling make it impossible to hold kids to normal screen-time expectations, there are still negative consequences to spending too much time online. Neidich acknowledges the internet’s importance for school, grades, and personal relationships and friendships, but says kids should not be serving as moderators for streamers.

“I would call upon creators and game streamers to take responsibility for making sure their moderators are over 18,” Neidich says. “This is something that I've seen contributing to depression and insomnia in several adolescent clients.”

She says minors can still participate in communities they love more casually by showing up to streams and hanging out in chat. However, she warns that teens who are regularly subjected to the kind of emotional labor required of moderators are more likely to develop mental health problems. If teens are already moderators, Neidich says they should try to limit their activity to one hour per week and talk to adults in their lives to process their experiences, especially when hateful language is used. If a teen ever feels uncomfortable while working with a streamer, they should stop and talk to a trusted adult right away. Neidich says it’s also important to remember that online hate can negatively impact moderators at any age and says there’s a second pandemic happening around the globe.

“Kids are relying on screens for education, social contact, family contact and games, but the data shows that kids who spend more time online are at an increased risk for developing mental health issues and substance abuse problems,” Neidich said. “There is a mental health [pandemic] happening around the world right now.”

How Platforms Can Better Protect Creators

If streamers aren’t making enough money to compensate moderators, and the mental health side effects can be serious for moderators of all ages, especially those in marginalized communities, how can those moderators be protected? Some creators say the major streaming platforms aren’t keeping pace with the industry. They want more resources that support smaller creators, protect them from hate, and make moderation easier.

Brian Gray, a Black LGBTQ+ streamer known as urbanbohemian, says Twitch couldn’t have known it would morph into a broadcasting platform where creators share cooking shows, cosplay, live music and more, but that it’s important for the platform to adapt to what it’s become.

DePass, a Black streamer who experiences hateful comments and racism during streams, says being featured on the Twitch homepage can cause additional harassment for marginalized creators. DePass has been featured in the “hero slot,” or one of the main Twitch homepage slots, for various celebrations, like a previous Black History Month. And while the hero slot naturally brings more viewers, the end result isn’t always positive.

“You have to use every tool Twitch gives you and then some extras to keep yourself safe,” DePass said.

DePass says Twitch gives featured creators a sort of one-page guide to help moderate the hundreds, or sometimes thousands, of additional viewers. But she told IGN it mostly feels like it isn’t enough, and that marginalized streamers carry the bulk of the burden. Gray echoed that sentiment, saying it’s important to acknowledge what Twitch is doing to support creators, but that it will take creativity and forward thinking to solve the big problems.

“I am giving them a little bit of credit, but that doesn’t excuse them from not trying,” Gray said. “From not hiring futurists and visionaries, asking themselves where they want to go. They can’t rely on us to define where their content goes.”
"You have to use every tool Twitch gives you and then some extras to keep yourself safe."

Gray says streaming is rarely a full-time job and is more like freelancing or independent contracting, recalling the way ride-sharing tech companies were forced to redefine their relationships with drivers. In October 2020, a California appeals court ruled that Uber and Lyft must classify drivers as employees, not independent contractors, entitling them to benefits like paid overtime and health insurance. But Proposition 22, a ballot measure passed that November, all but nullified the win, exempting Uber and Lyft from the law.

“The passage of Proposition 22 is a huge win for gig economy companies that spent more than $200 million to aid its passage,” The Hill reported at the time. It was an alarming sum spent fighting, rather than funding, stable benefits and pay for workers.

In discussing the ways platforms could take more responsibility for the jobs moderators and streamers do, some said there are tools that could be added to make streamers’ lives easier. In the end, Gray explained that it comes down to platforms sharing the responsibility with streamers, who currently have to protect and manage every aspect of their streams and live chats.

“A lot falls back on the streamer to protect themselves, to manage themselves,” Gray said. “I do feel like some of that responsibility should be shared by the platforms. No one tool is enough right now.”

Gray wasn’t alone in that belief. The creators and moderators we spoke to mentioned three main kinds of resources platforms could more frequently provide: training and education, expanded chat bot support, and human moderator teams for special occasions.

While platforms like Discord do offer a moderator academy that outlines best practices, Gray noted this type of information is often hard to find online and should be more visible. Moderator Aerin Night, meanwhile, outlined how the other two moderator solutions might be put into practice.

Night said Twitch and similar platforms could provide moderator teams for big or important partnerships, such as when a large company wants to host a special stream. In-house teams would mean the partner company wouldn’t have to hire, train and prepare an entire team, alleviating some of the stress on them while also ensuring the moderators have been trained and understand what types of comments they may have to contend with. Night also said platforms must be more responsible about featuring marginalized creators on their home page, because they aren’t prepared for the target it puts on their backs.

“Smaller streamers who end up in featured positions are often ill-equipped to deal with the massive influx of new viewers,” Night said. “We saw this recently with white supremacists flooding the chat of Black creators who were featured.”

Night said giving these streamers additional human mods while they’re on the front page could significantly ease the stress of a suddenly larger audience. She also explained that a well-maintained API, or application programming interface, is the more scalable option, because developers and creators are already building their own chat and moderation bots, like Nightbot. Nightbot, popular on both YouTube and Twitch, automatically deletes certain phrases, drops links when prompted and generally makes moderation smoother. A platform’s API is what lets outside developers build tools like these on top of the platform, almost like a mod. Night says that right now, Twitch’s API is occasionally broken and could do more if it were expanded.

“Twitch and YouTube will always be too big and too slow to try all the ideas that fly around their platforms,” Night said. “Often the community figures out what it wants and what will work weeks to years before Twitch will, and [will] build that proactively—if they have access to an API.”
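To make the idea concrete, here is a minimal sketch of the kind of community-built tool Night describes: a Python bot that connects to Twitch’s publicly documented IRC chat interface and flags messages containing blocked phrases. The token, bot name, channel and blocklist below are placeholders, and a real tool like Nightbot layers far more on top (timeouts, link permits, spam heuristics); this is an illustration, not Nightbot’s actual code.

```python
# Minimal sketch of a Twitch chat moderation bot (illustrative only).
# Twitch exposes chat over IRC at irc.chat.twitch.tv:6667; the token,
# bot name, and channel below are placeholders.
import socket

HOST, PORT = "irc.chat.twitch.tv", 6667
TOKEN = "oauth:your_token_here"    # placeholder OAuth token
NICK = "examplebot"                # placeholder bot account name
CHANNEL = "#examplechannel"        # placeholder channel to join
BLOCKLIST = {"blockedphrase1", "blockedphrase2"}  # phrases to flag

sock = socket.socket()
sock.connect((HOST, PORT))
sock.send(f"PASS {TOKEN}\r\nNICK {NICK}\r\nJOIN {CHANNEL}\r\n".encode())

while True:
    data = sock.recv(2048).decode("utf-8", errors="ignore")
    if data.startswith("PING"):          # answer keepalives or get disconnected
        sock.send("PONG :tmi.twitch.tv\r\n".encode())
    elif "PRIVMSG" in data:              # a chat message arrived
        user = data.split("!", 1)[0].lstrip(":")
        message = data.split(":", 2)[-1].strip().lower()
        if any(phrase in message for phrase in BLOCKLIST):
            # A real bot would remove the message or time the user out via
            # Twitch's moderation tooling; here we just log the hit.
            print(f"flagged message from {user}: {message}")
```

Everything this toy bot does depends on the platform exposing chat programmatically, which is Night’s point: a limited or broken API bottlenecks the tools communities can build for themselves.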

Representatives from YouTube and Discord were not able to comment.

Twitch, meanwhile, said in a statement that user safety is a top priority and highlighted mental health resources the platform created for streamers and moderators. Twitch encourages users to report problems as they arise and makes its community guidelines for reporting public.

“Harassment of any kind is unacceptable and has no place on Twitch, and we take steps to enforce against this type of behavior when it is reported to us,” a Twitch spokesperson said.

And though the platform owners may be taking steps, our survey of moderators and streamers points to a situation where those steps aren’t coming quickly enough, or sometimes at all.

“I do things for my mods if I can, but there is no established structure for a streamer to compensate them,” Gray said.

The streaming industry isn’t currently set up to pay moderators, some of whom put many hours of volunteer work into keeping online communities safe and active. Some are also put in harm’s way daily, and according to those we spoke with, the tools currently available don’t always do enough to protect them. There are at least a few ways platforms could take some of the burden off creators, supporting their teams and offering more free resources that protect marginalized people. Until then, many streamers seem at a loss for how to solve problems that sit at the foundation of the work they produce.