Twitch is updating its appeals and reporting processes, following through on product updates the company promised late last year. The biggest change is the introduction of a new portal where users facing account suspensions can appeal those rulings and monitor the progress of their requests.
While Twitch deals with its fair share of users with bad intentions, the company wants to provide more clarity and consistency for users who accidentally break the rules while livestreaming.
The new appeals hub allows users to view enforcement actions eligible for appeal starting today at appeals.twitch.tv. The portal will usually be limited to bans enforced within the last 60 days, but the company says that anyone facing an indefinite suspension that went into effect before that window will also be able to request a review of an eligible case there. The new appeals tool will also appear in the profile menu under the “safety” section, where it will remain available even for users who are currently suspended.
Users appealing Twitch’s moderation decisions will still receive an update over email when an appeal is accepted or rejected. The company notes that those decisions will now be accompanied by more detail on its thinking, though that still won’t include how many “strikes” an account has because enforcement decisions take context and severity — not just the quantity of violations — into account. Twitch also said that it still plans to attach relevant video clips to its emailed enforcement notices, but that particular feature is still in the works.
“Sometimes we get it wrong, which is why the appeals process is so important,” Twitch wrote in the announcement. “We’ve heard that our current system is slow, and that it doesn’t provide enough insight about how your current appeal is going or how past appeals have gone. This is particularly important for creators, who earn income from streaming for their communities.”
Twitch is also making updates to its reporting system, inviting users to search for the reason they are flagging content and offering customized menus depending on whether reported content appears in a livestream, a VOD, or a clip. The changes will appear first on the web version of Twitch, rolling out to mobile afterward, with a global rollout complete within the “next few months,” according to the company.
While all major social platforms continue to grapple with the complexity of content moderation at scale, the fact that most content on Twitch is streamed live adds an additional layer of challenge. Twitch relies less heavily on automated content moderation systems than some of its peers, leaning instead on human review teams that prioritize speed due to the real-time nature of the vast majority of its content.
Twitch VP of Global Trust and Safety Angela Hession previewed today’s updates late last year in a blog post that reviewed the platform’s content moderation policies and safety tools. Twitch viewership exploded during the pandemic and the company has been scaling up its content moderation efforts accordingly.
Though it declined to provide specific numbers to TechCrunch, Twitch says it has “quadrupled” the number of moderators reviewing user reports over the last two years. The company notes that it responds to more than 80 percent of reports within ten minutes.
“As our community has grown and become more global over the past few years, we’ve continued to scale and streamline our operations accordingly, while still prioritizing keeping a human in the loop in all aspects of moderation,” Twitch VP of Global Safety Operations Rob Lewington said.
Beyond its everyday enforcement decisions, the company attracts a lot of attention for enacting temporary suspensions against some of its most popular streamers. In December, political streamer HasanAbi was hit with a seven-day ban after using the word “cracker,” which Twitch apparently considers to be a legitimate anti-white slur. Other star streamers have been banned for everything from sexy yoga poses to streaming Avatar: The Last Airbender.
Meanwhile, Twitch continues to wrestle with hate raids, the practice of flooding a streamer’s channel with harassment en masse. Those attacks often target marginalized Twitch streamers, further driving those voices off a platform where Black, LGBTQ and female streamers already struggle to gain a foothold.
In November, Twitch introduced a new automated tool that detects accounts trying to get around channel-level bans. Channel bans are one of the main ways that Twitch streamers and moderators can control who can interact within a community, but ban evaders render that otherwise powerful option ineffectual.
Late last year, Twitch took the extraordinary step of suing two “highly motivated” users for organizing thousands of bot accounts into hate raids, though the company did not have their real identities at the time. “This Complaint is by no means the only action we’ve taken to address targeted attacks, nor will it be the last,” Twitch wrote in the suit.