U.K. media regulator Ofcom has introduced new measures to police harmful video content across several platforms, including Twitch and TikTok. Platforms that fail to comply with the new guidelines face fines and, potentially, suspension of their services.
Twitch will be expected to take “appropriate measures” to protect audiences from material related to racism, terrorism, and child sexual abuse, according to a BBC report. To comply with the new rules, platforms will have to set clear guidelines for content creators, enforce them properly, and make their reporting systems easy to use. Sites hosting adult content will also be required to implement age verification.
The new regulations come amid a reported rise in abusive content. The Internet Watch Foundation has recorded a 77-percent increase in abusive content since 2020, and Ofcom says a third of users on video-sharing sites like Twitch have encountered hateful content of some kind.
Twitch has long been criticized for the prevalence of abusive content on its site and the hateful conduct of some of its users. The company’s transparency report, updated earlier this year, revealed that fewer than 15 percent of user reports in 2020 led to enforcement action.
Twitch has recently taken steps to curb harassment, including phone verification for chat to stop banned users from evading bans with new accounts. Many of these measures followed the #ADayOffTwitch protest, a creator-led movement that raised awareness of harassment on the platform and demanded solutions.
The new Ofcom regulations will impact Twitch, TikTok, Snapchat, and Vimeo.