The VALORANT team at Riot Games is pleased with the early success of its new chat moderation feature and is now rolling out the system to the worldwide player base in an effort to quell in-game toxicity.
The devs announced today that they are beginning the “global rollout” of the “Real-Time Text Evaluation and Intervention” system after a lengthy test period conducted solely in North America. Following that test run, the team is confident the system will succeed in other regions and across other languages, and it will begin reaching those regions via a mid-patch update followed by a staggered release.
The new system automatically detects “disruptive” text chat messages and instantly mutes both the text chat and voice chat of the players sending them. The muted players will likely not even know they have been muted, while the teammates they were disrupting can keep playing without the chat annoyance.
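In broad strokes, a system like this can be modeled as a per-match filter that scores each outgoing message and silently mutes a sender who crosses a threshold. The sketch below is purely illustrative: every name, the threshold, and the keyword-based scorer are assumptions for demonstration, not Riot’s actual implementation.

```python
# Illustrative sketch only: hypothetical names, not Riot's real system.
from dataclasses import dataclass, field

DISRUPTIVE_THRESHOLD = 0.85  # assumed cutoff for the text-evaluation score


@dataclass
class MatchChatState:
    """Tracks which players have been silently muted for the rest of a match."""
    muted_players: set[str] = field(default_factory=set)

    def is_muted(self, player_id: str) -> bool:
        return player_id in self.muted_players


def evaluate_message(text: str) -> float:
    """Placeholder for a real-time toxicity classifier.

    A production system would call a trained model; this dummy keyword
    check just keeps the sketch self-contained and runnable.
    """
    blocked_terms = {"uninstall", "trash", "report this bot"}
    return 1.0 if any(term in text.lower() for term in blocked_terms) else 0.0


def handle_chat_message(state: MatchChatState, player_id: str, text: str) -> bool:
    """Return True if the message should be delivered to teammates."""
    if state.is_muted(player_id):
        # Sender is already silenced; drop the message without any notice.
        return False

    if evaluate_message(text) >= DISRUPTIVE_THRESHOLD:
        # Mute text *and* voice for the sender. No notification is sent,
        # so the muted player may not realize anything changed.
        state.muted_players.add(player_id)
        return False

    return True
```

The key design point the sketch mirrors is that enforcement is silent and one-sided: the offending player’s messages simply stop reaching teammates, rather than triggering a visible warning.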
This new change is part of a long-standing effort by Riot to reduce toxicity across all of its titles. Riot first started recording voice comms to detect inappropriate language and harassment back in April 2021.
But the VALORANT player base is looking for more than just improved detection methods. After Riot promised stricter penalties for League of Legends griefers in May, VALORANT players asked for the same. Those promised League penalties include disabling the ranked queue, placing repeat offenders in separate queues, and completely disabling all game modes for the worst offenders.
Voice chat continues to be a problem in VALORANT, with many players feeling that the number of verified reports isn’t enough to properly curb toxicity. There are also still instances of throwing and sabotaging, griefing without communication, and going AFK. Still, this is a welcome step forward in VALORANT’s relatively young lifespan.