Valorant Team Commits to "Harsher Punishments" for Toxic Players

Valorant's developers are about to get tougher on players who violate the game's community guidelines and policies, Riot announced this week. A lengthy post on the game's site detailed the plan: "harsher punishments" from the existing systems that moderate conduct, plus a new system in which "Player Support agents" review incoming reports. A voice chat moderation beta, first announced last year, will also arrive at some point this year, Riot said.

The whole Valorant post is worth reading if you're a player with questions or concerns about how others conduct themselves in the game, but the biggest takeaway is the commitment to harsher punishments. Riot said some of its penalties have been more "conservative" up to now, but that's changing.

"Generally harsher punishments for existing systems: For some of the existing systems today to detect and moderate toxicity, we've spent some time at a more 'conservative' level while we gathered data (to make sure we weren't detecting incorrectly)," Riot said. "We feel a lot more confident in these detections, so we'll begin to gradually increase the severity and escalation of these penalties. It should result in quicker treatment of bad actors."

The Player Support system mentioned above has already been rolled out in Turkey and could expand to other regions if it produces promising results.

"The long and short of it is to create a reporting line with Player Support agents—who will oversee incoming reports strictly dedicated to player behavior—and take action based on established guidelines," Riot said. "Consider this very beta, but if it shows enough promise, a version of it could potentially spread to other regions."

As for the voice chat moderation discussed last year, the plan is to launch it in beta form this year. Under that system, voice chats would be recorded and reviewed by Riot when someone is reported for harassment or other violations that occurred over voice chat.

"As of now, we are targeting a beta launch of the voice evaluation system in North America/English-only later this year to start, then we will move into a more global solution once we feel like we've got the tech in a good place to broaden those horizons," Riot said. "Please note that this will be an initial attempt at piloting a new idea leveraging brand new tech that is being developed, so the feature may take some time to bake and become an effective tool to use in our arsenal."

The full post about the Valorant team's latest efforts against toxicity can be seen here.