Call of Duty Implements AI To Moderate Toxic Voice Chat

AI has begun policing Call of Duty's in-game voice chat.

Call of Duty is implementing AI to monitor and moderate in-game voice chat. The series is notorious for its toxic lobbies; it's more or less part of the game's brand and reputation, whether Activision wanted it that way or not. With such a massive fan base spanning a wide range of ages in a highly competitive game, things are bound to get heated. Over the last few years, Activision has tried to address this by enforcing stricter moderation standards. Now it's taking a much bigger step to crack down on toxic voice chat.

This week, Call of Duty added a new tool called ToxMod, an AI-powered voice chat moderation technology from Modulate, to Call of Duty: Modern Warfare 2 and Warzone. It's currently in beta in North America, with the aim of monitoring and enforcing against toxic voice chat in real time. The system will watch for hate speech, discriminatory language, harassment, and more. The AI will also be included in Modern Warfare 3 when it launches on November 10th.

"There's no place for disruptive behavior or harassment in games ever. Tackling disruptive voice chat particularly has long been an extraordinary challenge across gaming. With this collaboration, we are now bringing Modulate's state of the art machine learning technology that can scale in real-time for a global level of enforcement," said Michael Vance, Chief Technology Officer, Activision. "This is a critical step forward to creating and maintaining a fun, fair and welcoming experience for all players."

"We're enormously excited to team with Activision to push forward the cutting edge of trust and safety," said Mike Pappas, CEO at Modulate. "This is a big step forward in supporting a player community the size and scale of Call of Duty, and further reinforces Activision's ongoing commitment to lead in this effort."

Some players are concerned that the automated nature of the system could lead to accounts being wrongfully punished. It's unclear whether the AI can tell when someone is being playful, or whether it might mishear words or misread context and punish someone unfairly. We'll likely see its effects in the coming weeks as Modern Warfare 3's launch approaches.