Call of Duty Turns to AI to Stop Players Using Profanities In-Game
Activision hopes AI can clean up Call of Duty voice chat and curb toxic behavior in the popular shooter franchise.
At a Glance
- Call of Duty Modern Warfare III to feature AI tool ToxMod to detect and filter toxic language and harassment in voice chat.
Call of Duty developer Activision is hoping to cut out foul language and toxic behavior from players using AI.
The team behind the game announced that the latest installment, Call of Duty: Modern Warfare III, releasing this November, will make use of ToxMod, the AI-powered voice chat moderation tool from Modulate.
Pre-game lobbies in Call of Duty are where players wait before a match loads and edit their weapon classes. Historically, they have also been where players shout slurs at one another. Even after Call of Duty dropped the traditional pre-game lobby, matches themselves often feature players swearing at each other – especially in competitive modes like the popular Warzone.
ToxMod will identify toxic speech in real time – including hate speech, discriminatory language and harassment.
An initial beta rollout of the voice chat moderation technology has begun in North America for existing titles – Call of Duty: Modern Warfare II and Call of Duty: Warzone. A full worldwide release, excluding Asia, will follow with the launch of Modern Warfare III on Nov. 10.
ToxMod support will launch in English, with additional languages to follow at a later date.
Call of Duty already employs an anti-toxicity team, whose efforts include text-based filtering across 14 languages for in-game chat and usernames.
The game developers said that since the launch of Modern Warfare II, Call of Duty's existing anti-toxicity moderation has flagged more than one million accounts for violating its Code of Conduct.
After the Call of Duty team acted against those players, 20% did not reoffend; those who did faced harsher punishments.