AI joins the warfront: Call of Duty employs artificial intelligence to curb toxic chat
If you've played any Call of Duty title online recently, you've likely run into some frustrations with the multiplayer experience. It's not just the swathes of cheaters taking the joy out of the game. The real problem in competitive online gaming is toxic players: those who insult others because they can't accept defeat, and those who poison the overall atmosphere with offensive comments.
Publisher Activision has made considerable efforts to improve the player experience in recent months, including a large-scale campaign to ban cheaters in Call of Duty. Now the company is tackling the equally pressing issue of toxic behavior. In collaboration with Modulate, developer of the ToxMod technology, Activision will use AI to monitor and moderate voice chat in real time. The system can detect and flag toxic speech, including hate speech, harassment, abuse, and discrimination. It complements existing text filtering in 14 languages (covering both chat and usernames) and makes it easier to report offending players while a match is in progress.
The system has already been rolled out to Call of Duty: Modern Warfare II and Call of Duty: Warzone as of August 30. Although it remains in beta and is limited to North America, Activision plans to extend the technology worldwide alongside the launch of Call of Duty: Modern Warfare III on November 10. At first, voice moderation will support only English, with additional languages added over time. In short, if you plan to play the online modes of Call of Duty: Modern Warfare III, be aware that you're being listened to. It will be interesting to see how else game developers put artificial intelligence to use in the future.