The company behind popular shooter video game Call of Duty is launching in-game voice chat moderation to tackle “toxic and disruptive” behaviour online.
Activision, which also publishes Crash Bandicoot and Tony Hawk games, said that the new moderation measures would begin with the launch of Call of Duty®: Modern Warfare® III on 10 November.
The gaming giant is partnering with technology firm Modulate to roll out the AI-powered real-time voice chat moderation.
Modulate uses advanced machine learning tech to flag bad behaviour online by analysing the “nuances of each conversation to determine toxicity”, including hate speech, discriminatory language, and harassment.
It says that its technology enables moderators to quickly respond to each incident by supplying relevant and accurate context.
Activision said that the technology would bolster its existing moderation systems, which include text-based filtering across 14 languages for in-game text and an in-game player reporting system.
The business said that an initial beta rollout of the voice chat moderation technology would begin in North America on 30 August inside the existing games Call of Duty: Modern Warfare II and Call of Duty: Warzone™, followed by a full worldwide release - excluding Asia - timed to the launch of Call of Duty: Modern Warfare III on 10 November.
Support will begin in English with additional languages to follow at a later date.