Activision punishes over 2 million players for toxic behaviour in COD

Activision has punished over 2 million players after its AI-driven voice moderation tool detected toxic behaviour in the Call of Duty online gaming community

Sarah Andrew

Call of Duty (Source: X)

Last year, Activision announced that it was working on voice chat moderation for Call of Duty (COD). The system was implemented in its latest release, Modern Warfare III, and is reportedly working well.

In the most recent COD update, it was revealed that more than 2 million accounts have faced strict enforcement action after the players behind them were caught violating community voice-chat guidelines.

As a result, COD reported a 50% drop in toxic player behaviour over the last four months. Punishments include a server mute covering both voice and text chat; in severe cases, players can face a temporary ban.

Activision plans to encourage players to report toxic gamers

While cheating is the fastest way to get banned, excessive toxicity also carries major repercussions. Offenders may escape a ban the first few times, or receive only a light time-out, but repeated violations can cost them their account.

Friendly banter with people you know, whether opponents or teammates, is nothing serious; 'trash talk' is even considered a norm in COD. However, it often escalates into bigger problems such as bullying and hate speech, which warrant punishment.

While the system is detecting toxic players, manual reports remain low: only one in five players reports someone misbehaving. “To encourage more reporting, we've rolled out messages that thank players for reporting and in the future, we're looking to provide additional feedback to players when we act on their reports,” the company said in a statement.

Even without a manual report, the AI-driven voice moderation will detect misbehaving players and punish them accordingly.
