Modulate Adds Violent Radicalization Detection to ToxMod Voice Moderation Platform
Modulate, creator of voice technology for online gaming communities, has added a Violent Radicalization detection category to its ToxMod voice chat moderation software, making ToxMod capable of identifying individuals promoting white supremacist and white nationalist radicalization and extremism in real time.
In addition to the Violent Radicalization category, ToxMod also detects bullying, racial and cultural hate speech, gender and sexual hate speech, and more.
The newly introduced Violent Radicalization category aims to address critical concerns within the gaming community by identifying and flagging speech used for promotion, recruitment, targeted grooming, and active plotting of physical violence.
"We are committed to creating a safer and more inclusive gaming environment," said Mike Pappas, CEO and co-founder of Modulate, in a statement. "With the rise of extremism and radicalization in video games, our new Violent Radicalization detection category equips game studios with the necessary tools to combat the spread of extremist ideologies on their platforms."