FIFA revealed that one in five players faced online abuse during the 2023 Women's World Cup, a 29% increase compared with the previous year's men's finals. The Social Media Protection Service (SMPS), deployed at the tournament to monitor and moderate hate speech on platforms including Facebook, Instagram, TikTok, and YouTube, hid nearly 117,000 comments.
Approximately 150 female players received discriminatory or threatening messages, with the US and Argentina teams identified as key targets. Almost half of the abusive messages detected were homophobic, sexual, or sexist.
FIFA and FIFPRO stressed the importance of addressing this issue, emphasising the toll the toxic online environment takes on players' mental health and well-being. The AI-powered SMPS tool aims both to protect players and to shield their followers from exposure to hate speech.
Why does it matter?
Instances of online abuse in football have made headlines worldwide, including episodes of racism, misogyny, and homophobia. In England, for example, West Midlands Police has appointed a full-time investigator to tackle the surge in misogyny. Separately, Arsenal legend Gilberto Silva, driven by his own encounters with racism and online threats, has launched the Striver social media platform, which uses AI to block abusive comments before they are published.