On 18th June, the United Nations International Day for Countering Hate Speech, FIFA released a report detailing online abuse during the 2022 FIFA World Cup in Qatar.
According to the report, FIFA’s Social Media Protection Service (SMPS) scanned over 20 million posts and comments. More than 434,000 posts were automatically flagged by AI and passed to human content moderators for review, over 19,600 posts were confirmed as abusive, and approximately 290,000 comments were automatically hidden.
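The two-stage funnel the report describes — an automated filter flags candidate posts, and human moderators make the final call — can be sketched roughly as below. This is an illustrative toy under assumed names, not FIFA's actual SMPS implementation, which has not been published; a real system would use trained classifiers rather than a keyword list.

```python
from dataclasses import dataclass, field

# Placeholder blocklist; "badword" stands in for genuinely abusive terms.
# A production system would use a trained model, not a keyword set.
ABUSIVE_TERMS = {"badword"}

def ai_flag(post: str) -> bool:
    """Stage 1: a cheap automated filter that deliberately over-flags,
    leaving the final judgement to human moderators."""
    return bool(set(post.lower().split()) & ABUSIVE_TERMS)

@dataclass
class ReviewQueue:
    pending: list = field(default_factory=list)
    confirmed_abusive: list = field(default_factory=list)

    def ingest(self, posts):
        # Only AI-flagged posts reach the human review queue.
        self.pending.extend(p for p in posts if ai_flag(p))

    def review(self, is_abusive):
        # Stage 2: a moderator (modelled here as any callable) confirms
        # or dismisses each flagged post; confirmed posts are retained.
        for post in self.pending:
            if is_abusive(post):
                self.confirmed_abusive.append(post)
        self.pending.clear()
```

In the tournament's case, this funnel would correspond to the reported numbers: over 20 million posts scanned, 434,000 flagged for review, and 19,600 confirmed as abusive.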
Some 74% of the abuse was traced to users in Europe and South America. The England v France quarter-final triggered the largest spike in abusive activity, followed by the final and Morocco v Portugal. France was targeted by more abuse than any other team, particularly racist abuse, followed by Brazil and England; abuse directed at Brazil was more sectarian in nature.
The system was previously deployed at AFCON 2021 and the EURO 2020 finals, but FIFA highlighted that the current version is considerably more sophisticated.
In a statement, FIFA President Gianni Infantino said, “Discrimination is a criminal act. With the help of this tool, we are identifying the perpetrators and we are reporting them to the authorities so that they are punished for their actions. We also expect the social media platforms to accept their responsibilities and to support us in the fight against all forms of discrimination. Our position is clear: we say no to discrimination.”
In reaction to the report, FIFPRO President David Aganzo noted, “The figures and findings in this report do not come as a surprise, but they are still massively concerning… Football has a responsibility to protect the players and other affected groups around their workspace. Therefore, FIFPRO and FIFA will continue their collaboration… But we cannot do this alone – we need all stakeholders to play their part if we want to create a safer and better environment for football.”
FIFA is taking a strong stance on racism and abuse
The analysis found that sexist or sexual abuse made up 13.47% of the abusive content, homophobic abuse 12.16%, and racist abuse 10.70%.
The report states, “Social media companies’ responses to abuse and threats published on their platforms evolved throughout the tournament but still indicated many blind spots, particularly outside of English language content.”
“Targeted individual racism was high volume with more than 300 players being targeted and a few individual high-profile players receiving a large proportion of targeted abuse across the competition.”
Twitter received the highest number of reported abusive messages at 13,105, followed by Instagram (5,370), Facebook (979), YouTube (113), and TikTok (69).
The report underscores FIFA’s commitment to tackling online abuse in collaboration with FIFPRO. This led to the creation of the Social Media Protection Service (SMPS) in 2020 – a set of tools designed to safeguard players and the public at FIFA events.
In a move to protect players and teams, FIFA offered them access to moderation software that could instantly hide offensive comments from their pages. Players from all 32 teams were granted access to the tool amidst growing rows about racism and abuse on and off the pitch.
As a result, 286,895 comments were automatically hidden from public view.
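The auto-hiding step could be sketched along the following lines. This is a minimal illustration, assuming a simple threshold-based design; the scoring function is a stand-in blocklist ratio, since the actual SMPS moderation models are not public.

```python
# Toy scorer: fraction of a comment's words found on a blocklist.
# "badword" is a placeholder for genuinely offensive terms.
def toxicity_score(comment: str) -> float:
    blocklist = {"badword"}
    words = comment.lower().split()
    if not words:
        return 0.0
    return sum(w in blocklist for w in words) / len(words)

def moderate(comments, threshold=0.3):
    """Hide comments scoring at or above the threshold; keep the rest
    visible. Hidden comments are retained rather than deleted, so they
    can be reported to authorities later."""
    visible, hidden = [], []
    for c in comments:
        (hidden if toxicity_score(c) >= threshold else visible).append(c)
    return visible, hidden
```

Retaining rather than deleting hidden comments matters here: the report notes that evidence about abusive accounts was later shared with associations and legal authorities.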
Moreover, the identities of over 300 individuals who posted abusive, discriminatory, or threatening comments have been shared with relevant club associations and legal authorities.
With the FIFA Women’s World Cup Australia & New Zealand 2023 imminent, the SMPS will once again be in action, and several teams have already agreed to use it.