January 20, 2023

Harassment affects athletes in sports

By Yann Guérin

Harassment in sports is a major issue, both on the field and in digital spaces. Athletes routinely face online threats, hate messages, discrimination, and slurs across social media platforms.

This type of toxic behaviour was recently analyzed in a report on the World Athletics Championships Oregon22. World Athletics published the study's findings to illustrate the ongoing problem of online harassment in sports.

Sport is far from immune to toxicity. Athletes must not only deal with abuse in person but also protect themselves against online hate.

Key figures on cyberbullying in sports

The World Athletics report on the World Athletics Championships Oregon22 found that:

  • 41% of male athletes were victims of online abuse
  • 59% of female athletes were victims of online abuse 
  • 40% of posts were sexist abuse against female athletes
  • 60% of abuse occurred on Twitter
  • 60% of online abuse was sexual or racial
  • 59% of abusive posts were moderated by social platforms

What are the types of harassment against athletes?

Harassment in sports takes many different forms, and athletes increasingly face online violence. The World Athletics report found complaints about:

  • Slurs
  • Racism
  • Sexism
  • Ableist posts
  • Transphobia
  • Sectarianism or partisanship
  • Stalking

Consequences of harassment on athletes

There have long been issues with sports and racism. Athletes often also face sexism and homophobia in sports. Many athletes have reported that their mental health has suffered because of repeated online harassment. Online abuse can severely impact athletes’ self-confidence and impair their ability to perform at their best. 

Effective moderation of sport-related online communities can be difficult, however. Human moderators face two main issues: toxic content is often detected too late, and it can be hard to identify.

Toxic content detected too late

The sheer volume of posts and comments that a sporting event generates online is enormous. Human moderators simply cannot keep up with the avalanche of activity, and in most cases toxic content is not detected in time.

To solve this problem, organizations and clubs must take action and adopt an AI moderation solution with knowledge of sports.

Toxicity in chat can be difficult to identify

Identifying offensive comments is hard for human moderators. A seemingly innocent chat message or comment can carry racist overtones or a sexual connotation, and chats that rely on emojis or emotes are even harder to moderate.
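To see why simple approaches fall short, consider a minimal sketch of a keyword-based filter (illustrative only; this is not how Bodyguard's moderation works). A blocklist catches direct insults but misses the same insult disguised with symbols, digits, or emojis:

```python
# Illustrative sketch: a naive keyword blocklist for chat moderation.
# The words below are hypothetical examples, not a real moderation list.
BLOCKLIST = {"loser", "trash"}

def naive_filter(comment: str) -> bool:
    """Return True if the comment contains a blocklisted word exactly."""
    words = comment.lower().split()
    return any(word in BLOCKLIST for word in words)

# A direct insult is caught...
print(naive_filter("you are trash"))     # True
# ...but obfuscated or emoji-laden variants slip through.
print(naive_filter("you are tr@sh"))     # False
print(naive_filter("l0ser"))             # False
```

This is why context-aware approaches, such as the machine learning and linguistic analysis described below, are needed: toxicity often hides in spelling tricks, emojis, and context that exact-match rules cannot see.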

Harassment and online betting: What are the links?

Harassment in sports is often linked to online betting. If a person places a large bet on a team or individual sportsperson and does not get the result they wanted, then they may direct their anger and frustration at the team or athlete in question. This can result in toxic online behaviour. 

How to protect athletes from online harassment

Bodyguard.ai provides effective moderation solutions that protect athletes and their fans from online harassment. This smart, autonomous solution combines machine learning and linguistic expertise to accurately identify and remove offensive and toxic comments before they are seen.

FAQ

What are the consequences of cyberbullying on athletes?

Cyberbullying can negatively impact an athlete’s mental and physical health, and their performance. 

Which sports are most affected by cyberbullying?

Athletes in all sports are affected by cyberbullying, sexism, and racism. Female athletes report more instances of cyberbullying than male athletes.

How can athletes be protected from online hate?

Using an AI moderation solution like Bodyguard ensures that toxic comments are removed before they appear online. 

What do stalkers risk?

Stalking can result in serious penalties, including imprisonment. 

Want to find out more about how Bodyguard can protect your club, team, league or players from harmful toxicity online? Let's chat.
