PFA Report into online abuse in football


Today we have launched a new pilot study into online abuse, commissioned by the PFA Charity and supported by Kick It Out. The report highlights blind spots in the way social media platforms deal with online abuse, and the need for the football family to act as one in order to properly address the impact on players, staff and fans.

The report analyses targeted abuse sent during the six-week Project Restart period, when the Premier League returned to complete the 2019/20 season behind closed doors.

Our report was profiled in a Channel 4 News exclusive - view here.

Built on Signify’s Threat Matrix service, the report uses machine learning to sift the signal from the noise: isolating targeted abuse serious enough that it would result in sanction if delivered offline. The service helps to source, highlight and analyse online abuse.
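The production models behind Threat Matrix are not public, but as a simplified illustration of the general approach, a supervised text classifier can be trained to separate targeted abuse from ordinary footballing criticism. The sketch below is purely illustrative: the examples, labels and model choice are invented for demonstration and far simpler than a production system.

```python
# Illustrative sketch only: a toy abuse classifier. The training data,
# labels and model here are invented; they are not Threat Matrix's.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled examples: 1 = targeted abuse/threat, 0 = noise.
messages = [
    "what a strike, unbelievable goal",          # noise
    "you were poor today, drop him",             # criticism, not targeted abuse
    "I know where you train, watch your back",   # threat
    "go back to where you came from",            # racist abuse
]
labels = [0, 0, 1, 1]

# Character n-grams are more robust to deliberate misspellings than
# whole-word features, a common evasion tactic.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(),
)
model.fit(messages, labels)

# Score a new message: probability that it warrants human review.
print(model.predict_proba(["we are coming for you after the match"])[0, 1])
```

In any real system a score like this would route a message to human review rather than trigger automated action on its own.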

Historically there has been some reluctance within the football family to tackle online abuse: many clubs, leagues and federations rightly do not want to be drawn into ‘policing the internet’, and do not know where their responsibility or duty of care begins and ends.

In order to help bridge this gap, Threat Matrix is deliberately designed to source, analyse and help identify abuse that is targeted at players, and to establish where abusive accounts have a relationship with the club. Often abusers have real-world relationships with football, as season ticket holders or match attendees. Abuse and threats are not tolerated in stadiums, at training grounds or in person, and there should be no difference in the tolerance of targeted abuse online. Failure to address this issue can create security, performance and reputational problems that will ultimately harm leagues, clubs, players, staff and fans alike.

The report contains recommendations from the PFA Charity on how the football family can move forward in tackling these issues.

We will be publishing further observations, including the full study, in the coming days, but there are several initial key areas to highlight:


Supporting victims?

Recently there has been a drive to increase the reporting of online abuse and threats.

Making it the norm to speak out and report abuse or threats is vital. However, when federations, leagues or clubs ask victims to be responsible for monitoring and reporting content directed at them, this can lead to precisely the kind of harm it is meant to avoid.

Relying on victims to drive action actively encourages them to focus on their own abuse. This can seriously harm both the mental health and the performance of athletes.

Ultimately, abdicating responsibility to victims, rather than sourcing and tackling targeted online abuse in a comprehensive manner, is likely to create the very harms it seeks to prevent. Fulfilling the duty of care towards players, staff and fans requires proactive monitoring, documenting and investigation of targeted abuse and threats, alongside support measures that remove the reliance on players to drive this process themselves, so that no one is subjected to abuse alone and everyone can focus on football.

Manchester City’s Raheem Sterling was the top recipient of targeted online abuse on Twitter across Project Restart.

Whack-a-mole, not the goal

Whilst platforms removing abusive content and accounts, such as those highlighted in the report, offers good short-term relief, it does not represent a sustainable solution. We can clearly see abuse evolving to avoid detection. In the same way that emojis are deployed to avoid keyword analysis, determined abusers are now creating multiple backup accounts and avoiding posting direct content in favour of responsive or ‘live’ content, such as replies, or Stories on Instagram and Facebook. All of this makes accounts seem dormant or unused, making it much harder to assess or report content and prolonging the lifespan of abusive accounts.

Sustainably tackling online abuse has to go beyond simply de-platforming and removing content.

Emojis made up 29% of the abuse identified on Twitter across Project Restart.
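To illustrate why emoji-based abuse slips past simple keyword filters, the sketch below normalises emoji to their text names before applying keyword rules. The choice of the open-source `emoji` Python package, and the terms and messages, are ours for illustration; the report does not specify any tooling.

```python
# Illustrative only: a raw keyword filter sees no abusive words in an
# emoji-only message; normalising emoji to text names exposes them.
import emoji  # open-source package: pip install emoji

ABUSIVE_TERMS = {"monkey", "banana"}  # toy stand-ins for racist emoji use

def flag_message(text: str) -> bool:
    # demojize() turns "🐒" into ":monkey:", so keyword rules can match it.
    normalised = emoji.demojize(text).replace(":", " ").lower()
    return any(term in normalised.split() for term in ABUSIVE_TERMS)

print(flag_message("🐒🍌"))        # True: caught after normalisation
print(flag_message("great goal"))  # False
```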

Platform unity

One of the major hurdles to tackling online abuse is the lack of a level playing field and co-ordination across different platforms in how abusive accounts or content are dealt with. Every platform has its own rules, terms, reporting and sanction structures.

It is a common mantra that responsibility lies with social platforms and that it is they who must act first. Whilst this is a vital element, it cannot be overlooked that social platforms are exceptionally unlikely ever to look beyond their own users and content unless they are drawn into a standardised, regulator-enforced framework.

Signify frequently see abusers who use accounts across numerous social platforms to post targeted abuse or threats, utilising different formats and content depending on the platform. Where suspensions or bans are applied, users simply leapfrog from platform to platform.
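As a simplified illustration of what a holistic view of an abuser’s digital footprint might involve, the sketch below clusters invented accounts that share a single common signal (here, a link-in-bio URL). Real cross-platform entity resolution draws on many more signals and is considerably harder:

```python
# Illustrative only: group invented accounts across platforms into one
# footprint via a shared signal. Real entity resolution is far richer.
from collections import defaultdict
from dataclasses import dataclass

@dataclass(frozen=True)
class Account:
    platform: str
    handle: str
    bio_link: str  # hypothetical shared signal, e.g. a link-in-bio URL

def group_by_footprint(accounts: list[Account]) -> dict[str, list[Account]]:
    """Cluster accounts sharing a bio link into one digital footprint."""
    footprints: dict[str, list[Account]] = defaultdict(list)
    for acc in accounts:
        footprints[acc.bio_link].append(acc)
    return dict(footprints)

accounts = [
    Account("twitter", "fan_9921", "linktr.ee/fan9921"),
    Account("instagram", "fan9921.backup", "linktr.ee/fan9921"),
    Account("facebook", "unrelated.user", "example.com/other"),
]
for link, accs in group_by_footprint(accounts).items():
    print(link, "->", [f"{a.platform}:{a.handle}" for a in accs])
```

Seen this way, a suspension on one platform becomes a signal to watch the linked accounts elsewhere, rather than the end of the trail.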

In the current environment, platforms and members of the football family need to work with third parties who can apply a holistic view to an abusive account’s whole digital footprint. Threat Matrix offers this service, and we hope to continue our association with the PFA and Kick It Out in providing insights like these: illuminating the true picture, scale and tactics used in online abuse.

You can read the summary report on the PFA’s website here.

Other media coverage of the report:

  • Sky Sports News coverage of the report - view here.

  • Daily Mail coverage of the report - view here.

  • BBC News coverage of the report - view here.

  • The Sun coverage of the report - view here.

  • The Mirror coverage of the report - view here.

  • The Athletic coverage of the report - view here (paywalled).

For any media enquiries or data requests, please contact us here.