Study: 7 in 10 Premier League players face Twitter abuse
August 3, 2022
By Colin Mann
As the new season warms up for kick-off, research from comms regulator Ofcom reveals the scale of personal attacks suffered by Premier League footballers every day on Twitter, and sets out what must be done collectively to tackle the issue.
Ofcom, which is preparing to regulate tech giants under new online safety laws, teamed up with The Alan Turing Institute to analyse more than 2.3 million tweets directed at Premier League footballers over the first five months of the 2021/22 season.
The study developed a new machine-learning tool that automatically assesses whether tweets are abusive. A team of experts also manually reviewed a random sample of 3,000 tweets.
The vast majority of fans use social media responsibly. Of the manually reviewed random sample of 3,000 tweets, 57 per cent were positive towards players, 27 per cent were neutral and 12.5 per cent were critical. However, the remaining 3.5 per cent were abusive. Similarly, of the 2.3 million tweets analysed with the machine-learning tool, 2.6 per cent contained abuse.
Hundreds of abusive tweets are sent to footballers every day. While the proportion of abusive tweets might be low, this still amounts to nearly 60,000 abusive tweets directed towards Premier League players in just the first half of the season – an average of 362 every day, equivalent to one every four minutes. Around one in twelve personal attacks (8.6 per cent) targeted a victim’s protected characteristic, such as their race or gender.
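These headline figures are internally consistent. As a rough back-of-the-envelope check (assuming a span of about 165 days for the first five months of the season — a figure not stated in the report):

```python
# Back-of-the-envelope check of the report's headline figures.
# The ~165-day window is an assumption (first five months of 2021/22);
# the report itself does not give the exact number of days.
total_tweets = 2_300_000
abusive_share = 0.026          # 2.6% flagged by the machine-learning tool
days = 165

abusive_tweets = total_tweets * abusive_share   # ≈ 59,800 — "nearly 60,000"
per_day = abusive_tweets / days                 # ≈ 362 per day
minutes_between = 24 * 60 / per_day             # ≈ one every 4 minutes

print(round(abusive_tweets), round(per_day), round(minutes_between))
# → 59800 362 4
```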
Seven in every ten Premier League players are targeted. Over the period, 68 per cent of players (418 out of 618) received at least one abusive tweet, and one in fourteen (7 per cent) received abuse every day.
A handful of players face a barrage of abuse. Ofcom recorded which footballers were targeted, and found that half of all abuse towards Premier League players was directed at just twelve individuals. These players each received an average of 15 abusive tweets every day.
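The per-player figure also checks out against the season-half total of nearly 60,000 abusive tweets, again assuming a window of roughly 165 days (an assumption, not a figure from the report):

```python
# If half of ~59,800 abusive tweets target just twelve players over ~165 days
# (the day count is an assumption), each averages about 15 per day.
abusive_tweets = 59_800            # ≈ 2.6% of the 2.3 million analysed tweets
half_of_abuse = abusive_tweets / 2
per_player_per_day = half_of_abuse / 12 / 165
print(round(per_player_per_day))   # → 15
```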
In a separate poll, Ofcom also asked the public about their experiences of seeing players targeted online. More than a quarter of teens and adults who go online (27 per cent) saw online abuse directed at a footballer last season. This rises to more than a third of fans who follow football (37 per cent) – and is higher still among fans of the women’s game (42 per cent).
Among those who came across abuse, more than half (51 per cent) said they found the content extremely offensive, but a significant proportion (30 per cent) didn’t take any action in response. Only around one in four (26 per cent) used flagging and reporting tools to report the abusive content to the platform, or marked the content as junk.
“These findings shed light on a dark side to the beautiful game,” commented Kevin Bakhurst, Ofcom’s Group Director for Broadcasting and Online Content. “Online abuse has no place in sport, nor in wider society, and tackling it requires a team effort. Social media firms needn’t wait for new laws to make their sites and apps safer for users. When we become the regulator for online safety, tech companies will have to be really open about the steps they’re taking to protect users. We will expect them to design their services with safety in mind.”
“Supporters can also play a positive role in protecting the game they love. Our research shows the vast majority of online fans behave responsibly, and as the new season kicks off we’re asking them to report unacceptable, abusive posts whenever they see them,” he added.
“These stark findings uncover the extent to which footballers are subjected to vile abuse across social media,” noted Dr Bertie Vidgen, lead author of the report and Head of Online Safety at The Alan Turing Institute. “Prominent players receive messages from thousands of accounts daily on some platforms, and it wouldn’t have been possible to find all the abuse without these innovative AI techniques.”
“While tackling online abuse is difficult, we can’t leave it unchallenged. More must be done to stop the worst forms of content to ensure that players can do their job without being subjected to abuse,” he concluded.
The UK is set to introduce new laws aimed at making online users safer, while preserving freedom of expression. The Online Safety Bill will introduce rules for sites and apps such as social media, search engines and messaging platforms – as well as other services that people use to share content online.
The Bill does not give Ofcom a role in handling complaints about individual pieces of content. The Government recognises – and Ofcom agrees – that the sheer volume of online content would make that impractical. Rather than focusing on the symptoms of online harm, Ofcom will tackle the causes – by ensuring companies design their services with safety in mind from the start. It will examine whether companies are doing enough to protect their users from illegal content, as well as content that is harmful to children.