UK academics put spotlight on big tech platform regulation
June 3, 2021
By Colin Mann
With much of our time spent on online platforms, which play an increasingly influential role in people’s daily lives, how they are regulated has become a vitally important issue.
Now high-profile research conducted as part of the UK Arts & Humanities Research Council Creative Industries Policy and Evidence Centre (PEC) has analysed how eight government and parliamentary reports and inquiries over an 18-month period have highlighted and set the agenda for platform regulation.
The reports, published between September 2018 and February 2020, dealt with issues such as online harms, cybercrime, and the regulation of social media platforms.
The University of Glasgow researchers’ analysis has revealed that some 80 distinct online harms have been discussed. The regulatory landscape is cluttered, with no fewer than nine different UK agencies holding relevant responsibilities.
“Fake news, cyber-attacks, predatory acquisitions,” listed Professor Martin Kretschmer, Professor of Intellectual Property Law and Director of the UK Copyright and Creative Economy Centre (CREATe) at the University of Glasgow, one of the report authors. “Dangerous things are happening on online platforms. But how do we, as a society, make decisions about undesirable activities and content?”
“UK policy makers hope to delegate tough choices to the platforms themselves, focussing on codes of practice and codes of conduct supervised by regulatory agencies, such as Office of Communications (Ofcom), for a new ‘online duty of care’, and competition regulator Competition and Markets Authority (CMA), through a ‘digital markets unit’. Our new empirical study shows how this approach emerged, and how it compares in a global setting.”
The researchers also signal concern that the evolving regulatory structure appears to be blind to the effects of platforms on cultural production and diversity. Understanding the role of ranking and recommendation algorithms as cultural gatekeepers still needs to be integrated into the platform policy agenda.
“Platform regulation is now at the heart of how democracies conduct themselves,” stated Professor Philip Schlesinger, Professor in Cultural Theory (Centre for Cultural Policy Research and CREATe). “It’s also increasingly at the core of how we manage rules for our digital social connectedness.
“So, understanding how regulation works and the forces that are shaping it has become crucial for everyone. It’s important that the present rush to regulate doesn’t ignore the huge contribution of creative industries to the cultural economy. And since the UK’s multinational diversity has been thrown increasingly into relief by Brexit and the pandemic, how regulatory policy plays out will be of special interest to the devolved administrations.”
The researchers found that, as platform regulation has become an important government issue, two clear priority areas have emerged: online harms and competition. These broad categories include everything from mental health to intellectual property rights.
The picture the researchers paint is of a complex and potentially confusing policy environment. There is limited consensus on what regulation should look like, or even how to define key terms such as ‘platform’.
The research also found that US multinationals – Google and Facebook in particular – have captured regulators’ attention, and they dominate references in the official literature.