By Professor Carla Ferstman, Director of Impact, Essex Law School
As academics, we conduct research for all sorts of reasons. We seek to advance knowledge and innovation in the areas in which we specialise, and we try to make connections with research being done in other disciplines in order to deepen our understanding of, and help address, cross-cutting, complex challenges.
Academic research is increasingly being applied outside of academia to foster external impacts in our communities and societies. Research-led teaching can also create opportunities for cutting-edge student learning.
The UK Research Excellence Framework values world-leading research that is rigorous, significant and original. It also encourages and rewards research that generates impact, which it understands as “an effect on, change, or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia” (REF2021).
Impactful research is particularly relevant and important for the discipline of law, where colleagues’ work can change how justice is perceived and how access to justice can be better achieved. Academic research in law has influenced the direction of law reform, and academic findings have been applied authoritatively in court judgments. Legal research has also led to the development of new policies and regulatory frameworks in the UK and internationally.
Despite the importance many legal academics place on generating impact, the route to impact is not obvious. Achieving impactful academic research defies a one-size-fits-all formula, though certain key pointers are invaluable:
First, impactful research is generated by academics who produce excellent, ground-breaking research.
Second, academics should be mindful of who (e.g., community stakeholders, policy-makers, decision-makers) would benefit from knowing about the research and should develop a strategy to ensure they effectively disseminate their findings.
Third, academics seeking to generate impactful research should actively engage with those who can benefit from it, adapting their approach to stakeholders’ needs and circumstances.
Learning from example
Academics can glean wisdom from exemplary models. And there is no better example than Professor Lorna Woods, whose research contributed significantly to the Online Safety Bill (now Online Safety Act 2023) and led to her being awarded an OBE for services to internet safety policy.
I sat down with Professor Woods to get a clearer understanding of her trajectory – how she got from A to B to C (or indeed, from B to A to F to C), to better appreciate the time her ideas took to percolate and the challenges she faced along the way.
I wanted to understand whether her research was picked up by government through happenstance, through careful, methodical planning, or some combination of the two. I also wanted to know whether there was any magic formula she could share for generating impactful research.
Lorna qualified as a solicitor and worked in the early 1990s for a City of London firm, where she was exposed to a variety of areas of law, including international trade, competition, and commercial law. She began to work with two of the partners on matters involving regulation, intellectual property, and media. She happened to be at the firm when many developments in the law occurred, such as the Broadcasting Act 1990, updates to data protection rules, and other changes resulting from growing public access to the internet.
This quickly developed into a specialism related to technology. “The work was really interesting. It wasn’t just the typical due diligence or deals management work that one often received in a corporate solicitor’s firm, there was a space to think and a space to have your say”.
During this time, Lorna also did some consulting work for the European Commission in Eastern European countries following the political changes of the early 1990s. The work focused on media freedom and public service broadcasting, and involved new thinking about the rights of the public audience that had not yet been theorised.
Lorna left the firm after about five years when, as often happens, she began to take on a more supervisory role, with some of the most interesting pieces of work being delegated to more junior colleagues. She pursued an LL.M degree at the University of Edinburgh (legal theory and human rights, with a dissertation on federalism and the European Union) and began to apply for academic roles. She secured a position in 1994 at Sheffield and began teaching EU and public law.
The Eureka moment or more of a slow-burner?
Gradually Lorna’s research began to drift back to media law and data protection, incorporating areas she had been studying around human rights, public speech, surveillance, and the rights of journalists, but with her own take. She recalled that “A lot of people were talking about journalists’ rights, but I was focussed on the rights of the companies who were transmitting; an ‘essential facilities’ argument but approached from a rights perspective. I also started looking at these issues from the perspectives of EU law and the free movement of cultural standards [the rights of the audience] rather than simply as an issue of freedom of expression.”
Central to this was the idea that there were different actors in an information environment – the speakers and the audience, and something in the middle which had more to do with the platform, that is not really seen or thought about. The question Lorna had was whether these entailed separate rights or were all part of a unified right to information.
In 2000, Lorna began collaborating with Professor Jackie Harrison at Sheffield on new media and media regulation. It was here that she further developed her thinking on the rights of the audience: not only to have access to information, but to information that was reasonably reliable and, where possible, drawn from a diversity and plurality of sources.
This also connected to her thinking about how to find information on the internet, who curates what we can find and what responsibilities may be attached to the curation. The flip side to this was considering the nature of states’ positive obligations to provide a safe online environment. Lorna also began to explore issues around user-generated content.
In response to the growing awareness of how female politicians and activists were being targeted on Twitter (now X), and the notoriety of the abuse faced by Caroline Criado Perez and Walthamstow MP Stella Creasy, Lorna started looking at what controls were in place, and began to consider the gaps in regulation and how they could best be addressed.
At the time, she observed that politicians had embraced Twitter, which amplified their influence while also making them more accessible and exposed. The platform facilitated direct communications between everyone on the network, including unsavoury individuals who used it as a vehicle for abuse. This was fuelled by anonymous accounts, hashtags that allowed users to jump on the bandwagon, and what seemed at that stage to be very little moderation. There were many instances of public-facing women receiving rape and death threats.
As a consequence, several users were charged in the UK under section 127 of the Communications Act 2003 – a low-grade offence which criminalises the sending, via a “public electronic communications network”, of a message which is “grossly offensive or of an indecent, obscene or menacing character”. But it was never clear to Lorna that the criminal law was the best solution to the problem.
The campaign for law reform begins to take shape
Around 2015, Lorna became aware that the then Labour MP Anna Turley was developing a private member’s bill: the Malicious Communications (Social Media) Bill. Someone whom Lorna had met in an unrelated capacity – “this is just really a feature of when you work in a certain area, you meet people linked to that area. And progressively, your army of contacts comes back to help” – William Perrin, managed to get her in the door to meet the MP.
Together, Lorna and William helped to draft the Bill. The goal was to give users better tools (user empowerment features and functionalities) so that they could filter and triage incoming content, at least as a starting point for improving the online environment. Their advice (which was taken on board) was not to remove platform immunity for third-party content; they recognised that the platform providers were offering an important service worth protecting.
Part of the rationale for this was the parallel they saw between internet platform providers and telecoms providers: “If you were to hold a telecoms provider responsible for anything communicated on the service, they would become very cautious and ultimately it would shut down the service. So, there was a need for caution.” Ultimately the Bill did not progress, as private members’ bills rarely do, but such bills serve to bring matters to the attention of the Government and can form part of a campaign for change.
Subsequently, the Government published a Green Paper on internet safety in 2017, which raised significant concerns. This was the era of Cambridge Analytica and misinformation, but there were also concerns about child pornography and online bullying, and about algorithms prioritising harmful content to vulnerable users, as highlighted by the tragic Molly Russell case. The Green Paper seemed to revisit the recommendation to remove (or significantly restrict) platform immunity for third-party content, which Lorna and William did not think was the best approach, for the reasons already stated.
There was a need to conceive of the problem at the systems level, rather than merely focusing on isolated items of content. The scale of the problem generally lay not in individual offensive posts but in the fact that content could quickly go viral without appropriate controls, aided by functions like the “like” button and the availability of anonymous, disposable accounts.
Similarly, the recommender algorithms which optimised certain posts for engagement tended to privilege the most irrational, emotional posts, which were more likely to promote hatred or cause offence. Making small changes to these kinds of features and investing more in customer response could significantly improve online safety. Thus, according to Lorna, there was a certain recklessness in the product design that needed to be addressed – this was the genesis of the idea of a statutory duty of care.
The statutory duty of care
Lorna and William produced a series of blogs and papers outlining this position, and the need for such reforms was also underscored by Lorna during an oral evidence session at the House of Lords inquiry into the regulation of the internet. The Carnegie UK Trust stepped up to champion Lorna and William’s work, facilitating its progress.
The UK Department for Culture, Media and Sport (DCMS) invited Lorna to give a briefing, and it became clear that there was some confusion. The DCMS had been under the impression that the conditionality of the platform immunity amounted to a statutory duty of care. Consequently, part of what Lorna and William tried to explain was how their proposal was compatible with the principle of platform or intermediary immunity. The proposal was not seeking to impose liability on the platform for user content; instead, it focused on requiring platforms to ensure that product design met their duty of care to users. These discussions with DCMS continued, and progressively intensified.
The White Paper which was ultimately released in April 2019 clearly articulated that “The government will establish a new statutory duty of care to make companies take more responsibility for the safety of their users and tackle harm caused by content or activity on their services,” and outlined what that duty of care would look like and how it would be regulated.
Changes within the Tory leadership ultimately delayed progress. There were also concerns raised by some of those in the free speech lobby who saw parts of what was being proposed as censorship. Lorna’s background in freedom of speech helped her respond to those concerns: “I was concerned that freedom of speech was being used as a slogan. When you look at any right and you look at it in isolation, you are then implicitly privileging it. And here, it was important not just to consider the rights of the ‘speaker’ but the rights of all the other users as well, some of whom are extremely vulnerable.”
These points align with what the UN Special Rapporteur on Freedom of Opinion and Expression explained in her 2023 report on gendered disinformation, in which she notes, citing Lorna’s submission, that “Systemic regulation, which emphasizes ‘architecture over takedown’, allows for more proportionate responses and is likely to be better aligned with freedom of expression standards.”
Certainly, companies were lobbying in other directions, and the Act reflects some corporate compromises, such as the requirement that the duty of care be applied proportionately, to account for the different levels of resources of regulated companies. But there were powerful counter-arguments, and the NSPCC and other organisations were effective allies, particularly on the need for clear duties of care in relation to child users. The Daily Telegraph also ran an important campaign on the legislation. At one point the Government sought to restrict the Act to concerns about children, so maintaining a focus on harm to adults also became part of the campaign (unfortunately only limited protections were retained). Other parts of the Act differ from what Lorna and William had proposed, such as dividing up the regulatory framework by reference to certain types of conduct. Inevitably there were compromises.
The Act as adopted envisages that the communications regulator Ofcom will produce guidance and codes explaining what internet platforms must do in order to operate in the United Kingdom. There are ongoing consultations regarding these texts. Once the guidance and codes are in place, companies will be given a period (three months) to bring their practices into compliance with the requirements. Thereafter, the duties of care will become binding.
Some companies appear to be arguing that a duty of care is too vague a standard; this is hard to accept, however, given that it is a well-recognised legal standard. The goal for Lorna and others is therefore to ensure that the duty of care is made operational in a way that provides clear and adequate protections; it should be more than a ‘tick the box’ exercise.
I asked Lorna how this legislation would tackle the activities of companies operating outside of the UK but with impacts in the UK. She explained that parts of the Act have extraterritorial effect, applying to the extent that a company’s activities are directed at, or have impacts in, the UK. Some companies have already introduced policies for different geographical regions to address the requirements of national legislation, so this is a possibility for multinational internet platforms accessible to UK users.
I also discussed with Lorna whether she believed individuals like Molly Russell would be more effectively safeguarded now that the Online Safety Act is in force. She explained that Molly would not be better off today, because the guidance and codes are not yet in place. “Maybe in a year’s time, she would probably be better protected, as a child. I think an 18-year-old Molly would be sadly let down by the regime, which should be more robust.”
Given the clear synergies with her work on the Act, Lorna is also progressing work on online gender-based violence, as well as on gendered misinformation, incel culture, and extremism. As she looks deeper into these critical areas, her ongoing endeavours continue to reveal new challenges and fresh avenues for advocacy and change.