The Criminalisation of Cybercrime: Connected Dots and Blind Spots in the Development of Legal Instruments

Photo by Markus Spiske on Unsplash

Building on her 15 years of research on cybercrime, Dr. Audrey Guinchard, Senior Lecturer at the Essex Law School, gave a presentation on the criminalisation of cybercrime at the 2022 Society of Legal Scholars (SLS) Conference, held on 6-9 September at King’s College London.

In her paper, Dr. Guinchard explained that regulating crime is the traditional domain of nation states; cybercrime is no exception. The first legal instruments to tackle computer-focused crimes (e.g., unauthorised access or hacking) date back to the seventies and eighties. Yet international institutions such as the OECD and the Council of Europe were quick to recognise the transborder nature of cybercrime, keen to push for the creation of a level playing field and better cooperation among nation states. In fact, one could even argue that international efforts at criminalisation were concomitant with, if not anticipatory of, national legal instruments on cybercrime.

Dr. Guinchard pointed out that what is less known behind this push for harmonisation is the role of the computing community, a scientific community which has international dialogue at its heart and which has frequently engaged with legal professionals more than legal professionals have engaged with computer scientists. These key features of the criminalisation of cybercrime continue to shape modern legislation as the movement for reforming the UK Computer Misuse Act demonstrates.

Yet, Dr. Guinchard emphasised that blind spots remain: comparative law analyses can be superficial; the international outlook remains dominated by Western and European countries, ignoring the many voices of Asia, Africa and Latin America; the link between improving cybersecurity and decreasing cybercrime remains underappreciated; and criminalisation can carry hidden agendas which turn the fight against cybercrime into a battleground of values, as the recent push for the UN treaty on cybercrime illustrates.

So, if the transborder nature of cybercrime has long been a rallying cry for its worldwide criminalisation, the resulting legal frameworks continue to be subject to various influences and forces, acknowledged and unacknowledged, leading to a paucity of information as to how effective the law is in tackling cybercrime. Dr. Guinchard argued that reflecting on those pathways to criminalisation may allow us to move away from the hype and understatement which have marred the field since its inception.

A copy of Dr. Guinchard’s slides can be downloaded below. She can be contacted at this email address:

‘Cyber Due Diligence’: A Patchwork of Protective Obligations in International Law

Photo by Kevin Ku

With a long history in international law, the concept of due diligence has recently gained traction in the cyber context, as a promising avenue to hold states accountable for harmful cyber operations originating from, or transiting through, their territory, in the absence of attribution.

Nonetheless, confusion surrounds the nature, content, and scope of due diligence. It remains unclear whether it is a general principle of international law, a self-standing obligation, or a standard of conduct, and whether there is a specific rule requiring diligent behaviour in cyberspace.

This has created an ‘all-or-nothing’ discourse: either states have agreed to a rule or principle of ‘cyber due diligence’, or no obligation to behave diligently exists in cyberspace.

In their new article in the European Journal of International Law, Dr. Antonio Coco, Lecturer in Law at the University of Essex, and Dr. Talita de Souza Dias, Postdoctoral Research Fellow at the Oxford Institute for Ethics, Law and Armed Conflict (ELAC), propose to shift the debate from label to substance, asking whether states have duties to protect other states and individuals from cyber harms.

By revisiting traditional cases, as well as surveying recent state practice, the authors contend that – whether or not there is consensus on ‘cyber due diligence’ – a patchwork of different protective obligations already applies, by default, in cyberspace.

At their core is a flexible standard of diligent behaviour requiring states to take reasonable steps to prevent, halt and/or redress a range of online harms.

A copy of the authors’ article can be accessed here.

This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted reuse, distribution, and reproduction in any medium provided the original work is properly cited.

Article full citation: Antonio Coco, Talita de Souza Dias, ‘Cyber Due Diligence’: A Patchwork of Protective Obligations in International Law, European Journal of International Law, Volume 32, Issue 3, August 2021, Pages 771–806.

Effective Oversight of Large-Scale Surveillance Activities: A Human Rights Perspective

Photo by Lianhao Qu

Daragh Murray, Pete Fussey, Lorna McGregor, and Maurice Sunkin, University of Essex, explore the international human rights law implications of state surveillance in a new article published in the Journal of National Security Law and Policy (JNSLP).

Today, state surveillance involves the large-scale collection and analysis of digital data—activities which allow for widespread monitoring of citizens. And while commentary on the legality of these bulk surveillance regimes has focused on whether this routine surveillance is permissible, the European Court of Human Rights has recently held that, subject to appropriate safeguards, surveillance of this type is legitimate, and sometimes necessary, for national security purposes in a democratic society.

In their analysis, the authors outline the types of oversight mechanisms needed to make large-scale surveillance human rights compliant. To do so, they break down state surveillance into its constituent stages—authorization, oversight, and ex post facto review—and focus their attention on the first two stages of the process.

First, they argue that effective oversight of authorizations requires increasing data access and ensuring independent judicial review.

Second, they argue that effective oversight of ongoing surveillance requires improving technical expertise and providing for long term supervision.

The authors conclude that a “court-plus” model of judicial officers and non-judicial staff would deliver enhanced judicial qualities to authorizations while also providing continuous engagement through ongoing review and supervision.

This post was first published on the JNSLP website and is reproduced here with permission and thanks. The original piece and a link to the authors’ article can be found here.

Enhancing Cross-Border Access to Electronic Information in Criminal Proceedings: Towards a new E-Evidence legal framework in the EU

Photo by Christian Lue on Unsplash

Dr Oriola Sallavaci, Senior Lecturer in Law, University of Essex

In recent years cross-border exchange of electronic information has become increasingly important in enabling criminal investigations and prosecutions. As I have discussed in depth in my study “Rethinking Criminal Justice in Cyberspace: The EU E-evidence framework as a new model of cross-border cooperation in criminal matters”, the use of technology has transformed the nature of crime and evidence, leading to ‘crime without borders’ and ‘globalisation of evidence’. An increasing number of criminal investigations rely on e-evidence, and this goes beyond cyber-dependent and cyber-enabled crimes. From an evidential point of view, today almost every crime could have an e-evidence element, as offenders often use technology, such as personal computers, notepads and camera phones, where they leave traces of their criminal activity, communications or other pieces of information that can be used to determine their whereabouts, plans or connection to a particular criminal activity.

Crime today often has a cyber component and, with it, an increasingly prominent cross-border dimension, because electronic information to be used for investigative or evidentiary purposes is frequently stored outside the investigating State. The borderless nature of cyberspace, the sophistication of the technologies and offenders’ modi operandi pose specific and novel challenges for crime investigation and prosecution that, in practice, may lead to impunity. In 2018 the European Commission found that in the EU “more than half of all investigations involve a cross-border request to access [electronic] evidence.” Yet, alarmingly, “almost two thirds of crimes involving cross-border access to e-evidence cannot be effectively investigated or prosecuted”. Challenges to accessibility relate, inter alia, to the volatility of e-information, the availability and location of data, and the legislative barriers and shortcomings that must be overcome to enhance cross-border access to electronic evidence and the effectiveness of public-private cooperation through facilitated information exchange.

Cross-border access to e-information is currently conducted through traditional judicial cooperation channels, and requests are often addressed to specific states which host many service providers (SPs). In the EU these include Mutual Legal Assistance requests and European Investigation Orders under Directive 2014/41/EU, which provides for the acquisition, access and production of evidence in one Member State (MS) for criminal investigations and proceedings in another Member State. The nature of the existing judicial cooperation instruments, the actors and procedures involved, and the ever-increasing number of requests have resulted in delays and inefficiencies, posing specific problems for investigations and prosecutions that are exacerbated by the volatility of electronic information.

In the EU, there is no harmonised framework for law enforcement cooperation with service providers. In recent years, Member States have increasingly relied on voluntary direct cooperation channels with service providers, applying different national tools, conditions and procedures. Service providers may accept direct requests from law enforcement agencies (LEAs) for non-content data where permitted by their applicable domestic law. However, the fragmented legal framework creates challenges for law enforcement, judicial authorities and service providers seeking to comply with legal requests, as they are increasingly faced with legal uncertainty and, potentially, conflicts of law.

Cross border access to electronic information requires legal instruments that are capable of efficiently supporting criminal investigations and prosecutions and that, at the same time, have in place adequate conditions and safeguards that ensure full compliance with fundamental rights and principles recognised in Article 6 of the Treaty on European Union, the EU Charter of Fundamental Rights and the European Convention on Human Rights, in particular the principles of necessity, legality and proportionality, due process, protection of privacy and personal data, confidentiality of communications, the right to an effective remedy and to a fair trial, the presumption of innocence and procedural rights of defence, as well as the right not to be tried or punished twice in criminal proceedings for the same criminal offence.

In order to achieve these objectives and overcome the difficulties present in the existing mechanisms of cross-border cooperation, in April 2018 the EU Commission proposed an important legislative package, referred to as “E-evidence”, aimed at facilitating access to e-evidence by European law enforcement agencies (LEAs). The framework contains two legislative measures: a Regulation, which provides two new mechanisms for LEAs’ cross-border access to e-evidence (the European Production Order and the European Preservation Order), to be addressed directly by LEAs of the issuing MS to a service provider; and a Directive, which requires every online service provider “established” in, or with a “substantial connection” to, at least one EU Member State to appoint a legal representative in an EU MS of its choice as an addressee for the execution of the above Orders.

On 7 December 2018 the Council adopted its own draft (known as the Council’s “general approach”) and, after two years of delays caused partly by the EU parliamentary elections and the Covid-19 pandemic, on 11 December 2020 the EU Parliament adopted its position. On 10 February 2021 the ‘trilogue’ procedure between the EU Parliament, the Council and the Commission started with a view to agreeing a common text. In the study cited above, I have analysed in depth the key legal provisions contained in the Commission’s proposal, the Council’s draft and the report of the LIBE rapporteur Birgit Sippel, presented to the EU Parliament in 2020. Given that the E-evidence framework is currently being negotiated, the study’s analysis and findings aim to contribute to achieving the best version of the forthcoming instruments.

The EU E-evidence framework is of particular importance in shaping the future of similar instruments and the terms of cooperation between countries all over the world. To a certain extent, it follows the US CLOUD Act 2018, which itself marks a major change in how cross-border access to e-evidence may develop in the rest of the world. The EU E-evidence framework will influence, and at the same time needs to conform to, a number of new agreements currently being negotiated. In 2019 the EU Commission received a negotiating mandate to reach an agreement between the EU and the US, as well as to shape the second amending protocol to the Cybercrime Convention (CCC). Both of these instruments need to be negotiated from the perspective of the forthcoming E-evidence framework; it is therefore important that the latter offers provisions that increase the efficiency of investigations and prosecutions by overcoming challenges in cross-border cooperation, while maintaining safeguards for the fundamental rights of individuals.

The E-Evidence legislative package lays down the rules under which, in a criminal proceeding, a competent judicial authority in the European Union may directly order a service provider offering services in the Union to produce or preserve electronic information that may serve as evidence, through a European Production or Preservation Order. This framework will be applicable in all cross-border cases where the service provider has its main establishment or is legally represented in another Member State. The framework aims to complement existing EU law and to clarify the rules of cooperation between law enforcement, judicial authorities and service providers in the field of electronic information. The new measures for cross-border access to e-evidence will not supersede European Investigation Orders under Directive 2014/41/EU or Mutual Legal Assistance procedures to obtain electronic information; Member States’ authorities are expected to choose the tool best adapted to their situation. However, authorities of the Member States will be allowed to issue domestic orders with extraterritorial effects for the production or preservation of electronic information that could be requested on the basis of the E-evidence framework.

Despite the expected improvements in the efficiency of investigations and prosecutions through simplified and speedier procedures, the necessity of a new legal framework to organise cross-border access to electronic evidence has been questioned. The proposed e-evidence framework is perceived as adding another layer to the already complex landscape of multiple existing channels for data access and transnational cooperation. While alternative approaches were considered and could have been taken by the Commission, as I have argued in depth elsewhere, a specific framework dedicated to improving access to e-evidence is better suited to achieving that goal than amendments to existing procedures and instruments that are general in scope and do not address the specific challenges related to e-information. Procedural improvements to existing cross-border cooperation instruments are necessary, but not by themselves sufficient to overcome the present difficulties and inefficiencies. It is not possible to respond adequately to novel challenges with old mechanisms embedded in lengthy procedures and bureaucratic complexities. The answer is to provide adequate safeguards that protect fundamental rights and the interests of all stakeholders, suited to the new type of instruments created by the e-evidence framework, albeit not identical to those found in existing mechanisms of transnational cooperation.

The E-evidence model builds upon the existing models of cooperation yet is fundamentally different. The extraterritorial dimension of the framework affects the traditional concepts of territorial sovereignty and jurisdiction. It departs from the traditional rule of international cooperation that cross-border access to electronic information requires the consent of the state where the data is stored. Most importantly, jurisdiction is no longer linked to the location of data. Under the new approach, the jurisdiction of the EU and its MSs can be established over SPs offering their services in the Union; this requirement is met if the SP enables other persons in at least one MS to use its services and has a substantial connection to that MS. In this way the framework avoids the difficulties of establishing the place where data is stored and the “loss of location” problem. The E-evidence framework is a clear example of the development of the concept of territorial jurisdiction in criminal law and of the evolution of the connecting factors that establish it, in line with the requirements of legal certainty.

The extraterritorial reach of judicial and state authorities’ decisions in the E-evidence framework introduces a new dimension to mutual recognition, beyond traditional EU judicial cooperation in criminal matters, which has so far been based on procedures involving two judicial authorities in the issuing and executing State respectively. This important aspect of the e-evidence framework entails a fundamentally different approach, demonstrating the (need for) development of traditional EU law concepts in order to respond to new challenges with adequate mechanisms. From the perspective of the proposed e-evidence framework, the scope of Article 82(1) TFEU requires further clarification from the CJEU or an amendment (albeit a difficult one). Although the framework relies on the principle of mutual trust, the debates surrounding it reveal that in today’s European reality this principle is still an objective to be achieved. For as long as disparities in the standards and protections provided by MSs exist, the way forward should include innovative mechanisms that allow for the control, improvement and maintenance of those standards within each MS, rather than fostering lack of trust, prejudicial treatment and unjustifiable differentiation between MSs within the EU.

The e-evidence framework generally achieves what it sets out to do, namely to increase the effectiveness of cross-border access to e-evidence. The application of the same rules and procedures for access to all SPs will improve legal certainty and clarity for both SPs and LEAs, something currently lacking under the existing mechanisms of cooperation. In several respects the framework serves as a model to be followed in the international arena. However, further improvements can be recommended:

  • Involvement of the enforcing MS should be exceptional only, as proposed by the Council, so that the framework does not replicate the existing judicial cooperation models.
  • The wording of Article 7a in the Council draft could be amended to allow the enforcing MS to raise objections on behalf of any affected state.
  • Service providers should retain their powers to review production and preservation orders, given the unique position they are in to understand the data. A productive dialogue and close cooperation between SPs and the issuing authorities should be promoted from the earliest stages.
  • The framework should specify the definition of e-evidence and should provide for its inadmissibility in cases of breaches of the requirements specified therein.
  • The data categories need to be better defined and brought into line with other EU and international legal instruments, as well as the jurisprudence of the CJEU and the ECtHR. The draft presented by the EU Parliament is a positive step in that direction.
  • Judicial validation of orders issued by non-judicial authorities should be imperative for all types of data as a form of control and safeguard against abuse or overuse.
  • A classification of investigating authorities by means of a schedule in the proposed framework would help to better define the permitted activities within the scope of the Regulation.
  • A provision that clearly prohibits the production or use of e-evidence in cases contrary to the ne bis in idem principle should be included in the final draft.
  • The final instrument should adopt the approach proposed by the Commission regarding confidentiality and subject notification, with an obligation for the issuing authority to inform the person whose content or transactional data are sought in all cases (although delayed notification should be permitted).
  • The right to exercise legal remedies should be extended to the enforcing MS and/or the MS of residence of the suspect.
  • There should be provisions that enable defendants or other parties in the criminal proceedings to access or request e-evidence. The accessibility of electronic data to the suspect’s or defendant’s lawyer should be ensured so that they can assert their rights effectively.

If implemented, these recommendations would improve the e-evidence framework by ensuring a balance between effective criminal investigations/prosecutions and respect for fundamental rights. A balanced and principled approach should be at the core of any existing or forthcoming instruments concerning cross-border access to electronic information.

ICO Targets Companies for Seeking to Illegally Make Profit from the Current Public Health Emergency

Photo by Adomas Aleno

Dr. Alexandros Antoniou, Lecturer in Media Law, University of Essex

On 24 September and 8 October 2020, the Information Commissioner’s Office (ICO), the United Kingdom’s independent body established to uphold information rights, imposed fines on two companies for sending thousands of nuisance marketing texts and unlawful marketing emails at the height of the current pandemic.

In September 2020, Digital Growth Experts Limited (DGEL) was issued with a monetary penalty of GBP 60,000 in relation to a serious contravention of Regulations 22 and 23 of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR). The PECR provide for specific privacy rights in relation to electronic communications. They include rules on marketing calls, emails, texts and faxes; cookies (and similar technologies); keeping communications services secure; as well as on customer privacy in relation to traffic and location data, itemised billing, line identification, and directory listings. Under the 2003 Regulations, ICO has the power to impose a monetary penalty of up to GBP 500,000 on a data controller.

The Commissioner found that between 29 February and 30 April 2020, DGEL had transmitted 16,190 direct marketing texts promoting a hand sanitising product, which was claimed to be “effective against coronavirus”. The company came to the attention of the Commissioner after several complaints were received via the GSMA’s spam reporting tool (the GSMA is an organisation that represents the interests of mobile operators worldwide).

In the course of the investigation, DGEL was unable to provide sufficient evidence of valid consent (as required by PECR) for any of the messages delivered to subscribers over the relevant period. The company’s explanations for its practices and the means by which it had obtained the data used for its direct marketing were found to be “unclear and inconsistent”.

DGEL had also used data obtained via social media ads which purported to offer free samples of the product to individuals, to automatically opt them into receiving direct marketing without advising them that their data would be used for this purpose, and without giving them (at the point the data was collected) a simple way of refusing the use of their contact details for direct marketing.

In October 2020, ICO took action again, this time against a London-based software design consultancy, Studios MG Limited (SMGL), which had sent spam emails selling face masks during the pandemic. The company was fined GBP 40,000 for having transmitted unsolicited communications by means of electronic mail for the purposes of direct marketing, contrary to Regulation 22 of PECR.

More specifically, on 30 April – in the midst of the pandemic – SMGL sent up to 9,000 unlawful marketing emails to people without their permission. SMGL did not hold any evidence of consent for the individuals it had engaged in its one-day direct marketing campaign. ICO held that SMGL’s campaign had been made possible by using “data which had been scraped from various vaguely defined sources”.

ICO’s examination also found that SMGL’s director had decided to buy face masks to sell on at a profit, despite the fact that the company bore no apparent relation to the supplying of personal protective equipment (PPE). Moreover, it was impossible in SMGL’s case to determine the total number of individuals whose privacy had been affected, as the company had deleted a database with key data evidencing the full extent of the volume of emails delivered.

During the pandemic, ICO has been investigating several companies as part of its efforts to protect people from exploitation by unlawful marketing-related data processing activities. The ICO Head of Investigations said in a statement that DGEL “played upon people’s concerns at a time of great public uncertainty, acting with a blatant disregard for the law, and all in order to feather its own pockets.” A hard line was also taken in relation to SMGL. The Head of Investigations stated that “nuisance emails are never welcome at any time, but especially when people may be feeling vulnerable or worried and their concerns heightened.”

This article first appeared on the IRIS Merlin database of the European Audiovisual Observatory and is reproduced here with permission and thanks. Read the original article here.

Human Rights Expert Receives Major Funding to Investigate Impact of Algorithms on Democracy

Photo by Ari He

An Essex human rights expert has been awarded major funding to look at the impact of Artificial Intelligence-assisted decision-making on individual development and the functioning of democracy.

Dr Daragh Murray, from the School of Law and Human Rights Centre, is among the latest wave of individuals to receive funding as part of UK Research and Innovation’s Future Leaders Fellowships scheme. Dr Murray has been awarded over £1 million for an initial period of four years, to examine the impact of Artificial Intelligence (AI) assisted decision-making in a range of areas.

Dr Daragh Murray said: “Governments around the world are already using AI to help make important decisions that affect us all. This data-driven approach can offer key benefits, but it also relies on the ever-increasing collection of data on all aspects of our personal and public lives, representing both a step change in the information the state holds on us all, and a transformation in how that information is used.

“I want to look at the unintended consequences of this level of surveillance – the impact on how individuals develop their identity and how democratic society flourishes. Will a chilling effect emerge that changes individual behaviour? And what might the impact of this be? Will the knowledge that our activities are tracked and then translated into government decisions affect how we, for example, develop our sexual identity or our political opinions? Will we all be pushed towards the status quo in fear of the consequences of standing out?

“Ultimately what will the effect of this be on the well-being of our democracy?”

The Future Leaders Fellowships scheme is designed to establish the careers of world-class research and innovation leaders across the UK.

Dr Murray’s project will be interdisciplinary, working across human rights law, sociology and philosophy.

Dr Murray said: “We will be looking at lived experience in the context of wider discussions about how individuals and societies flourish. The intention is to re-imagine the human rights framework to address this very 21st century problem.”

Dr Murray is currently a member of the Human Rights Big Data & Technology Project, based at the University of Essex Human Rights Centre, and the Open Source for Rights Project, based at the University of Swansea. He was co-author with Professor Pete Fussey of the independent report into the Metropolitan Police Service’s trial of live facial recognition, published in July 2019.

He is a recognised expert in the field of Digital Verification, using open source investigation techniques to verify evidence of human rights abuses. He founded Essex Digital Verification Unit (DVU) in 2016 and co-edited Digital Witness, the first textbook in the field, with Sam Dubberley and Alexa Koenig. In 2019, Essex DVU was recognised with a Times Higher Education Award for International Collaboration of the Year, for its role in Amnesty International’s Digital Verification Corps.

The Fellows appoint mentors. In addition to Essex mentors Professor Lorna McGregor and Professor Pete Fussey, Dr Murray will benefit from the involvement of a stellar group of global experts: Professor Yuval Shany, from the Hebrew University of Jerusalem, is Vice-Chair of the United Nations Human Rights Committee, and Deputy President of the Israel Democracy Institute; Professor Ashley Deeks is a Research Professor of Law at University of Virginia Law School, Director of the School’s National Security Law Center and a member of the State Department’s Advisory Committee on International Law; Professor Alexa Koenig is Executive Director of University of California Berkeley’s Human Rights Center and sits on a number of national and international bodies looking at the impact of technology, as well as the board of advisors for ARCHER, a UC Berkeley-established non-profit that “leverages technology to make data-driven investigations accessible, smarter and more scalable.”

Launching the latest round of Future Leaders Fellowships, UK Research and Innovation Chief Executive, Professor Dame Ottoline Leyser, said: “Future Leaders Fellowships provide researchers and innovators with freedom and support to drive forward transformative new ideas and the opportunity to learn from peers right across the country.

“The fellows announced today illustrate how the UK continues to support and attract talented researchers and innovators across every discipline to our universities and businesses, with the potential to deliver change that can be felt across society and the economy.”

This story originally appeared on the University of Essex news webpage and is reproduced here with permission and thanks.

ICO’s Age Appropriate Design Code of Practice Comes Into Effect

Photo by Igor Starkov

Dr. Alexandros Antoniou, Lecturer in Media Law, University of Essex

On 2 September 2020, the Information Commissioner’s Office (ICO), the United Kingdom’s independent body established to uphold information rights, formally issued its Age Appropriate Design Code of Practice which should be followed by online services to protect children’s privacy.

The Age Appropriate Design Code of Practice, the first of its kind, is a statutory code required under Section 123 of the Data Protection Act 2018 and aims to address the increasing “datafication” of children. The Code was first published on 12 August 2020 and, following completion of its parliamentary stages, it came into force on 2 September 2020. The Information Commissioner, Elizabeth Denham CBE, stated: “For all the benefits the digital economy can offer children, we are not currently creating a safe space for them to learn, explore and play. This statutory Code of Practice looks to change that, not by seeking to protect children from the digital world, but by protecting them within it.”

The Code’s primary focus is to set a benchmark for the appropriate protection of children’s personal data and to provide default settings which ensure that children have the best possible access to online services whilst minimising data collection and use, by default. It sets out 15 standards on data collection and protection, and reflects a risk-based approach. Section 123(7) of the DPA 2018 defines “standards of age-appropriate design” as “such standards of age-appropriate design of such services as appear to the Commissioner to be desirable having regard to the best interests of children.” The 15 standards of the Age Appropriate Design Code include a duty to conduct data protection impact assessments; transparency; policy and community standards; data sharing and minimisation; geolocation; parental controls; nudge techniques; and online tools, among others. For a brief overview of the standards laid out in the Code, see here. Because different services will need to implement different technical solutions, the ICO acknowledges that these are not intended as technical standards, but as a bundle of technology-neutral design principles and practical privacy features.

These principles apply to any online products or services (including, for instance, educational websites, social media platforms, apps, online games, and connected toys with or without a screen) that process personal data and are likely to be used by children under 18 in the UK; therefore, they are not limited to services specifically aimed at children. The Code covers entities based in the UK as well as entities based outside of the UK if their services are provided to (or monitor) users based in the UK. Services provided on an indirect charging basis (for example, funded by advertising) also fall within its remit.

The ICO and the courts will take the Code into account in determining whether the GDPR and PECR requirements have been met for the purposes of enforcement action. Although the Code is now in effect, the industry has been given a 12-month implementation period to get up to speed and introduce suitable changes. After a year in force, the ICO will undertake a review of the Code and its effectiveness.

This article was first published in the 9th issue of IRIS Legal Observations of the European Audiovisual Observatory and is reproduced here with permission and thanks.

The Oxford Statement on International Law Protections Against Foreign Electoral Interference through Digital Means

Photo by Joshua Sortino

Dr. Antonio Coco, Lecturer in Law at the University of Essex, has co-drafted The Oxford Statement on International Law Protections Against Foreign Electoral Interference through Digital Means, which has been signed by 139 international lawyers so far.

The Statement is the third in a series — informally known as the “Oxford Process” — aiming to clarify the rules of international law applicable to cyber operations which threaten areas of pressing global concern.

The first Statement (May 2020) concerned the protection of the healthcare sector. The second Statement (July 2020) focused on the protection of vaccine research. The third and most recent one (October 2020) tackles foreign electoral interference, and can be read at EJIL:Talk!, Opinio Juris and Just Security.

Reforming Cybercrime Legislations to Support Vulnerability Research: the UK Experience and Beyond

CODE BLUE (29-30 October 2020) is an international conference where the world’s top information security specialists gather to give cutting-edge talks, and a place for all participants to exchange information and interact beyond borders and languages. As technology and society move forward and the IoT (Internet of Things) becomes a reality, security is increasingly an urgent issue. The Internet world also needs to gather researchers to collaborate and think together about ways to respond to emergency situations, and to come up with possible solutions. CODE BLUE aims to be a place where international connections and communities form and grow, and will contribute to a better Internet world by connecting people through CODE (technology), beyond and across the BLUE (oceans).

This year, Dr Audrey Guinchard (Senior Lecturer in Law, University of Essex) gave a keynote on ‘Reforming cybercrime legislations to support vulnerability research: the UK experience and beyond’.

Cybercrime legislations – or hacking laws – tend to be notoriously broad, resting on a set of assumptions about what ‘unauthorised access’ means, assumptions which hardly match those of the technical or ethical fields. The result is that the offences of unauthorised access and misuse of tools have the potential to criminalise most aspects of legitimate vulnerability research (discovery, proof of concept, disclosure). Independent security researchers are notably at risk of criminal prosecution as they work, by definition, without vendors’ prior authorisation.

The UK is a particular case in point, having drafted its original Computer Misuse Act 1990 in such a way that even switching a computer on can constitute unauthorised access. Further reforms in 2006 and 2015 have expanded the scope of the legislation even more by modifying or adding other offences as broad in scope as the original ones. While the UK is in that respect an outlier, the EU Directive 2013/40/EU on attacks against information systems, as well as the Council of Europe Convention on Cybercrime (ETS No. 185), the de facto international treaty on the subject, are not without their own weaknesses, despite serious and effective efforts to restrict the scope of criminal law and protect security researchers.

Prosecution guidelines or a memorandum of understanding between the security industry and prosecutorial authorities are a welcome step towards avoiding outlandish prosecutions of security researchers, but Dr Guinchard argued that they are not sufficient to protect researchers once a prosecution starts. Their motive (and the methods used) to improve security will not constitute a legal argument unless a public interest defence exists.

Hence Dr Guinchard’s proposal to reform the cybercrime legislations (UK, EU and the Convention) by incorporating a public interest defence into cybercrime offences, in particular the ‘hacking’ offence (unauthorised access). Momentum is certainly gathering in the UK. The Criminal Law Reform Now Network (CLRNN) has now released a comprehensive study of the UK Computer Misuse Act with a series of recommendations. It is time to make cybercrime legislations fit for the 21st century, to borrow the slogan of a significant part of the UK security industry endorsing the report and the reform.

To read some of Dr Guinchard’s research papers which formed the background of this research, please see here and here.

Internet Safety Expert Recognised with OBE

Photo by Rami Al-zayat

An Essex legal expert has been recognised in the Queen’s Birthday Honours for her work on internet safety.

Professor Lorna Woods, from our School of Law, has been working since 2017 with William Perrin of the Carnegie UK Trust to develop a workable solution to ‘online harms’, a term covering a range of internet safety issues. Professor Woods and Mr Perrin are both to receive OBEs.

Professor Woods said: “I am delighted, if a little surprised, by this honour. I’d like to thank Will, of course, but also Maeve Welsh and everyone at the Carnegie UK Trust – without their support, we would not have been able to develop our approach further or undertake the vital, ongoing engagement with those working in this area.

“Recent events have raised new concerns about the role of social media. The need for a statutory duty of care, overseen by an independent regulator, is not going away. In fact, it is more urgent than ever. We look forward to publication of the promised Online Harms Bill, and its consideration in this parliament.”

In October 2017, Professor Woods and Mr Perrin sat down to review the just-published Green Paper on Internet Safety Strategy.

Near-daily stories of bullying, self-harm and extremism had created a febrile debate. The challenge? To reset the online world and reduce the risk of harm.

The pair agreed the government response was inadequate. Drawing on their experience of the sector, they consulted with a range of actors, researched models already in use and started to write.

Across seven co-authored blogs, completed between February and May 2018 (and subsequently collected into a report, with funding from The Carnegie UK Trust), they sought to shift the debate from “publishing” and the removal of specific content to harm prevention, developing a detailed plan involving a statutory duty of care, overseen by an independent regulator.

The duty of care approach re-casts social media as a series of “public or quasi-public spaces”. In creating these spaces, the providers’ goal must not be maximising profit or engagement, but user safety. The more vulnerable an audience, the greater the responsibility.

At a time of significant public concern, their research has been a game-changer, offering a workable solution, inspiring a national newspaper campaign, rallying civil society groups and influencing lawmakers, at home and abroad.

In December 2019, they published their own draft Online Harm Reduction Bill, to maintain momentum. The draft bill was endorsed by organisations including the NSPCC, 5Rights Foundation, The Institute for Strategic Dialogue and the Royal Society of Public Health.

In January 2020, the authors and the Carnegie UK Trust also supported Lord McNally in the preparation of a short paving Bill to require Ofcom to prepare for the introduction of an Online Harms Reduction Regulator. The paving Bill was introduced into the Lords on 14 January 2020 and is currently awaiting a second reading.

Four Essex graduates have also been recognised in this year’s Queen’s Birthday Honours:

  • Dr Philip Orumwense (MA Political Behaviour, 1991) will receive a CBE for public service. Philip was Commercial Director of IT at Highways England and is recognised for his work across the public sector.
  • Sir David Attenborough (Honorary Graduate) has received a GCMG for his services to broadcasting and conservation.
  • Miss Carrie Anne Philbin (BA History, 2002) has received an MBE for services to education, championing diversity and inclusion in computing.
  • Ms Clare Woodman (BA Government & Sociology, 1989) has received a CBE for services to finance in her role as Head of EMEA and CEO of Morgan Stanley & Co. International PLC.

This story originally appeared on the University of Essex news webpage and is reproduced here with permission and thanks.