
Upcoming Platform Regulation: the EU’s Digital Services Act and the UK’s Online Harms Bill


The activities of platform operators are increasingly subject to scrutiny from users and regulators alike, both seeking to maintain the “surface tension” between freedom of choice and speech on the one hand and, on the other, concerns around online bullying, racist remarks, copyright infringement, the sale of counterfeit products and other illegal behaviour.

This short piece summarises, at a high level, proposed “policing the internet” regulation in the EU and the UK, focussing on the new proposals to update the existing “safe harbours” for caching, hosting and mere conduit.

The EU’s Digital Services Act

  1. Introduction. The EU’s draft Digital Services Act (the ‘DSA’) is the European Commission’s proposed regulation applicable to online intermediary services. Its aim is to protect consumers and their fundamental rights while also requiring platform operators offering services in the EU to act in a transparent and accountable manner.

Not all intermediaries are subject to the same rules: the number and weight of the obligations increase as the intermediary serves more consumers with a broader set of services. This proposed ramp-up means that a social media platform such as Facebook or Twitter will be subject to more, and more onerous, obligations than an internet service provider such as BT, as illustrated by the following table of obligations (see here).

The goal is to ensure that the providers used by the most individuals are subject to additional, increasingly onerous rules, including transparency as to how the intermediary makes decisions: for example, explaining why content has been removed, reporting on the removal of content, informing users of any restrictions on the use of their data and of the filtering or moderation techniques used, and making disclosures around ad display and targeting.

Very large platforms will also be required to analyse the systemic harms arising from the use of their platforms, to submit to audits, to share the parameters of their decision-making methodologies, to publish details of all ads posted on the platform, to appoint a compliance officer, and to implement codes of conduct and crisis response protocols. The European Commission will also have regulatory oversight.

 

The obligations are cumulative: each category of provider is subject to its own obligations in addition to those of the categories before it.

Intermediary services (network infrastructure):
- Transparency reporting
- Requirements on terms of service taking due account of fundamental rights
- Cooperation with national authorities following orders
- Points of contact and, where necessary, legal representative

Hosting services (cloud and webhosting services), in addition to the above:
- Notice and action and obligation to provide information to users

Online platforms (online marketplaces, app stores, collaborative economy platforms and social media platforms), in addition to the above:
- Complaint and redress mechanism and out-of-court dispute settlement
- Trusted flaggers
- Measures against abusive notices and counter-notices
- Vetting credentials of third party suppliers (‘KYBC’)
- User-facing transparency of online advertising
- Reporting criminal offences

Very large platforms (platforms reaching more than 45 million consumers in Europe), in addition to the above:
- Risk management obligations and compliance officer
- External risk auditing and public accountability
- Transparency of recommender systems and user choice for access to information
- Data sharing with authorities and researchers
- Codes of conduct
- Crisis response cooperation

Although the regime is asymmetrical, the DSA will require most intermediaries to take implementation steps and to update and refresh existing practices and procedures to meet its new requirements, probably at significant cost to businesses in the short to medium term. The sanctions for non-compliance, however, are significant: fines of up to 6% of annual global income.

  2. The “exceptions” to liability. Given the imposition of greater responsibility on intermediaries, the preservation of the safe harbours against liability is to be welcomed. Helpfully, the DSA re-baselines the exceptions, first introduced in the E-Commerce Directive, to remove the differences in approach between member states, and clarifies that voluntary monitoring by intermediaries does not disapply the exceptions – a point which was not uniformly applied under the national implementations of the E-Commerce Directive.

  3. Legislative Process. The DSA is currently under review by the European Parliament’s Internal Market and Consumer Protection Committee (IMCO).

The UK’s Online Harms Bill

  1. Unlike the EU, the UK government has not yet published a draft of the Online Harms Bill. However, its stated intention is to create a new duty of care to (a) prevent the proliferation of illegal content and activity online and (b) ensure that children who use the services are not exposed to harmful content. The duty of care will apply extra-territorially to search engines and service providers anywhere in the world whose users are located in the UK and who host user-generated content and/or facilitate online interaction between users. This includes public communication channels as well as services where users expect a greater degree of privacy, such as online instant messaging services and closed social media groups. An additional, smaller subset of tech companies will be obliged to report on what they are doing to tackle activity and content that is harmful to adults.
  2. The duty of care will require companies to “take reasonable steps to keep users safe, and prevent other persons coming to harm as a direct consequence of activity on their services”. The online harms currently within scope are as follows (see here):
Harms with a clear definition:
- Child sexual exploitation and abuse
- Terrorist content and activity
- Organised immigration crime
- Modern slavery
- Extreme pornography
- Revenge pornography
- Harassment and cyberstalking
- Hate crime
- Encouraging or assisting suicide
- Incitement of violence
- Sale of illegal goods/services, such as drugs and weapons (on the open internet)
- Content illegally uploaded from prisons
- Sexting of indecent images by under 18s (creating, possessing, copying or distributing indecent or sexual images of children and young people under the age of 18)

Harms with a less clear definition:
- Cyberbullying and trolling
- Extremist content and activity
- Coercive behaviour
- Intimidation
- Disinformation
- Violent content
- Advocacy of self-harm
- Promotion of Female Genital Mutilation (FGM)

Underage exposure to legal content:
- Children accessing pornography
- Children accessing inappropriate material (including under 13s using social media and under 18s using dating apps; excessive screen time)

Companies within scope will be expected to comply with regulatory codes and to comply proactively with the new duty of care. This may include taking prompt action following complaints of illegal activity, providing support (via a third party) for users who have suffered harm, preventing the dissemination of known terrorist content, and supporting the police and prosecutors in pursuing criminals.

A failure to comply may result in a fine of up to the higher of 10% of global turnover or £18 million.

  3. The “exceptions” to liability. The government has reviewed the safe harbour exceptions provided by the E-Commerce Directive. In its opinion, the current regime is “not the most effective mechanism for driving behavioural change by companies. The existing liability regime only forces companies to take action against illegal content once they have been notified of its existence.” It is likely, therefore, that the UK government will introduce specific monitoring obligations for limited categories of illegal content while increasing the responsibility imposed on service providers – although details are scant at the time of writing.

Comment

Although both the UK and the EU are seeking to tighten the rules applicable to online intermediaries, their approaches differ: the EU is adopting an asymmetrical model that imposes specific, defined obligations with broad exceptions, whereas the UK proposes to build on the existing English law concept of a “duty of care”, with more onerous monitoring obligations and a potentially narrower set of exceptions. This potential for divergence, made possible by Brexit, may result in complex compliance issues for companies in an area where technology changes frequently and where those companies may need to comply with both UK and EU rules in a way not envisaged pre-Brexit.

 

 
