Tech Policy Trends 2020 | The Digital Services Act: The Next GDPR?

Posted on 6th January 2020
Mike Laughton
Policy Manager
Héloïse Martorell
Policy Analyst

Emboldened by the GDPR, the EU is increasingly confident in its ability to set global standards for technology regulation. With the new European Commission in place, 2020 will be a year defined by the Digital Services Act (DSA). In Brussels and for much of the industry, the question is not whether to hold digital companies accountable for the actions of their users, but rather which parts of the digital sector and to what extent. The process will take time but will be the defining digital regulation of the decade, as it tackles subjects such as the rights of consumers, censorship, the free market, and the responsibility of online platforms. 

The danger for industry and users alike is that, in tackling the serious challenges associated with the propagation of harmful content online – through platforms owned by Facebook, Google and Twitter – the competitive landscape is undermined, and businesses further removed from end-users are caught within the scope. 

The Digital Services Act is timely, as European countries have already pushed their own initiatives to regulate online platforms. Germany blazed the trail of content regulation with the Network Enforcement Act (NetzDG) in 2017 and is looking to expand its provisions. France’s Senate is currently debating its own hateful content regulation (Loi Avia). The UK, now decidedly leaving the EU on 31 January, continues to provide inspiration for its erstwhile partners with the Online Harms White Paper, the Age Appropriate Design Code and even the now-abandoned age-gating proposals of the Digital Economy Act 2017, currently being explored as a model in Paris. 

A pan-EU initiative is welcome, at least in the sense that it will attempt to harmonise 27 national regimes. It will regulate social media platforms, search engines, video gaming platforms, and online marketplaces. First and foremost, it will upgrade liability and safety rules for digital platforms, services and products to incentivise companies to remove illegal and harmful content. Such content can range from hate speech to fraudulent products listed by third-party sellers. Current rules on intermediary liability lie within the eCommerce Directive. That rulebook, adopted in 2000, took a laissez-faire approach, now outdated for today’s range of services, consumption rates, and the known dangers of the proliferation of user-generated content. Under the DSA, platforms will be subject to further obligations in the form of ‘notice and take down’ orders or a ‘duty of care’, potentially requiring companies to use upload filters. This must be balanced by the need to avoid Internet intermediaries becoming, in effect, publishers – maintaining safe harbours as much as possible. 

While we do not expect legislation to be complete in 2020, this year will be to a large extent when the lines around the initial proposals are drawn. Businesses need to engage now to ensure that the new Commission understands the plethora of services that it is due to regulate. While the work will be led by Internal Market Commissioner Thierry Breton, it will become a joint effort across the College. With policy issues like consumer protection, disinformation, workers’ rights in the gig economy, and competition also on the agenda, businesses will need to widen engagement efforts to the cabinets of Didier Reynders (Justice), Věra Jourová (Values and Transparency), Nicolas Schmit (Employment), and Margrethe Vestager (Digital and Competition). Meanwhile, businesses must also be aware of the risk of the Digital Services Act becoming a belated Christmas tree bill, where policy-makers in the Council and European Parliament can re-open old arguments concerning copyright or privacy. Of immediate concern to businesses is the expected consultation and communication on the scope of the DSA in the first quarter of 2020, followed by the first legislative proposals in the latter part of the year. 

Both policy-makers and the public have drawn the line after platforms failed to act responsibly out of moral conscience rather than legal duty. The sector, having now exhausted all avenues for obtaining self- or co-regulatory regimes, should prepare for sweeping regulatory oversight. It would also be a mistake to assume this is merely a problem for the hyperscale online platforms. The proposals drawn up in the Digital Services Act will affect not only the entire technology sector, but potentially any service relying on user-generated content to any extent. 
