#CopyrightWeek: Online Platforms' Catch-22 with the EU Data Protection Regulation
|[Note: These infographics are being published in the context of the 2018 #CopyrightWeek, in light of the theme of 19 January on 'Safe Harbors', which revolves around the idea that "safe harbor protections allow online intermediaries to foster public discourse and creativity. Safe harbor status should be easy for intermediaries of all sizes to attain and maintain".]|
The Save the Link campaign created the two infographics above for this occasion.
The EC's proposal for a Directive on Copyright in the Digital Single Market tries, under Article 13, to force 'voluntary' private policing and filtering duties on online platforms for all user-uploaded content. The wording is carefully chosen: it leaves online platforms no choice but to implement content-filtering technology, because they are obliged to 'prevent' the availability of certain content. Preventing this content from becoming available on their platform can only be achieved by filtering it at the source, namely during upload. In this way, the EU legislators are trying to coerce online platforms into taking 'voluntary' filtering measures while themselves going scot-free, because they bypass the safe harbour provisions in Article 15 of the e-Commerce Directive, which prohibits Member States from imposing general monitoring obligations.
All online platforms are thus forced to implement automated upload filters that decide whether a user's content goes up or not. This means the proposal is set to clash with another piece of legislation, the new EU data protection law: the General Data Protection Regulation (GDPR), which becomes applicable in May 2018. Article 22 of the GDPR aims to protect users against the impact of automated decision-making. There is, however, an exception for measures that are required by law. But, as explained above, Member States cannot impose such an obligation, as this would go against the e-Commerce Directive.
So, copyright legislation is turning into a Catch-22 situation (i.e. a situation where one is faced with contradictory rules) for online platforms:
- on the one hand, Article 13 forces all online platforms to 'voluntarily' filter all user uploads, as this is the only way to 'prevent' certain content from being uploaded, but this breaches Article 22 of the GDPR; and,
- on the other hand, when 'voluntarily' filtering this content, online platforms cannot rely on the exception in Article 22 of the GDPR for measures imposed by law, as Member States are not allowed to impose such a filtering obligation under Article 15 of the e-Commerce Directive.
The conclusion: if this filtering obligation is mandatory, it violates the e-Commerce Directive; if it is voluntary, it violates the GDPR. Online platforms are thus caught between a rock and a hard place, violating either Article 13 of the copyright Directive or Article 22 of the GDPR.
Moreover, as depicted by these infographics, content filtering solutions:
- are imperfect, as evidenced by Sebastian Tomczak's (@littlescale) recent White Noise copyright case;
- foster self-censorship; and,
- are expensive, most of all for startups.
More flaws of Article 13 are described here.
Read also more about the need to preserve the e-Commerce Directive, and about the tensions between the copyright reform and the GDPR.
|These infographics were created by Ruth Coustick-Deal and Marianela Ramos Capelo of OpenMedia, and are made available under a Creative Commons BY-NC-SA 4.0 licence.|