MEP Voss’ proposal on the censorship machine (art. 13): not ‘die beste Idee’ either
In an interview related to the press publisher’s right (Article 11 of the proposed Directive on Copyright in the Digital Single Market) given to Golem.de, MEP Axel Voss, the Rapporteur on the file in the lead European Parliament Committee, stated that this new right was not ‘die beste Idee’ but that he couldn’t think of another one.
With the leaked proposal of his ‘compromise’ amendment on Article 13, aka the ‘censorship machine’ or ‘value gap’ proposal, he seems to be replicating a pattern: faced with a bad idea, he makes no effort to improve it and merely copy-pastes the worst bits of all worlds to give the impression of listening to everyone. Definitely ‘nicht die beste Idee’! Especially considering two European Parliament committees (including one that co-leads on this article) previously proposed alternatives that created a much better balance than the initial European Commission text (see our analysis of the IMCO and LIBE Committee Opinions).
The result: a ‘mille-feuille’ monstrosity. For those not privy to this culinary delight: ‘The mille-feuille, vanilla slice or custard slice, also known as the Napoleon, is a French pastry whose exact origin is unknown’, according to Wikipedia (one of the platforms that falls under the Article 13 provisions). Wikipedia adds: ‘Traditionally, a mille-feuille is made up of three layers of puff pastry (pâte feuilletée), alternating with two layers of pastry cream (crème pâtissière), but sometimes whipped cream or jam are substituted; when substituted with jam this is then called a Bavarian slice.’
So what’s on offer in MEP Voss’ Bavarian slice?
Layer 1: Thou shalt license all content – whatever a platform does, it will be directly liable for ‘communicating to the public’ the content uploaded by its users (Art 13, 1a)
Under the newly added paragraph 1a of Article 13, websites and apps that allow users to upload content must acquire copyright licences for that content.
Concretely, this means these platforms are considered to be ‘communicating to the public’ the content uploaded by their users, and are hence directly liable for any content that infringes copyright rules, as if they had uploaded it themselves. If a licence is obtained, it only covers the user if they uploaded the content for ‘non-commercial’ purposes. In other words, rightholders can still sue commercial users with social media accounts, on top of the platforms.
Even the European Commission did not dare suggest such an impossible obligation to license all content, as compliance is unfeasible: whom do you license what from? Not every type of copyrighted content has collecting societies representing its rightholders. In her initial reaction to this proposal, MEP Julia Reda notably mentions GitHub, the platform dedicated to hosting software. But the same is true for a multitude of services covered by this sweeping provision, as illustrated by EDiMA in the infographic below. Just imagine the blogging platform WordPress having to license all the content uploaded by its bloggers. And what about massive open online course (MOOC) platforms? Or recipe-sharing platforms (honestly, don’t touch the food ones!)?
Looking at the variety of content these platforms accept, the sheer impossibility of the licensing obligation is obvious, and the threat to platforms in the EU evident.
It is also important to note that whilst MEP Voss mentions some carve-outs to the filtering obligation contained in the rest of the Article (namely for internet access providers, online marketplaces, cloud services that do not allow uploads to be publicly accessible, and research repositories), these carve-outs do not apply to the licensing obligation itself. And how better to ensure no unlicensed content appears on a platform than by filtering it (this is truly becoming the ‘copyright circle of life’)?
Layer 2: On top of licensing, thou shalt filter
Removing the specific ‘content recognition’ wording does not mean the obligations imposed by the text do not entail the use of automated filters. And replacing ‘large amounts’ of content with ‘significant’ amounts does not change the game much: it actually makes the criterion even more subjective (after all, when one mentions one’s ‘significant other’, that is a highly subjective criterion that does not necessarily say anything about the person’s size). It certainly still does not link the notion of potential harm to copyright holders with the need to intervene.
Moreover, whilst content recognition is no longer explicitly stated in the Article itself, it is still mentioned in the accompanying Recitals, so I guess the elephant in the room is difficult to fully hide.
Combined with the provisions on direct liability in paragraph 1a, this filtering obligation will clearly be implemented by platforms in the most conservative manner, as companies tend to minimise any risk of liability. The chilling effects on speech and creativity in Europe will be tremendous, as will be the effect on other countries that will use this new ‘censorship model’ proposed by the EU as a great opportunity to justify their own censorship practices.
Layer 3: When looking at everything to find something, one is not looking at everything
That argument is so weak that we do not want to spend too much time on it. It basically relates to the fact that the e-Commerce Directive prohibits Member States from imposing ‘general monitoring’ obligations on hosting providers. This prohibition was further clarified in several Court of Justice of the European Union (CJEU) cases, yet this Article desperately tries to bypass it through a variety of tricks, which are not even that innovative: the European Commission used the same line of argumentation in the Scarlet v SABAM case in 2010 and got slammed by the CJEU (see CJEU Case C-70/10).
The fallacy is as follows: to pretend that no general monitoring occurs, one claims the monitoring is specific because it looks for specific infringements. This line of thought totally ignores the fact that the ‘general’ nature of the monitoring does not relate to ‘what’ is monitored but to ‘who’ is monitored. In other words, when the obligation requires monitoring all user uploads to find some specific piece of content, that is still a general monitoring obligation. This view was also confirmed in an analysis by the Max Planck Institute, prepared in light of the concerns expressed by a number of Member States.
The jam component: some sweet words about fundamental rights to make everyone feel better
After having proposed all of these principles, MEP Voss probably thought that the pill was a bit too bitter to swallow, and that’s where the jam component kicks in: a sprinkle of fundamental rights, a mention of privacy, some wishful thinking on complaint processes, some regrets about filters blocking lawful content covered by exceptions… All well and good, but insufficient to counterbalance the core of the provision and its threat to EU citizens and businesses alike.