Saturday, 18 August 2018

(English) #CopyrightWeek: Censorship & the Big Little Lies of the EU © Directive


[Note: This analysis is being published in the context of the 2018 #CopyrightWeek, in light of the theme of 18 January on 'Copyright as a Tool of Censorship', which revolves around the idea that: 'freedom of expression is a fundamental human right essential to a functioning democracy. Copyright should encourage more speech, not act as a legal cudgel to silence it'.]

The discussions at EU level on the review of EU copyright are dangerously intertwining copyright and the E-commerce Directive (ECD). Looking specifically at the proposal for Article 13 of the Copyright Directive in the Digital Single Market (DSM), aka the Censorship Machine, one cannot help hearing the lyrics of 'Killing Me Softly' in the background when thinking of what copyright is doing to freedom of expression and how it is making censorship the norm. Though the 'softly' may have to be requalified in light of the latest document issued by the Bulgarian Council Presidency a couple of days ago (more on that below).

A quick word on procedure: a Bulgarian ‘coup’?

On 1 January 2018, Bulgaria took over the Presidency of the Council of the European Union from Estonia (these handovers happen every six months). The latter had set a brisk pace of working group meetings with copyright attachés and national experts to redraft the various articles of the Copyright Directive. This led to a variety of iterations of Article 13 (see our blog posts here and here), the latest version running to over 7 pages once the Recitals are added to the mix. The result, as presented by the Estonians in December, is a monstrosity of new definitions and principles (see our analysis here), with carve-outs, clarifications and contradictions, all packaged under the interesting caveat by the Estonians at the end of their Presidency that:

« At this stage we have chosen not to provide for an explicit clarification as to whether the online content sharing services are excluded from Article 14 ECD as a number of questions remain open and further analysis is needed as to whether and how we can reconcile the different wishes that the delegations have expressed »

Talk about the Elephant in the room!

Anyway, as every proposal opened a new Pandora's box of questions and complexities, the Bulgarians, it seems, have decided to go for the steamroller approach by removing (most of) the technical experts from the room and pushing for a decision at a 'political level' on the most complex issues under discussion. In other words: 'move over expertise and common sense, let the horse trading begin!'

A healthier alternative would surely have been to take a step back, consider all the questions that escaped Pandora's box over the past six months, and decide that maybe more analysis was needed, in light of the weakness of the European Commission's (EC) impact assessment and the growing concerns of all stakeholders.

On 30 November over 80 organisations – representing human and digital rights, media freedom, publishers, journalists, libraries, scientific and research institutions, educational institutions including universities, creator representatives, consumers, software developers, start-ups, technology businesses and Internet service providers – sent a letter listing 29 prior letters which voiced their concerns. Since then, the telecoms operators represented by ETNO and the GSMA have also entered the fray, highlighting notably the need to preserve the prohibition of a general monitoring obligation that stems from Article 15 ECD.

Making intermediaries directly liable for the content their users upload: the brutal switch from secondary to primary liability or how Censorship becomes the norm

Article 13, as initially proposed by the EC, was bad in many ways (see the background section below) but somehow attempted to give the impression that (some of) the ECD's core principles might still survive.

With the Bulgarians putting questions to a political poll, such as 'should there be a clarification in Article 13 that service providers that store and give access to user uploaded content perform, under certain conditions, an act of communication to the public', the debate takes on an entirely new dimension.

To fully understand what is at stake and why this goes way beyond copyright, one has to understand the key building block on which intermediary liability has rested so far under the ECD's provisions. An October 2017 study by the European Parliament's Policy Department gives a good summary of the situation created by the ECD: 'The eCommerce Directive (2000/31/EC) provides certain categories of Internet intermediaries with a limited exemption from secondary liability, i.e., liability resulting from illegal users' behaviour'.

The reasons why this exemption was put in place are multiple, as set out by the study:

  1. on the one hand, to promote the activity of the intermediaries and preserve their business models; and,
  2. on the other, to prevent 'excessive collateral censorship' (i.e., preventing the intermediaries from censoring the expressions of their users). The last rationale pertains to the fact that secondary liability could induce intermediaries to excessively interfere with their users: 'the fear of sanction for illegal activities of the users could induce intermediaries to impede or obstruct even lawful users' activities'.

In other words and to make it simple: if websites hosting user uploaded content are deemed to be communicating that content to the public, they become directly liable for that content and will hence (1) censor any remotely doubtful content (including legal content), and/or (2) only accept content from big companies willing to sign a contract stating they will come to the (financial) rescue in case of conflict with rightholders.

Both outcomes would mean the end of the Internet as we know it, and an alternative that is based on big companies vetting the business model of other big companies.

Putting an End to Copyright’s Big Little Lies: the red line must be the preservation of the E-commerce Directive and safeguards against censorship

The debate so far has been filled with 'Big Little Lies', the biggest being that the ECD was not under threat, whilst it very much and increasingly is, as each version of Article 13 morphs into a more dangerous variant. Such a threat targets all EU citizens and businesses in their daily behaviour: will we as citizens still be able to share our kids' videos, our ideas for Ikea hacks, our open software code, our brilliantly tasty recipes… And will our businesses be able to compete with the existing (mostly US-based) established companies that might have pockets deep enough to face the risks this whole discussion entails?

More importantly: will the European Union be faced with a ‘Chinese Great Firewall’ of its own making, put there for the sake of a few rightholders and to the detriment of the general interest? Surely, that cannot be what the EU is about?

As we set out in our Christmas story, we think it is not too late to reconnect with legislative wisdom. The route forward could be to start again from the EC's initial proposal and to work away all the legal uncertainties it contains, by clarifying that:

  1. the intermediary liability regime set in place by the ECD still fully applies: no direct liability gets introduced for copyright, either directly or through a spectrum of confusing provisions which at best remove all legal certainty and at worst expressly kill the ECD. This does not mean the ECD should not be looked at in the future, but that it can only be modified after a thorough analysis of the impact of such changes on the entire Internet ecosystem, based on input from all affected sectors and all relevant experts.
  2. intermediaries must ensure that they remove expeditiously any copyright infringing material brought to their attention.
  3. content recognition type technologies cannot be imposed by law, either because a provision requires them or because compliance with a provision can only be achieved through them.
  4. where content recognition technologies have been implemented, user safeguards in terms of complaint procedures and judicial redress will be guaranteed and enforced.

See, simple!


Background: Summary of the issues at stake in the European Commission original proposal on Article 13

Article 13 establishes an obligation to put in place privatised censorship of all content by an undefined number of online companies following an undefined procedure.

There are so many loose and dangerous ends in the proposed Article 13, that we have tried to summarize them in the table below:

Text proposed by the European Commission: 'Information society service providers that store and provide to the public access to large amounts of works or other subject-matter uploaded by their users shall,'

Who?

Online players that store large amounts of user uploaded content can cover a lot of very different types of players, ranging from commercial platforms to non-profits, and can cover all types of hosted content, ranging from:

·         videos (YouTube, Vimeo, Dailymotion),

·         blogs (Tumblr, WordPress),

·         crowdsourced information (Wikipedia),

·         social media (Facebook, Twitter),

·         documents (Dropbox, Google Drive),

·         pictures (Flickr), etc.

What?

This covers all sorts of creations, ranging from literary works, music, choreographies, pantomimes, pictures, graphics, sculptures, sound recordings, architectural works, etc.

→ So this is not confined to Content ID type software used on YouTube, which only scans music and video uploads to identify copyright infringements.

It also covers content uploaded by a user who is the rightholder of that content or who has the right to upload it under an exception or limitation under EU law, as there is no mention of whether the content has been uploaded rightfully or not.

Text proposed by the European Commission: 'in cooperation with rightholders,'

Who?

Rightholders covers a vast reality, ranging from big labels or the Hollywood studios to every individual creator who has not signed away his or her rights. This is a lot of people to sit around a table and 'cooperate' with, especially if you are a smaller company that would prefer to hire engineers rather than lawyers. Online companies could have to deal with thousands of claimants all wanting a share of their revenues, or simply face the prospect of such interactions and hence have a less attractive business case to defend before investors.

Collecting societies could maybe be used to decrease the number of stakeholders involved, but they are not always known as smooth negotiators and do not necessarily represent the interests of all rightholders.

Cooperation?

What does that even mean when your interests are not necessarily aligned? And where are the users in this relationship?

Text proposed by the European Commission: 'take measures to ensure the functioning of agreements concluded with rightholders for the use of their works or other subject-matter'

To do what?

The obligation here is to comply with an agreement of the rightholder, regardless of whether that agreement relates to actual copyright infringements or not. It also implies that online platforms 'use' the works that are uploaded by their Internet users, a qualification which is not that clear-cut from a legal point of view and is aimed at pretending they are not just 'hosting' the material.

Text proposed by the European Commission: 'or to prevent the availability on their services of works or other subject-matter identified by rightholders through the cooperation with the service providers.'

The works and other subject matter to be 'filtered' are those identified by rightholders. How that identification occurs is not stated, nor how claims of rights are checked (it is not unusual for several people to claim they have the rights over the same work, and in some cases, all of their claims are true).

Text proposed by the European Commission: 'Those measures, such as the use of effective content recognition technologies,'

This language seems to point directly towards the type of Content ID software used by YouTube, even though the scope of what needs to be recognized goes dramatically beyond what Content ID is capable of handling.

Moreover, such automated tools can only match one file to another, and do not have the capability to recognise more complex issues, such as the fact that, whilst a copyright protected file might have been used by a user, it does not infringe the rightholder's copyright as it falls under an exception recognised by law (for example, parody).

Text proposed by the European Commission: 'shall be appropriate and proportionate.'

Safeguards?

Not really. Seeing that all of these measures will be (1) decided by private companies and (2) fall under the terms and conditions of the websites, the 'appropriate and proportionate' nature of the implemented measures is left to the appreciation of those private companies, with no control by judicial or administrative instances, nor by consumer representatives. This interpretation seems confirmed by Recital 39 of the proposed Directive.

Text proposed by the European Commission: 'The service providers shall provide rightholders with adequate information on the functioning and the deployment of the measures, as well as, when relevant,'

Who will judge relevance? The rightholders is our best guess.

Text proposed by the European Commission: 'adequate reporting on the recognition and use of the works and other subject-matter.'

So aside from investing money into censorship tools, online companies must also make sure they come up with reports to please the rightholders.
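The point that matching files is not the same as judging infringement can be illustrated with a deliberately naive sketch in Python. This is purely our own illustration, not how Content ID or any real recognition system works: the chunk-hash 'matcher', the sample strings and the 50% threshold are all invented for the example. A hypothetical parody gets flagged simply because it reuses bytes of the original work, even though parody is a lawful exception.

```python
import hashlib

def fingerprint(content: bytes, chunk_size: int = 8) -> set:
    # Toy fingerprint: hash fixed-size chunks of the raw content.
    return {
        hashlib.sha256(content[i:i + chunk_size]).hexdigest()
        for i in range(0, len(content), chunk_size)
    }

def flag_upload(upload: bytes, reference: set, threshold: float = 0.5) -> bool:
    # Flag the upload if enough of its chunks match the reference work.
    # The matcher sees only byte overlap; it has no notion of parody,
    # quotation, or any other exception recognised by law.
    chunks = fingerprint(upload)
    overlap = len(chunks & reference) / len(chunks)
    return overlap >= threshold

# Invented sample 'works' for the sketch:
original = b"melody melody melody melody melody melody"
parody   = b"melody melody melody melody NEW LYRICS!!"  # lawful parody reusing the tune

ref = fingerprint(original)
print(flag_upload(parody, ref))  # prints True: the parody is flagged anyway
```

A filter built this way can only ever answer 'how similar are these files?'; whether the similarity is lawful is a legal question that no amount of matching accuracy resolves.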


Caroline is coordinator of the Copyright 4 Creativity (C4C) coalition. She is also the founder and Managing Director of N-square Consulting (N²), a Brussels-based public affairs firm. She is the author of ‘iLobby.eu: Survival Guide to EU Lobbying, Including the Use of Social Media’.