Monday, 23 October 2017


There are some things money can’t buy. For everything else, there’s Audible Magic!

Andrus Ansip, European Commission Vice-President (VP) for the Digital Single Market, addressed the Members of the Legal Affairs (JURI) Committee on 19 June in the framework of structural dialogues between the EU institutions. He touched upon the censorship filter in the EU copyright reform proposal (see our Article 13 infographic), and in the Q&A with MEPs he claimed that content filtering is not as expensive as some make it out to be. More concretely, he gave the example of a specific company that has been doing the rounds in Brussels, including visiting VP Ansip, namely Audible Magic. VP Ansip claimed that such tools cost "400, 500 bucks" to identify 10,000 songs.

(c) European Parliament – full video

He was also clearly annoyed that everyone is always referring to the fact that YouTube invested more than $60 million in its Content ID technology.

We have another example for VP Ansip, namely SoundCloud, which "spent more than €5 million building its own filtering technology and still must dedicate seven full-time employees to maintain the technology".

Read more below on why the claims that content filtering technology is ‘cheap’ are flawed even if one only looks at Audible Magic.

 

So, why is this reasoning flawed?

 

1. VP Ansip contradicts the Impact Assessment from his own services

In its own Impact Assessment, the European Commission claims that these filtering services start from €900 a month. Don't you think a start-up could find better ways to spend €900/month? We do: paying an additional engineer or investing in marketing, for example. We should keep in mind that this amount equates to someone's salary in certain EU countries.

“(…) on the basis of the information available, it is estimated that a small scale online service provider with a relatively low number of monthly transactions can obtain such services as from €900 a month.”
Source: EC Impact Assessment

2. The real cost of filtering tools is much higher

"(…) medium-sized companies engaged in file-hosting services paid between $10,000 and $25,000 a month in licensing fees alone for Audible Magic's filtering tool."
Source: ‘The Limits of Filtering’

A March 2017 study [PDF] on 'The Limits of Filtering: A Look at the Functionality and Shortcomings of Content Detection Tools' counters this figure and the EC's attempt to downplay the fact that the costs of implementing filtering technology would constitute a barrier to entry (see p. 26), as it notes that the €900/month estimate only applies to Audible Magic and to filtering at most 5,000 music files [so not even video] a month. Audible Magic's pricing confirms this.
"(…) this estimate is only accurate for an incredibly small number of [online service providers]. Audible Magic's website indicates that this price only applies to tools that filter audio files and is available only for [online service providers] hosting less than 5,000 song files per month—an incredibly low volume for an [online service provider]. To put this in perspective, when Soundcloud was only five years old, users were uploading twelve hours of audio content every minute."

Source: Audible Magic
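To see how far out of reach that cheapest tier is, here is a back-of-the-envelope sketch comparing SoundCloud's reported upload rate (twelve hours of audio per minute, per the study) with the 5,000-files-per-month cap. The average track length of four minutes is our own assumption for illustration, not a figure from the study.

```python
# Back-of-the-envelope check of the €900/month tier against real upload volumes.
# The 12 hours/minute rate and 5,000-file cap come from the figures quoted above;
# the 4-minute average track length is an assumption for illustration.
HOURS_UPLOADED_PER_MINUTE = 12
AVG_TRACK_MINUTES = 4             # assumed average track length
TIER_CAP_FILES_PER_MONTH = 5_000  # cap on Audible Magic's cheapest tier

minutes_per_month = 60 * 24 * 30                                  # 43,200 minutes
hours_per_month = HOURS_UPLOADED_PER_MINUTE * minutes_per_month   # 518,400 hours of audio
tracks_per_month = hours_per_month * 60 // AVG_TRACK_MINUTES      # 7,776,000 tracks

print(f"Estimated uploads: {tracks_per_month:,} tracks/month")
print(f"Cheapest tier's cap exceeded roughly "
      f"{tracks_per_month / TIER_CAP_FILES_PER_MONTH:,.0f} times over")
```

Under these assumptions, a five-year-old SoundCloud would overshoot the €900/month tier by a factor of more than 1,500, pushing it into the far more expensive tiers discussed below.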

The study reveals that the cost is much higher in reality (see p. 26):

"A recent survey of [online service providers] reported that medium-sized companies engaged in file-hosting services paid between $10,000 and $25,000 a month in licensing fees alone for Audible Magic's filtering tool. It is worth noting that the licensing fees for the software amount to only a portion of the total costs associated with using fingerprinting software. Any [online service provider's] hosting platform must be altered or augmented to perform the fingerprint lookups and comparisons against a fingerprint database, a substantial software integration task."

3. There are no filtering tools for all types of content

Tools such as Audible Magic can only handle audio and video files, whilst Article 13 covers all sorts of content, including literary works, music, choreography, pantomime, pictures, graphics, sculpture, sound recordings and architectural works.

A concrete example of how this could go wrong: Belgian start-up MuseScore, which lets you find, play and create sheet music for any instrument, expressed grave concerns at an event in the European Parliament about the idea that it would have to filter all sheet music uploaded to its platform, because no such filtering tools exist for content other than audio and video, and it would have to build the technology itself!

"Although there are fingerprinting tools available to scan and compare audio, video, and image files, no such tools exist to process other forms of copyrightable content"
Source: ‘The Limits of Filtering’

This issue is also highlighted by the authors of the above-mentioned study, who stress that even where filters do exist, they are never flawless (see p. 2).

"Although there are fingerprinting tools available to scan and compare audio, video, and image files, no such tools exist to process other forms of copyrightable content, such as software. And, since these filtering tools require access to the complete, raw, unencrypted content of files, they cannot process encrypted files or torrents. As a result, the range of infringing activity that filtering tools can effectively address is rather narrow. And, even for media types for which filtering tools exist, such tools are only capable of matching content, not determining whether the use of a particular work constitutes an infringement."
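The last point from the study can be illustrated with a toy sketch. This is not Audible Magic's actual algorithm (real systems use robust perceptual fingerprints rather than cryptographic hashes, and the database entries below are hypothetical), but the structural limitation is the same: a filter can answer "does this match a registered work?", never "is this use lawful?".

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Toy fingerprint: a cryptographic hash of the raw file bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of fingerprints registered by rightholders.
registered = {fingerprint(b"official-music-video"): "Some Label"}

def check_upload(data: bytes) -> str:
    owner = registered.get(fingerprint(data))
    if owner is None:
        return "no match"
    # The filter stops here: it cannot tell whether the uploader IS the
    # rightholder, holds a licence, or is making a lawful quotation or parody.
    return f"matches work registered by {owner}"

# Flags the file even when the rightholder itself is the uploader:
print(check_upload(b"official-music-video"))
# Misses content the rightholders never registered:
print(check_upload(b"home-recorded-cover"))
```

Note how the match result carries no information about authorisation, licensing or copyright exceptions; deciding those questions requires legal judgment that no lookup, however sophisticated, can supply.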

4. There is no link between the content hosted and the type of filter to implement

"As of April 2017, GitHub reports having almost 20 million users and 57 million repositories"
Source: Wikipedia

The EC's censorship filter in Article 13 makes no link between the nature of the work, the type of content filter and the potential harm at stake. In other words, a platform like GitHub, which hosts large amounts of code uploaded by its users, could have to apply a content filter that identifies audiovisual material to its 57 million repositories. Wikipedia, whose English version alone has over 5,425,689 articles, would have to filter for all types of material that Wikimedians can upload on their pages.

Moreover, Article 13 does not distinguish between content uploaded with the authorisation of the rightholder and content uploaded without it. So even content uploaded by rightholders, or with their authorisation, needs to be filtered. This actually happens: for example, coders use a platform like GitHub to easily share their code and allow others to improve it (and there is no filter for code). Vevo, the YouTube channel where rightholders themselves upload official music videos, has over 250,000 videos!

5. Censorship is not just about software, there are hidden compliance costs

The figures being spread neglect all the other financial and time investments that go into running these filters, ranging from putting in place and managing a redress mechanism to handle user complaints to ensuring compliance. Let's not forget the initial setup costs either: Audible Magic charges a one-time setup fee of $2,500.

"This is only a tiny portion of the costs of filtering, as those companies would still have to invest in staff to apply policies and monitor compliance."
Source: The Hill
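Putting the licensing figures quoted above together gives a rough first-year floor. This sketch uses only numbers already cited in this post (the $2,500 setup fee and the $10,000–$25,000/month range reported for medium-sized file-hosting services); staff, software integration and redress-mechanism costs come on top and are not modelled.

```python
# Rough first-year licensing cost, using only the figures quoted in this post.
# Integration work, compliance staff and redress handling are extra.
SETUP_FEE = 2_500                      # Audible Magic one-time setup fee
MONTHLY_LOW, MONTHLY_HIGH = 10_000, 25_000  # reported monthly licensing range

first_year_low = SETUP_FEE + 12 * MONTHLY_LOW     # $122,500
first_year_high = SETUP_FEE + 12 * MONTHLY_HIGH   # $302,500

print(f"First-year licensing alone: ${first_year_low:,} to ${first_year_high:,}")
```

Even at the low end, that is two orders of magnitude above the "400, 500 bucks" figure cited by VP Ansip, before a single engineer or compliance officer is paid.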

Conclusion: Regardless of the price, the cost to our fundamental freedoms is priceless

Listing all these flaws, we can only agree with the authors of the study, who rightfully conclude that "to require [online service providers] to deploy tools that are costly, easily circumvented, and limited in scope would deeply harm startups, users, and content creators alike" (see p. 32).

Startups would be hit twice as hard compared to established Internet giants such as YouTube and Facebook, which have already invested heavily in these types of filtering technologies and would gain a legislative edge over newcomers. Startups would also need to explain to a boardroom of potential investors why they must invest massively in content filtering. That pitch looks even worse when you consider that a 2014 survey [PDF] found that 60% of investors would be "uncomfortable investing in businesses that would be required by law to run a technological filter on user-uploaded content" (see p. 29). One of the authors notes that:

"Investors would never fund a startup if its business plan required a significant upfront investment in filtering simply so the company could exist. And, because technology changes so rapidly, it would be difficult for a startup to know in advance whether any particular filtering technology would be legally satisfactory in a world with mandatory filtering obligations, casting a cloud of uncertainty over any new platform startup."

It seems that Audible Magic has set up a lobbying operation to mandate the use of tools like theirs through legislation, after the Court of Justice of the European Union (CJEU) thwarted their plans with its ruling in the Sabam/Netlog case (C-360/10).

Academics explain [PDF] that the CJEU ruled that requiring a service provider to install such a filtering system would be incompatible with EU law, and point out that imposing an obligation to systematically monitor the content transmitted by an entire user base is incompatible with Articles 8, 11 and 16 of the EU Charter of Fundamental Rights and goes against the e-Commerce Directive. The EC tries to dodge this argument by claiming that Article 13 does not impose such a general monitoring obligation, but the academics rebut this claim, noting that (see p. 8):

"(…) the CJEU did not imply in Sabam/Netlog that if the database of protected works is produced in collaboration with rightholders themselves, the general obligation to monitor imposed upon the service provider would thus be transformed into a permissible, special obligation to monitor."

Therefore, it should be no surprise that recently 64 civil society and trade associations – representing publishers, journalists, libraries, scientific and research institutions, consumers, digital rights groups, start-ups, technology businesses, educational institutions and creator representatives – sent this open letter to the Council and European Parliament asking them, amongst other things, to refrain from imposing private censorship on EU citizens by filtering user-uploaded content, and to remove Article 13 from the copyright discussions so it can be dealt with in the appropriate contexts.

Herman Rucic is Senior Policy Manager in the secretariat of the Copyright 4 Creativity (C4C) coalition. He is Senior Policy Manager at N-square Consulting since September 2010.