Tuesday, 19 March 2019

Article 13 is Not Just Criminally Irresponsible, It’s Irresponsibly Criminal


In a previous editorial, I pointed out that at the heart of Article 13 in the proposed EU Copyright Directive there’s a great lie: that it is possible to check for unauthorised uploads of material without inspecting every single file. The EU has ended up in this absurd position because it knows that many MEPs would reject the idea of imposing a general monitoring obligation on online services – not least because the e-Commerce Directive explicitly forbids it. Instead, the text of Article 13 simply pretends that technical alternatives can be found, without specifying them. The recently-issued “Q and A on the draft digital copyright directive” from the European Parliament then goes on to explain that if services aren’t clever enough to come up with other ways, and use upload filters, then obviously it’s their own fault.

Imposing legal obligations that are impossible to fulfil is deeply irresponsible law-making. But there is another aspect of Article 13 that is even worse: the fact that it will encourage a new wave of criminality. It’s hard to think of a greater failure than a law that increases lawlessness.

The problem arises once more from the flawed idea of forcing companies to install upload filters. Just as EU lawmakers seem unable to grasp the fact that online services will be obliged to conduct general monitoring in order to comply with Article 13, so their lack of technical knowledge means that they don’t understand the tremendous practical challenges of implementing this form of general monitoring.

At least the French government is much more consistent and honest in this regard. It wants to go even further than the agreement it struck with the German government, which ended up forming the basis for the Article 13 text in the Romanian Council Presidency’s new mandate, adopted on Friday, 8 February. France wants to remove the references to Article 15 of the e-Commerce Directive, which prohibits Member States from imposing general monitoring obligations, in order to ‘clarify’ that such obligations are perfectly fine when it comes to protecting copyrighted material.

Another editorial pointed out some of the practical challenges of implementing this form of general monitoring. Article 13 will apply to every possible medium. That means online services will need filters for text, music, audio, images, maps, diagrams, photos, video, film, software, 3D models and more. Material can only be filtered if there is a list of items that must be blocked. So in practice, Article 13 means that any major site accepting user uploads must have blocking lists for every kind of material. Even where such lists exist, they are incomplete. For many areas – photos, maps, software, etc. – they simply don’t exist. Moreover, for much of the content that would have to be monitored, the filters don’t exist either. In another instance of irresponsible, lazy law-making, Article 13 is asking for the impossible.
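The point that filtering presupposes a reference list can be made concrete with a deliberately simplified sketch. Everything here is hypothetical – real filters such as YouTube’s Content ID use fuzzy perceptual fingerprints rather than exact hashes – but the structural problem is the same: an upload filter can only block what appears on the list it checks against, so where no list exists (photos, maps, software), no filter is possible.

```python
import hashlib

# Hypothetical blocklist of fingerprints supplied by rightholders.
# Real systems use perceptual fingerprints that survive re-encoding;
# an exact hash is used here only to illustrate the principle.
BLOCKLIST = {
    hashlib.sha256(b"some copyrighted work").hexdigest(),
}

def is_blocked(upload: bytes) -> bool:
    """Return True if the upload matches a fingerprint on the blocklist."""
    return hashlib.sha256(upload).hexdigest() in BLOCKLIST

print(is_blocked(b"some copyrighted work"))  # True: it is on the list
print(is_blocked(b"an original work"))       # False: nothing to match against
```

Note that the second result is the crux: an original work passes not because the filter judged it lawful, but because nobody listed it – and conversely, anything anyone adds to the list is blocked, lawful or not.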

What will online services do in this situation? The Copyright Directive doesn’t help – it simply says what has to be done, not how to do it. This will incentivise companies to set up systems that are likely to provide the best protection when the inevitable lawsuits arrive. The primary concern will be blocking material with maximum efficiency, rather than adopting minimally intrusive approaches that maximise users’ freedom of speech. The lack of systems to protect themselves from this liability could also mean that some platforms will resort to geo-blocking, disappear altogether, or move away from the EU, and that others will never be created in Europe in the first place.

This emphasis will encourage the setting up of systems that allow anyone to submit claims on material, which will thereafter be blocked. Adopting this system, companies will be able to handle material for which general blocking lists do not exist, and will thus be able to avoid liability for unauthorised uploads. As well as being the only practical way to handle the enormous challenge of filtering every kind of copyright material, it also has the virtue of being an approach that has already been used elsewhere, albeit on a smaller scale.

For example, YouTube allows anyone to claim to be the copyright holder of material posted to Google’s service, and to have it removed automatically. The negative consequences of that feature were discussed previously; suffice it to say that legitimate material is often taken down by mistake, that appealing against those decisions is hard and time-consuming, and that the results are very unpredictable. The same will inevitably happen with Article 13’s upload filters, with the added twist that material will be blocked even before it is posted, whereas the automated takedown system created by the US Digital Millennium Copyright Act (DMCA) only operates after material is posted online. However, a recent story on TorrentFreak reveals another disturbing possibility:

In a terrible abuse of YouTube’s copyright system, a YouTuber is reporting that scammers are using the platform’s “three strike” system for extortion. After filing two false claims against ObbyRaidz, the scammers contacted him demanding cash to avoid a third – and the termination of his channel.

Under Article 13, it doesn’t even require three strikes: if your upload gets caught by the filter, your material will be blocked forever. The thinking seems to be that it doesn’t matter if mistakes are made, because people can simply appeal. But as we’ve noted before, appeals processes are slow, rarely work, and aren’t used by ordinary people, who are intimidated by the whole process. So the mere threat of a copyright claim will be far more powerful under Article 13 than it is on YouTube.

This means that no one can guarantee that their content will make it online in the first place, except for the big (US) rightholders who will have forced the major (US) platforms into licensing deals. If your content does make it past the upload filter, you will still run the risk of being extorted by copyright scammers abusing the system. The stay-down obligations of Article 13, which imply that copyrighted material flagged by rightholders (or scammers) can never be re-uploaded, make attempts to contest a claim or get something back online even harder than they currently are on YouTube.

This is particularly bad news for new artists, who desperately need exposure and don’t have deep pockets to pay for lawyers to deal with these kinds of problems, or the spare time to do it themselves. More established artists will lose revenue for as long as their material is blocked, so they too may decide to pay up to scammers issuing false copyright claims. This new threat will also badly affect activists’ use of sites that allow public uploads: many online campaigns are tied to particular events or days, and lose most of their power if delayed by days, let alone the weeks an appeals process will certainly take. Far simpler to pay the blackmailer.

This problem exposes a further flaw in Article 13: there are no penalties for falsely claiming to be the copyright holder of material and causing legitimate uploads to be blocked. That means there is almost no disincentive to sending out thousands – maybe millions – of automated threats to artists, activists and others. It’s true that this is extortion, and thus illegal. But given how overwhelmed police forces are today, is it likely that they will allocate scarce resources to chasing down phantoms across the Internet? It’s easy for people to hide behind fake names and temporary accounts, and to use hard-to-trace payment systems like Bitcoin. With enough time and effort it might be possible to establish who is behind the threats, but if the sums demanded are low, the authorities simply won’t bother.

In other words, the poorly thought-out nature of Article 13’s upload filters risks creating a new class of “perfect” online crime. It’s one that can be conducted by anyone with an Internet connection, from anywhere in the world, and one that is practically risk-free – a seductive and deadly combination. Far from helping artists, the Copyright Directive could create a massive new obstacle to their success.

Featured image by Sheila Sund.

Writer (Rebel Code), journalist, and blogger on openness, the commons, copyright, patents and digital rights. [All content from this author is made available under a CC BY 4.0 license]