Time to ACT(A) on Article 13
It is quite clear now: Article 13 of the proposed copyright directive represents the most serious attack on the Internet in the EU for years. Even leaving aside the fact that Article 13 is incompatible with EU law, its central idea – that major sites hosting material uploaded by members of the public should filter everything – will have profoundly harmful consequences.
The problem of “false positives” – marking material as infringing on someone else’s copyright when it is not – will be a huge issue, not least because of two factors. One is that it is impossible to codify the subtle and complex rules of EU copyright law into simple algorithms that can be used to filter automatically. The other is that companies will naturally choose to err on the side of caution, preferring to block legitimate material rather than risk lawsuits for failing to stop unauthorised copies. Put these together, and sites will inevitably implement upload filtering using conservative rules that over-block.
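The over-blocking dynamic described above can be made concrete with a toy sketch. The code below is purely illustrative – it does not represent any real filtering system, and the similarity scores and threshold values are invented for the example – but it shows why a platform that lowers its blocking threshold to avoid missing infringements will inevitably sweep in legitimate uploads as well:

```python
# Toy illustration (not any real filter): flag an upload as "infringing"
# when its similarity to a known copyrighted work exceeds a threshold.
# All scores and thresholds here are hypothetical.

# (similarity score, actually infringing?) – e.g. the 0.55 item could be
# a legal parody or quotation that merely resembles a protected work.
uploads = [(0.95, True), (0.70, True), (0.55, False),
           (0.40, False), (0.20, False)]

def blocked(similarity, threshold):
    """Block the upload if its similarity score reaches the threshold."""
    return similarity >= threshold

def false_positives(threshold):
    """Count legitimate uploads that the filter would block."""
    return sum(1 for score, infringing in uploads
               if blocked(score, threshold) and not infringing)

# A cautious platform lowers the threshold to catch the 0.70 infringement,
# and in doing so also blocks the perfectly legal 0.55 upload.
print(false_positives(0.80))  # strict threshold: 0 legal uploads blocked
print(false_positives(0.50))  # cautious threshold: 1 legal upload blocked
```

The asymmetry of the legal risk – lawsuits for under-blocking, mere user annoyance for over-blocking – is what pushes platforms toward the lower, more censorious threshold.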
This, in turn, will lead EU citizens to self-censor. Once their uploads are routinely blocked because false positives mark their original creations as infringing, many will simply give up. A related and serious loss concerns the public domain, which will be badly affected, in part because no organisation exists to defend people's right to use such material as they wish, without being blocked for doing so.
The need to filter uploads will act as a serious brake on digital startups in the EU. Even if they are not required to implement an online censorship system immediately, new companies will have the threat of mandatory upload filters hanging over them as they grow. If they flourish, they will effectively be punished for their success by being obliged to put a filtering system in place, or perhaps be required to try to negotiate impossible licences. Why would startups choose to operate under these terms in the EU when they can avoid the problem by setting up a company in jurisdictions with laws better suited to the digital age? Similarly, why would venture capitalists risk investing in new EU companies, which will be hamstrung by a requirement to filter everything once they grow beyond a certain size?
Another deep flaw with the approach is that the filtering systems will be costly to create. The Content ID filtering system built by Google, and often cited in this context as “proof” it can be done, is the product of over 50,000 hours of coding and some $60 million. There is no way that local EU software houses could afford to risk that kind of money on writing upload filter software. That means that only the largest – and probably US-based – companies will be able and willing to produce them. Even then, it is likely that few such systems will be produced, leading to a monopoly or oligopoly situation for each sector, with high prices that will make it even harder for EU companies to compete globally.
Moreover, the fact that upload filter systems will probably be written by US companies means that effectively the entire online culture in the EU will be controlled by foreign entities. As we know, the use of algorithmic black boxes makes it very hard for anyone – whether members of the public or government watchdogs – to monitor what is happening. In particular, it is almost impossible to spot when bias – whether intentional or not – is being introduced to the filtering. Again, foreign companies will probably choose to apply a maximalist approach to blocking material, rather than risk lawsuits alleging that they did not do enough to stop unauthorised uploads. Without any transparency, it will be very hard to spot and remedy this.
Finally, it is worth noting that allowing foreign upload filters to dominate the EU online world represents a risk to national security. Since everything that is uploaded to major sites will have to pass through the filter, that software will be carrying out algorithmic surveillance on all such material. As well as checking for unauthorised copies, it could also surreptitiously be looking for keywords that trigger secret actions, like covertly sending selected files to other locations. Since upload filters are unlikely to be open source, and thus transparent, it will be very hard to spot this kind of spying.
The above issues aren’t just my opinions. Many experts have weighed in with similar concerns. For example, last September the Max Planck Institute for Innovation and Competition wrote: “Some requirements contained in Article 13 can enable abusive behaviour, thereby threatening freedom of expression and information”. In October, over 50 NGOs representing human rights and media freedom said: “Article 13 of the proposal on Copyright in the Digital Single Market include obligations on internet companies that would be impossible to respect without the imposition of excessive restrictions on citizens’ fundamental rights.” The same month, 56 leading academics warned: “Article 13 [of the Digital Single Market Directive] is incompatible with the guarantee of fundamental rights and freedoms and the obligation to strike a fair balance between all rights and freedoms involved.” On a different note, the European Copyright Society expressed concern “that proposed art. 13 will distort competition in the emerging European information market.”
Although it is evident that Article 13 would be disastrous for the Internet in the EU, it is by no means clear what should be done to counter it. Despite a stream of analyses showing how damaging the upload filter will be, the latest proposals from both the Bulgarian Presidency and the file’s Rapporteur display scant appreciation of just how serious the problems are. Similarly, the European Commission shows no willingness to listen to the concerns of experts or the public on the topic.
We have been here before. Six years ago, the Commission was pushing a similarly terrible idea called ACTA – the Anti-Counterfeiting Trade Agreement. It started out as a reasonable attempt to tackle fake medicines and counterfeit aircraft parts. Then, the copyright industry hijacked it, and turned it into one of its perennial attacks on the Internet. Here’s a key sentence from the final ACTA text: “Each Party shall endeavour to promote cooperative efforts within the business community to effectively address trademark and copyright or related rights infringement.” This is essentially a more vaguely-worded version of Article 13. The intent behind both is the same: to turn Internet companies into copyright police, and to force them to censor material with no real safeguards.
As with Article 13, the European Commission refused to listen to repeated explanations from all quarters as to why that clause would be devastating for the online world. Ultimately, people were forced to take extreme measures. In the run-up to the ACTA vote in the European Parliament, tens of thousands of people took to the streets in Europe’s cities, in an unprecedented mass demonstration in support of basic digital rights and against copyright maximalism. MEPs finally realised the seriousness of the situation, and that ACTA would rip the heart out of the online world. In a stunning rebuke to the European Commission and copyright lobbyists, who thought everything had been settled behind closed doors, in July 2012 MEPs threw out ACTA, with 478 voting against, 39 in favour, and 165 abstentions.
If the European Commission continues to ignore the evidence that Article 13 will cause deep and long-lasting damage to the Internet in the EU, it would do well to remember what happened with ACTA. As the Trade Commissioner responsible for the ACTA negotiations, Karel de Gucht, said after his humiliating defeat in the European Parliament, ACTA was “one of the nails in my coffin.” Does the Commission need tens of thousands of citizens to take to the streets of Europe to hammer home the point that they really don’t want Article 13?
Original version of featured image by Tomasz Sienicki.