
Why Internet companies' upload filters are a threat to freedom of expression

In the future, Facebook, Twitter, Microsoft and YouTube will store and exchange digital “fingerprints” of violence-glorifying terrorist images and terrorist recruitment videos in a shared database. The companies announced this in a joint press release on Monday. The platforms are responding to criticism that they are doing too little against the spread of “terrorist” and “radicalizing” content on their networks.

Content deemed undesirable is to be given specific digital fingerprints, so-called “hashes”, each of which points back to exactly one photo or video. By exchanging this data, the affected images and videos can be identified more quickly on the other platforms, speeding up the deletion of “terrorist propaganda”.
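The companies have not published their hashing method; perceptual hashes in the style of Microsoft's PhotoDNA are plausible, but a plain cryptographic digest is enough to illustrate the principle the press release describes: the hash identifies one specific file, not a class of content. A minimal, hypothetical sketch (all names here are invented for illustration):

```python
import hashlib

# Hypothetical model of the shared hash database described above.
shared_database = set()  # hashes contributed by all participating platforms

def fingerprint(file_bytes: bytes) -> str:
    """Derive a digital 'fingerprint' that identifies exactly one file."""
    return hashlib.sha256(file_bytes).hexdigest()

def report_content(file_bytes: bytes) -> None:
    """One platform flags a file; its hash is shared with the others."""
    shared_database.add(fingerprint(file_bytes))

def is_known(file_bytes: bytes) -> bool:
    """Another platform checks an upload against the shared hashes."""
    return fingerprint(file_bytes) in shared_database

report_content(b"example propaganda video bytes")
print(is_known(b"example propaganda video bytes"))  # True
print(is_known(b"slightly different bytes"))        # False
```

Note the limitation the sketch makes visible: an exact hash misses even slightly modified copies, which is why real deployments would more likely rely on perceptual hashing that tolerates re-encoding and cropping.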

Facebook & Co. define what terrorism is

Since “terrorism” is notoriously difficult to define and the companies apply different guidelines and terms of service, there is no automatic deletion across all four platforms. Instead, each company decides for itself what to do with flagged content once the filter produces a match.

Nevertheless, the companies announced that they would add to the shared database only content that “most likely violates all content policies” of the respective companies. Especially with “terrorist propaganda”, a clear definition of such content is hardly possible. At the same time, such a vague definition leaves wide open what will end up on the blocking list in the future.

Federal Minister of the Interior Thomas de Maizière is likely to be pleased by the companies' announcement. Just in August, after a visit to Facebook Germany, he repeated his demand for an upload filter for platforms. Providers, he argued, should not merely react to prohibited content by deleting it afterwards, but should also have a mechanism that filters out such content as soon as it is uploaded. In substance, this describes the project of YouTube, Twitter, Microsoft and Facebook. The upload itself will probably not be interrupted; instead, content will presumably be checked only after the upload, and if it matches an entry in the database, its publication on the platform can be prevented.
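The workflow described above, combined with the fact that each company applies its own guidelines, can be sketched as follows. This is a speculative model, not the companies' actual pipeline; the platform names and policies are placeholders, since the real guidelines are not public:

```python
from enum import Enum

class Action(Enum):
    PUBLISH = "publish"
    BLOCK = "block publication"
    HUMAN_REVIEW = "queue for human review"

# Placeholder per-platform policies: a database match does NOT trigger
# automatic cross-platform deletion; each company decides for itself.
PLATFORM_POLICY = {
    "PlatformA": Action.BLOCK,
    "PlatformB": Action.HUMAN_REVIEW,
}

def handle_upload(platform: str, content_hash: str, database: set) -> Action:
    """Post-upload check: the file is already on the server; on a match,
    the platform's own policy decides whether it ever goes public."""
    if content_hash in database:
        return PLATFORM_POLICY.get(platform, Action.HUMAN_REVIEW)
    return Action.PUBLISH

db = {"abc123"}
print(handle_upload("PlatformA", "abc123", db))  # blocked here
print(handle_upload("PlatformB", "abc123", db))  # only reviewed there
```

The key design point the sketch captures is that the filter runs after the upload completes, and the shared database supplies only the match, not the consequence.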

Internet companies as legislators, judges and executioners

The installation of upload filters against “terrorist” and “radicalizing” content poses a serious problem for freedom of expression. “The limits on the distribution of content are drawn by criminal law. It is therefore up to courts, not online platforms, to decide on the deletion of content,” says Volker Tripp of Digitale Gesellschaft e. V. “Proactive filtering of uploads effectively privatizes constitutional procedures and builds up a dangerous censorship infrastructure,” the lawyer further criticized.

The mere existence of the upload filter will also arouse further desires and invite an expansion of the content to be deleted. What targets content from the so-called Islamic State today can hit environmentalists or critics of capitalism tomorrow. In every censorship infrastructure observed so far, the scope has eventually expanded beyond the originally targeted content. The censorship mechanism now being set up will, moreover, remain completely opaque, even if the companies announce otherwise.

Will state institutions soon be able to use the technology?

This lack of transparency was already apparent in our specific inquiries about the new database to the respective press offices. None of the four companies gave a concrete answer as to whether state institutions will also feed the censorship database in the future. And this despite reports in April that Europol was interested in such a reporting point, and that precisely this kind of cooperation is being worked on at EU level in the “EU Internet Forum”. Instead, the press offices sent non-answers or referred us to the terms of service or the press release that was already online, partly garnished with the request not to quote these bloodless answers because they were given “on background”.

State participation in this infrastructure would bypass elementary principles of the rule of law. Joe McNamee, executive director of the civil rights group EDRi, told us: “It is another step towards a situation where the internet giants become lawmakers, judges, juries and hangmen over our free speech. It thus amounts to a development towards a completely private law enforcement regime.”

Joe McNamee cites the planned EU copyright reform as an example of this development. Under it, all platform operators would be obliged, following the example of YouTube's Content ID, to filter all uploads and check them for copyright infringements.

Serious threat to freedom of expression

At the same time, Germany and France are pushing to weaken the host provider privilege. So far, platforms and hosters have first had to be made aware of illegal content, and only then are obliged to remove it. Reversing this would make platforms and hosters directly liable for whatever people put on their servers, which would push operators to check content before it is even uploaded. It can be assumed that, in case of doubt, operators would tune their censorship mechanisms more aggressively than necessary in order to avoid later liability. This is a mix that can seriously endanger the fundamental right to freedom of expression.

Instead of these measures, Volker Tripp suggests “that the companies finally provide courts and investigating authorities with designated contact persons in Germany, so that criminal content can be deleted and prosecuted quickly and effectively. Instead of letting itself be fobbed off with mere lip service from Facebook & Co., the federal government should finally get serious and anchor a corresponding obligation in the Telemedia Act.”

Would you like more critical reporting?

Our work at netzpolitik.org is financed almost exclusively by voluntary donations from our readers. With an editorial team of currently 15 people, this enables us to cover many important topics and debates of the digital society journalistically. With your support, we can explain even more, conduct investigative research far more often, provide more background, and defend even more fundamental digital rights!

You too can support our work now with your donation.

About the author

Helena Piontek

is currently training as a journalist and will be an intern at netzpolitik.org until the end of the year. She can be reached at [email protected] and on Twitter.
Published 12/6/2016 at 9:40 PM