The Facebook livestream of a terror attack that killed 50 Muslim worshipers in New Zealand was a wake-up call for many leaders on the need for online regulation.
“We cannot simply sit back and accept that these platforms just exist. They are the publisher not just the postman,” said New Zealand Prime Minister Jacinda Ardern, after copies of the Christchurch terror attack were uploaded hundreds of thousands of times to YouTube, Facebook and Twitter.
While executives from big tech firms are due to answer questions about Christchurch in the U.S. Congress soon, the European Union already has a plan. But putting it into action is proving trickier than first thought.
Known as the “terrorist content” legislation, Europe's draft law aims to stop violent propaganda like the Christchurch footage from proliferating online. It was drawn up following a spate of Islamic State-inspired attacks on the Continent, and would force platforms to take down illegal content within a set time frame — or face penalties.
British Conservative Daniel Dalton, a member of European Parliament who's in charge of the draft, said this is technically doable, but warned against the mandatory use of automated tools. “Social media companies are generally very good at keeping child abuse content offline,” he said this week. “There is no reason why the same principles shouldn't apply with terrorist content. But that shouldn't involve upload filters, which wouldn't have prevented the Christchurch video going up.”
“We are doing sloppy work on a matter that has huge impact on the lives of citizens and small and medium-sized enterprises” — Sophie in 't Veld, Dutch liberal MEP
As European parties started to debate the fine print, political rather than technical problems emerged. Arguments erupted over what constitutes “terrorist content,” safeguards for freedom of speech, and how fast platforms should be forced to take down illegal content. A vote in European Parliament that was scheduled for this week had to be postponed until early April as a consequence of the divisions.
In September of last year, following attacks that put a spotlight on internet radicalization, the European Commission — pressed by France, Germany and the U.K. — came up with a regulation that would force platforms such as Google, Facebook and Twitter to take down terrorist content within one hour of it being flagged by authorities. Platforms would also have to implement measures to spot terrorist propaganda proactively via automated tools.
The Commission wanted the final legislation to be adopted quickly, before the European Parliament election in May. Twenty-eight member countries voted within three months to adopt the plan without significant changes.
That's when the trouble started. Prominent MEPs raised concerns about fundamental rights, foreshadowing divisions between political groups — namely over the question of how quickly “terrorist” content should be brought down.
A young girl shows her emotion during the vigil for the Christchurch mass shooting victims | Dianne Manson/Getty Images
“This file is rushed through,” said Dutch Liberal MEP Sophie in 't Veld at a Parliament committee meeting in mid-March. “We are doing sloppy work on a matter that has huge impact on the lives of citizens and small and medium-sized enterprises. The Commission is behaving like the secretariat of the member states who want to push this through.”
On one side, the Socialists — who asked for this week's vote to be delayed — the Liberals and the Greens were pushing for more safeguards for citizens and SMEs, and argued that the law could be misused. “We know that legislation aimed at fighting terrorism is often used for other purposes,” said French MEP Eva Joly, who is negotiating on behalf of the Greens group, in mid-March. “Terrorism legislation in France was used against Greens activists demonstrating against the storing of nuclear waste.”
The lead negotiator Daniel Dalton suggested extending the one-hour removal deadline to eight hours, to try to bridge the gap between the political parties.
On the other side, the center-right EPP group, of which German Chancellor Angela Merkel's conservative Christian Democratic Union is a member, is more aligned with the Commission. “I am convinced a removal deadline for deleting flagged terrorist content online of maximum one hour is absolutely necessary,” said Rachida Dati, the French MEP negotiating the text on behalf of the European People's Party group.