BERLIN — Authorities in a major city scan the faces of tens of thousands of passers-by at a vast train station, using software to compare them to photographs of people in a digital database.
Sounds like China? Think again.
Welcome to Berlin Südkreuz railway station, where a German government experiment with facial-recognition technology is raising privacy concerns in a country scarred by a history of oppressive state surveillance.
With its wide-open modern spaces and light flooding through a glass ceiling, Berlin Südkreuz looks like an architectural metaphor for transparency.
But critics say that authorities have been anything but transparent about the trial — launched last year by the German interior ministry — taking place inside.
“The worst part, really, is that so little information is being shared with us” — Constanze Kurz, a computer scientist
In 2017, the ministry recruited around 300 volunteers who agreed, in exchange for a €25 Amazon voucher, to have their names and two biometric photos stored in a database and to carry a transponder with them. This allowed authorities to know on which day and at what time they passed through the station.
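The ministry has not published how it scored the system's performance, but the transponder setup suggests a straightforward evaluation: the transponder logs provide ground truth for when each volunteer was actually in the station, against which the software's claimed sightings can be checked. A minimal sketch of that comparison, in which the names, data shapes and two-minute matching window are assumptions for illustration rather than details from the trial:

```python
# Minimal sketch, assuming transponder logs serve as ground truth for the
# evaluation; the ministry has not published its actual scoring method.
from datetime import datetime, timedelta

def detection_rate(ground_truth, detections, window=timedelta(minutes=2)):
    """Fraction of known station crossings for which the recognition software
    reported a match within `window`. Both arguments map a volunteer ID to a
    list of datetime objects."""
    crossings = 0
    hits = 0
    for volunteer, crossing_times in ground_truth.items():
        for t in crossing_times:
            crossings += 1
            if any(abs(d - t) <= window for d in detections.get(volunteer, [])):
                hits += 1
    return hits / crossings if crossings else 0.0

# Hypothetical example: one volunteer, two logged crossings, one spotted in time.
truth = {"volunteer_042": [datetime(2017, 8, 1, 8, 30), datetime(2017, 8, 1, 17, 5)]}
seen = {"volunteer_042": [datetime(2017, 8, 1, 8, 31)]}
print(detection_rate(truth, seen))  # 0.5
```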
In August that year, the project, known as “Safety Station Südkreuz,” took off. Standing in front of TV cameras, then-Interior Minister Thomas de Maizière went into raptures about why “video surveillance is very important,” and how the new systems could eventually help police trace criminal and terror suspects.
“On this scale, this is the first real-life test in Germany,” he said. “I'm very excited to see the results so I can then provide a solid argument for using such facial recognition in the interest of protecting and securing the population.”
It didn't take long for privacy advocates to cry foul.
A camera scans passers-by at Berlin's Südkreuz railway station, where facial recognition software is being tested in an attempt to identify terrorists and criminals entering and exiting the station | Imago Agency via Belga
The ministry and its partner Deutsche Bahn have been coy about details; both declined to be interviewed. But information from parliamentary inquiries and internal documents, as well as conversations with industry officials and technology experts, suggests that opposition is gathering momentum.
Critics of the project say it is insufficiently transparent and exposes citizens to invasions of privacy, especially if their data is kept on file.
“The worst part, really, is that so little information is being shared with us,” said Constanze Kurz, a computer scientist and spokesperson for the Chaos Computer Club, Europe's largest association of white-hat hackers.
She said that while officials sold the project as merely an updated version of old-school video surveillance, this was an attempt to distract from the fact that everyone who passes through a marked section of the station has their face scanned.
Legal experts add that if the technology on trial in Berlin Südkreuz is ever deployed as part of an official law enforcement effort, it would violate German citizens' right to privacy.
“If such technology were to go live, the state would grossly infringe the basic rights of its citizens,” said Lea Voigt of the German Bar Association, which represents more than 64,000 lawyers, adding that “the technology being tested has the potential to call into question, in principle, whether people can move anonymously through public space.”
The firms providing the technology — three companies from France, Spain and Israel — declined or ignored formal interview requests. But conversations with employees speaking on condition of anonymity suggest the companies agreed to alter their software so that any data that does not lead to a match is deleted immediately.
Surveillance states
As facial recognition tools mature, governments around the world — authoritarian regimes and elected democracies alike — are looking into making them available to their law enforcement agencies.
Most prominently, China has been turning more than 170 million cameras into a gigantic surveillance system; in Europe, countries such as the United Kingdom have also run real-life trials, on a much smaller scale.
In Germany, however, experience with two surveillance states in the 20th century — the Nazi and East German communist regimes — has made people particularly sensitive about protecting their privacy.
“I don't want to withhold certain instruments from our security agencies, just because of the diffuse fear that they could be abused” — Andrea Lindholz, German CSU MP
With the facial recognition experiment concluded in July, the trial's second phase inside Südkreuz, supervised by railway company Deutsche Bahn, is set to start in November. It will test programs that can spot suspicious behavior in video footage by analyzing it in real time — potentially opening another Pandora's box of privacy concerns.
The programs will try to spot suspicious objects like unattended suitcases. They will sound the alarm when people enter areas where they're not supposed to go, or lie down on platforms or staircases. They will keep count of how many people are at the station. And they will watch for unusual group behavior: If a large number of people abruptly starts to move in one direction, for example, the program might read this as an indicator of a potentially dangerous situation.
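Deutsche Bahn has not said how these detectors work internally. Purely as an illustration of the last idea, flagging an abrupt collective change of direction, here is a sketch in Python; the position format, the averaging approach and the 90-degree threshold are all assumptions, not details of the Südkreuz system:

```python
import math

# Illustrative sketch only: flag an abrupt swing in a crowd's average heading.
# `tracks` maps a person ID to their last two observed positions (x0, y0), (x1, y1).

def mean_heading(tracks):
    """Average direction of movement, in radians, across all tracked people."""
    vx = sum(p1[0] - p0[0] for p0, p1 in tracks.values())
    vy = sum(p1[1] - p0[1] for p0, p1 in tracks.values())
    return math.atan2(vy, vx)

def abrupt_shift(prev_tracks, curr_tracks, threshold=math.pi / 2):
    """True if the crowd's average heading swings by more than `threshold`
    (default 90 degrees) between two observation windows, a crude proxy for
    'a large number of people abruptly starts to move in one direction'."""
    delta = abs(mean_heading(curr_tracks) - mean_heading(prev_tracks))
    delta = min(delta, 2 * math.pi - delta)  # headings wrap around the circle
    return delta > threshold
```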
Legal scholars and technology experts say such software, which becomes ever more effective by learning from its own experience, generally causes fewer privacy concerns because it tends to work without identifying people.
Historical amnesia
Not everybody is convinced. A reply to a parliamentary inquiry into the program says that authorities are considering “marking” people and objects as part of phase 2 of the project, allowing them to later reconstruct where a person moved inside Südkreuz.
To do this, it will be necessary to identify people, experts say, which suggests the use of facial recognition software.
Supporters argue that Berlin needs to provide its law enforcement with access to cutting-edge technology to fight back against a growing threat from criminals and terrorists.
“I don't want to withhold certain instruments from our security agencies, just because of the diffuse fear that they could be abused,” said Andrea Lindholz, a member of the German parliament for the conservative Bavarian Christian Social Union and chair of the Bundestag's powerful interior committee.
Andrea Lindholz is a firm backer of the technology | Omer Messinger/EPA
Lindholz stressed that parliament would have to pass the relevant laws and decide who can be monitored before the facial recognition technology is deployed as part of a law enforcement effort.
“Those who suggest in principle that our state would misuse any bit of information that's available to it suffer from historical amnesia and place today's Germany on the same level as the GDR,” she said. “We live in a functioning democracy based on the rule of law and with rules that everyone has to respect, which of course includes the state and its authorities.”
Attorney Voigt laughed bitterly when confronted with the argument.
“If you follow this train of thought, then we wouldn't need the Basic Law [Germany's constitution] at all,” she said. “Then you could always say, let's just demand that people trust the state.” But, she added, “in particular if you don't suffer from historical amnesia and know the history of Germany, you know that there's no reason to have such basic trust in any state.”
Mixed results
Interim results released by the interior ministry from the facial recognition trial at Südkreuz, which concluded in July, suggest the experiment performed poorly. Industry officials say German sensitivities about privacy may be part of the reason why.
Law enforcement has been experimenting with facial recognition for decades, but it is more complicated than it might sound: Faces look different from different angles and in different light. They age. People can put on glasses or hats; men can grow beards or shave them off.
Recent advances in technology, particularly artificial intelligence, have allowed software to better compensate for some of these variations. But experts warn that, no matter how refined the technology becomes, facial recognition software as we know it today will only ever identify people with a certain probability, and will remain prone to missing a match (a false negative) or identifying someone incorrectly (a false positive).
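What that trade-off looks like in practice: matchers typically reduce each face to a numerical similarity score and declare a match when the score clears a threshold, and where that threshold sits decides which kind of error dominates. The following toy sketch illustrates this; the scores and thresholds are invented for illustration and are not from any of the trial systems:

```python
def is_match(similarity: float, threshold: float) -> bool:
    """Declare a match when the similarity score clears the threshold."""
    return similarity >= threshold

# Invented scores: pairs of images of the same person tend to score high and
# pairs of different people low, but the two ranges overlap, so no single
# threshold separates them perfectly.
same_person = [0.91, 0.84, 0.62]       # 0.62: bad lighting, awkward camera angle
different_people = [0.12, 0.35, 0.71]  # 0.71: a look-alike

# A strict threshold misses the hard genuine pair (false negative) ...
assert not is_match(0.62, threshold=0.8)
# ... while a lax one accepts the look-alike (false positive).
assert is_match(0.71, threshold=0.5)
```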
“I consider the first phase as a clear failure” — Florian Gallwitz, facial recognition expert at the Nuremberg Institute of Technology
The published results of various recent real-life cases have exposed its limits: When Welsh police trialed facial recognition during the June 2017 Champions League final in Cardiff, it wrongly identified 2,297 people — 92 percent of all hits — as suspicious. Similarly, software used by local police in Germany's second-largest city, Hamburg, to analyze video footage of riots surrounding last year's G20 meeting ended up identifying just three people.
Florian Gallwitz, a facial recognition expert at the Nuremberg Institute of Technology, described interim results presented by De Maizière in December 2017 as “not significantly better” than the results of a trial conducted more than 10 years earlier in the city of Mainz — despite the fact that technology has become “dramatically” more accurate at recognizing faces within the last decade.
“I consider the first phase as a clear failure,” he said.
One of the reasons, industry officials suggest, could be that they had to deploy their software under far-from-perfect conditions.
Police in Cardiff ahead of the 2017 Champions League final | Ben Stansall/AFP via Getty Images
An employee of a company involved in the trial, speaking on condition of anonymity, said the interior ministry had made it a precondition that the firms use only the existing video cameras inside the station, without installing new or additional ones. This, however, kept the software from being fully effective, the employee said: Cameras were too far away from the people they were supposed to identify. Their angles made it difficult to scan faces. And, in particular, many of them pointed into the light.
“We won't be able to fully exclude a certain margin of error,” conservative lawmaker Lindholz acknowledged. “But this margin of error has to be as small as possible, so that we can say that, after carefully considering all legal and security aspects, it is justifiable to introduce this instrument within a limited framework.”
She did not specify where the threshold would lie, saying it would have to be decided by parliament.
This article has been updated to clarify quotations attributed to Andrea Lindholz.