The UK Information Commissioner is “deeply concerned” about the inappropriate and reckless use of live facial recognition (LFR) technology in public spaces and notes that none of the organizations her office investigated has fully justified its use.
In a blog post published on June 18, 2021, Information Commissioner Elizabeth Denham said that while LFR technologies can “make aspects of our lives easier, more efficient and safer,” the risks to privacy increase when they are used to scan people’s faces in real time and in more public contexts.
“If sensitive personal information is collected on a large scale without people’s knowledge, choice or control, the impact could be significant,” wrote Denham, adding that while “it is not my job to endorse or prohibit a technology”, there is an opportunity to ensure its use does not expand without due regard for the law.
“Unlike CCTV, LFR and its algorithms can automatically identify who you are and infer sensitive details about you,” she said. “It can be used to instantly profile you, run personalized ads, or match your image with known shoplifters while doing your weekly grocery shopping.
“It is significant that none of the organizations involved in our completed investigations was able to fully justify the processing, and none of the systems that went live was fully compliant with the requirements of data protection law. All of the organizations chose to stop, or not proceed with, the use of LFR.”
Based on its interpretation of data protection law and six separate investigations into LFR by the Information Commissioner’s Office (ICO), Denham has also published an official Commissioner’s Opinion, which is intended to serve as a guide for companies and public organizations wishing to use biometric technologies.
“Today’s opinion sets out the rules of engagement,” she wrote in the blog post. “It builds on our opinion on the use of LFR by police forces and also sets a high threshold for its use.
“Organizations must demonstrate high standards of governance and accountability from the outset, including the ability to justify that the use of LFR is fair, necessary and proportionate in each specific context in which it is deployed. They must show that less intrusive techniques won’t work.”
In the statement, Denham noted that any organization considering deploying LFR in a public place must also conduct a Data Protection Impact Assessment (DPIA) to decide whether to proceed or not.
“This is because it is a type of processing that involves the use of new technology, and typically the large-scale processing of biometric data and systematic monitoring of public areas,” she wrote. “Even using LFR on a smaller scale in public places is a type of processing that is likely to meet the other triggers for a DPIA, as detailed in the ICO guidelines.
“The DPIA should begin early in the project, before decisions are taken on the actual use of LFR. It should run alongside the planning and development process. It must be completed before the processing begins, with appropriate reviews before each use.”
On June 7, 2021, Access Now and more than 200 other civil society organizations, activists, researchers and technologists from 55 countries signed an open letter calling for a legal ban on the use of biometric technologies in public spaces, whether by governments, law enforcement agencies or private actors.
“Facial recognition and related biometric recognition technologies have no place in public spaces,” said Daniel Leufer, Europe policy analyst at Access Now. “These technologies track and profile people as they go about their daily lives, treat them as suspects, and create dangerous incentives for overreach and discrimination. They must be banned here and now.”
In addition to a complete ban on the use of these technologies in publicly accessible spaces, the civil society coalition is also calling on governments around the world to stop all public investment in biometric technologies that enable mass surveillance and discriminatory targeted surveillance.
“Amazon, Microsoft and IBM have withdrawn from selling facial recognition technology to the police,” said Isedua Oribhabor, US policy analyst at Access Now. “Investors are demanding restrictions on the use of this technology. This shows that the private sector is aware of the dangers of biometric surveillance for human rights.
“But being aware of the problem is not enough – it is time to act. The private sector should fully address the impact of biometric surveillance by not developing or selling this technology in the first place.”
The European Data Protection Supervisor has also been highly critical of biometric identification technologies, previously calling for a moratorium on their use and now advocating a ban in public spaces.
Speaking at a CogX 2021 session on regulating biometrics, Matthew Ryder QC of Matrix Chambers said that while governments and corporations often claim to use the technology in limited, tightly controlled circumstances, with no retention or reuse of the data, the legislation often comes with a number of exceptions that allow just that.
“The solution to this may be much tougher rules than we would normally expect in a regulatory environment, because both governments and corporations are so adept at gaming the rules,” said Ryder, adding that while this may not be malicious, constant “stress tests” of the regulatory system can lead to use cases that “at first glance would normally not be allowed”.
He added that both regulators and lawmakers must establish clear “hard lines” for technology companies seeking to develop or deploy such technologies. “I would err on the side of tougher regulations that are then softened, rather than allowing a relatively permissive regulatory regime with many exceptions,” he said.