This is part of a two-part series on recent developments in the facial recognition technology business.
Regulators in Europe are stepping up their scrutiny of live facial recognition technology after recent events drew attention to the ways its use can exceed the scope of the region's data protection laws.
The European Union and the U.K. are separately exploring new guidelines for the tech, which pairs surveillance cameras with visual-recognition software so that captured images can be analyzed, for example by searching a database of suspects for matches, a practice known as remote biometric identification.
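Mechanically, remote biometric identification boils down to comparing a face "embedding" extracted from a camera frame against a gallery of enrolled embeddings and reporting the closest match above some similarity threshold. A minimal sketch of that matching step, assuming precomputed embeddings (the function name, toy dimensions and the 0.6 threshold are illustrative, not drawn from any vendor's system):

```python
from typing import Optional

import numpy as np

def identify(probe: np.ndarray, gallery: np.ndarray, names: list[str],
             threshold: float = 0.6) -> Optional[str]:
    """Return the name of the closest gallery match, or None if no
    enrolled face clears the similarity threshold."""
    # Normalize so a dot product equals cosine similarity.
    probe = probe / np.linalg.norm(probe)
    gallery = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    sims = gallery @ probe                 # one score per enrolled face
    best = int(np.argmax(sims))
    return names[best] if sims[best] >= threshold else None

# Toy 4-dimensional "embeddings"; real systems use hundreds of dimensions
# produced by a neural network, not random vectors.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(3, 4))
names = ["suspect_a", "suspect_b", "suspect_c"]
probe = gallery[1] + rng.normal(scale=0.05, size=4)  # noisy view of suspect_b
print(identify(probe, gallery, names))
```

The threshold is the policy-relevant knob: set it low and the system returns more matches, including false ones; set it high and it misses genuine matches. Much of the regulatory debate concerns who decides where that trade-off sits.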
Its use by U.S. law enforcement agencies during recent Black Lives Matter protests led to a public outcry, with firms including International Business Machines Corp., Amazon.com Inc. and Microsoft Corp. subsequently restricting or pausing sales of their facial recognition products.
Against this backdrop, regulators across the pond are likely to push for specific laws governing the use of live facial recognition, or LFR, lawyers told S&P Global Market Intelligence.
In Europe, remote biometric identification is currently banned under the General Data Protection Regulation, except in specific use cases, such as when it is of substantial public interest, a European Commission spokesperson said. Considered the most "intrusive" form of facial recognition, its use is also governed by human rights law, the spokesperson said.
The European Commission, the EU's executive body, is investigating possible gaps in regulation to identify "where additional rules would be necessary," the spokesperson said.
Building on its recent investigations into artificial intelligence, including a white paper on the development of trustworthy AI, the Commission will put forward proposals "by early 2021 at the latest," the spokesperson said.
The debate around the use of LFR by law enforcement is at the formative stage, according to Brussels-based data protection lawyer Nathalie Laneret, but is expected to lead to regulation at either the national or EU level.
"We are at the very early stages of this discussion, but we need to ensure that we have a nuanced approach based on risk assessment," said Laneret, director of privacy policy at law firm Hunton Andrews Kurth's Centre for Information Policy Leadership, or CIPL.
In its feedback on the EC's white paper, the CIPL urged the Commission to take "a less rigid" assessment of low- and high-risk AI applications. It is also pushing for a general accountability requirement from organizations that use AI and facial recognition tools. This would allow organizations to implement auditable measures, policies and procedures that meet the requirements of any potential regulation in the digital space, Laneret said.
The U.K., meanwhile, is preparing to leave the EU and carving its own path on LFR regulation.
The head of the country's data protection watchdog, Information Commissioner Elizabeth Denham, in October 2019 called on the government to introduce a statutory code of conduct on the use of the tech by law enforcement. Her statement came in response to a High Court ruling deeming the use of LFR by South Wales Police lawful, after activist Ed Bridges argued that having his face scanned violated his privacy and data protection rights.
Bridges is appealing the decision. Denham said the judgment should not be viewed as a "blanket authorization" for police forces to use face recognition systems in all circumstances because it concerned a specific deployment. Legal experts took a similar view, saying the case will be key to the introduction of a legally binding code of practice that determines how LFR can be used.
"I viewed the court's judgment as being narrow and not authorizing anything more than the use of LFR by the police in a targeted manner," Philip Chertoff of Harvard Law School said. "The police says there is no law that they can't do this, but that is different to a law saying that they can do this."
"The High Court decision made it pretty clear that it should not be taken as a wide approval of the unrestricted use of facial recognition by law enforcement," data privacy and technology lawyer Emma Wright, who sits on the board of the Institute of AI, said. "I think we need some sort of codification of when facial recognition can and can’t be used. I also think it has to have more substance than just a voluntary code."
Wright, who frequently meets with officials and legislators to discuss AI, said the government should bring forward a proposal on biometric data, including facial recognition. This could ultimately lead to a primary piece of legislation regulating its use, supported by a code of practice, she said.
The broader facial recognition market is estimated to reach $15.4 billion by 2024, according to Variant Market Research. While the proportion of sales attributed to law enforcement contracts is unknown — such agreements are often not made public — the sales figures of leading LFR supplier NEC Corp. give an idea of the size of the biometrics sector.
The Japanese company is targeting sales of more than $900 million in its biometrics and image analysis business for the fiscal year ending March 31, 2022, a company spokesperson said. NEC has roughly 1,000 biometrics contracts in 70 countries and supplies LFR tools to law enforcement agencies in the U.S., Canada, Australia and the U.K., the spokesperson said.