UK House of Lords Calls For Legislation on Facial Recognition Tech

The UK parliament’s upper chamber has said it is “deeply concerned” about unaccountable police use of live facial recognition (LFR) tech and called on the government to legislate.

Baroness Hamwee, chair of the House of Lords Justice and Home Affairs Committee, wrote to the home secretary on Friday to raise her concerns about the AI-powered technology.

“The committee accepts that LFR may be a valuable tool for police forces, but we are deeply concerned that its use is being expanded without proper scrutiny and accountability,” she wrote.

In 2020, the Court of Appeal ruled that South Wales Police had acted unlawfully in its use of LFR, breaching the privacy rights of claimant Ed Bridges. However, Hamwee argued that too many police forces are still using the technology and incorrectly citing the ruling as a legal basis for their pilots.

Read more on the South Wales Police case: Police Use of Facial Recognition Ruled Unlawful in World-First Case

“While we acknowledge that the police forces have updated their policies and procedures following the Court of Appeal judgment in Bridges in 2020, this turned on a narrow point on equalities, and in any event the government should not wait for the legality of LFR deployment to be tested again in the courts,” Hamwee continued.

“We believe that, as well as a clear, and clearly understood, legal foundation, there should be a legislative framework, authorised by parliament for the regulation of the deployment of LFR technology.”

The government has previously responded to the Lords’ concerns by asserting that police use of LFR has “strong public support” and a “sound legal basis” – two claims that Hamwee disputed.

“We note that, in contrast to the evidence we have received from the police, the Court of Appeal in the Bridges judgment expressed concerns about the ‘fundamental deficiencies’ in the current legal framework arguing that ‘too much discretion is currently left to individual police officers’ and that it is not ‘clear that there are any criteria for determining where AFR can be deployed,’” she wrote.

“We are concerned that the findings of the Bridges case were specific to that case and that the case cannot be understood as a clear basis for the use of LFR. Whatever the practice, it requires a firm foundation in primary legislation.”

The Lords Demand Government Action

The letter also:

  • Raised concerns about the way the technology is being expanded to new police forces, which appears to leave decisions about how LFR is implemented to the discretion of local officers
  • Called for the adoption of a national compulsory LFR training program and standards for England and Wales police forces
  • Raised concerns about who approves the creation of LFR “watchlists” of suspected criminals and called for “compulsory statutory criteria and standardized training”
  • Called for national regulation or guidelines governing the assessment of “extensive crowd-screening activity”
  • Called for standardization of pre-deployment communication
  • Demanded regular local assessments of public opinion about deployment of LFR
  • Argued that greater consideration should be given to how explainability can be embedded in the LFR system
  • Said that split-second LFR assessments increase the risk of human error
