The UK’s data protection watchdog has asked the government for urgent answers after a Home Office report revealed racial bias in the retrospective facial recognition (RFR) technology used by police.
Deputy Information Commissioner Emily Keaney said in a statement on Friday that the Information Commissioner’s Office (ICO) had asked the Home Office for “urgent clarity on this matter so we can assess the situation and consider our next steps.”
The National Physical Laboratory (NPL) report, released on Thursday, tested the Cognitec FaceVACS-DBScan ID v5.5 algorithm.
RFR is used to match images captured from CCTV, mobile phone footage, dashcams, video doorbells or social media against those held on the Police National Database.
An estimated 25,000 searches are run every month to catch potential criminals at large.
However, the report found that “in a limited set of circumstances the algorithm is more likely to incorrectly include some demographic groups in its search results.”
Specifically, false positive identification rates (FPIR) for white subjects (0.04%) are far lower than those for Asian subjects (4%) and black subjects (5.5%).
“The FPIR for black male subjects (0.4%) is lower than that for black female subjects (9.9%),” it added.
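FPIR is the proportion of searches for people who are not actually in the database that still return at least one candidate match. As a rough sketch of how such per-group rates are tabulated in testing, the snippet below computes FPIR by demographic group; the field names and sample data are illustrative assumptions, not drawn from the NPL report.

```python
from collections import defaultdict

def fpir_by_group(searches):
    """Compute false positive identification rate (FPIR) per demographic group.

    Each search record is a dict with:
      - "group": demographic label of the probe subject
      - "in_database": whether the subject is actually enrolled
      - "candidates_returned": number of candidates the algorithm returned
    FPIR = non-enrolled searches returning >= 1 candidate / all non-enrolled searches.
    """
    totals = defaultdict(int)
    false_hits = defaultdict(int)
    for s in searches:
        if s["in_database"]:
            continue  # FPIR is measured only on subjects not enrolled in the database
        totals[s["group"]] += 1
        if s["candidates_returned"] > 0:
            false_hits[s["group"]] += 1
    return {group: false_hits[group] / totals[group] for group in totals}

# Illustrative data only (not the NPL figures)
example = [
    {"group": "white", "in_database": False, "candidates_returned": 0},
    {"group": "white", "in_database": False, "candidates_returned": 0},
    {"group": "black", "in_database": False, "candidates_returned": 1},
    {"group": "black", "in_database": False, "candidates_returned": 0},
]
print(fpir_by_group(example))  # e.g. {'white': 0.0, 'black': 0.5}
```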
Keaney said the ICO acknowledges that measures are being taken by the Home Office to correct these biases.
“However, it’s disappointing that we had not previously been told about this, despite regular engagement with the Home Office and police bodies as part of our wider work to hold government and the public sector to account on how data is being used in their services,” she added.
“While we appreciate the valuable role technology can play, public confidence in its use is paramount, and any perception of bias and discrimination can exacerbate mistrust. The ICO is here to support and assist the public sector to get this right.”
Correcting Bias
The Home Office said in its report that it had purchased a new algorithm on the back of the test’s findings, adding that it “can be used at settings with no significant demographic variation in performance.”
It continued: “The new algorithm is due to be operationally tested early next year and will be subject to evaluation.”
The Association of Police and Crime Commissioners (APCC) also acknowledged that the government has introduced “mitigations” to ensure bias in the RFR system does not impact policing. However, like the ICO, it expressed concerns about transparency.
“Although there is no evidence of adverse impact in any individual case, that is more by luck than design. System failures have been known for some time, yet these were not shared with those communities affected, nor with leading sector stakeholders,” it said in a statement.
“These technologies are increasingly invasive and sophisticated. If they are to gain the trust and support of the public, particularly those groups historically mistrustful of the police, then full transparency is vital.”
The APCC said that such technologies need to be robustly and independently assessed before deployment, with ongoing oversight during operations and clear accountability to the public when things go wrong.
“We call on policing and the government to acknowledge the errors made and to work with those responsible for policing governance, locally and nationally, to ensure that scrutiny and transparency are at the heart of the police reform agenda and the forthcoming white paper. Policing cannot be left to mark its own homework,” it added.
