ICO Warns of "Immature" Biometric Tech

The UK’s data protection regulator has warned organizations using or developing “emotion analysis” technology to act responsibly or risk facing a formal investigation.

The Information Commissioner’s Office (ICO) issued the unusual statement yesterday, warning that immature algorithms unable to detect emotional cues accurately raise the risk of systemic bias, inaccuracy and discrimination, while also presenting data protection challenges.

Emotion analysis technology can monitor a user’s gaze, sentiment, facial movements and expressions, gait, heartbeat and even skin moisture for purposes such as health monitoring at work or registering students for exams, the ICO said.

This makes it even riskier than processing biometric data for identity verification, the regulator warned.

Deputy Commissioner Stephen Bonner said the biometrics and emotion AI market may never reach maturity and, in the meantime, presents data protection risks.

“While there are opportunities present, the risks are currently greater. At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgements about a person that are inaccurate and lead to discrimination,” he argued.

“The only sustainable biometric deployments will be those that are fully functional, accountable and backed by science. As it stands, we are yet to see any emotion AI technology develop in a way that satisfies data protection requirements, and have more general questions about proportionality, fairness and transparency in this area.”

The regulator said it would continue to engage with the market and explain the need to build security and data protection into products “by design.”

Its latest warning comes ahead of new guidance on biometric technologies set to be published in spring 2023, which will include advice on more common tools such as facial, fingerprint and voice recognition.