ICO launches investigation into facial recognition technology

The Information Commissioner’s Office (ICO) is concerned about the threat the technology poses to people’s privacy.

Joe Lepper | 16 August 2019

The Information Commissioner, Elizabeth Denham, has launched an investigation into the use of facial recognition technology to scan the general public “as they go about their daily lives”.

The investigation has been launched following concerns raised in the media about the use of such technology in the King’s Cross area of central London.

> See also: How to design digital ethical principles for your charity

The Financial Times reported that facial recognition technology was being used at King’s Cross station. The station’s developer, Argent, told the paper that the technology included measures to protect the privacy of the public.


Threat to privacy

Denham said: “Scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all. That is especially the case if it is done without people’s knowledge or understanding.

“I remain deeply concerned about the growing use of facial recognition technology in public spaces, not only by law enforcement agencies but also increasingly by the private sector. My office and the judiciary are both independently considering the legal issues and whether the current framework has kept pace with emerging technologies and people’s expectations about how their most sensitive personal data is used.

> See also: What are ‘digital ethics’ and why should charities care?

“Facial recognition technology is a priority area for the ICO and, when necessary, we will not hesitate to use our investigative and enforcement powers to protect people’s legal rights.”

She added: “We have launched an investigation following concerns reported in the media regarding the use of live facial recognition in the King’s Cross area of central London, which thousands of people pass through every day.

“As well as requiring detailed information from the relevant organisations about how the technology is used, we will also inspect the system and its operation on-site to assess whether or not it complies with data protection law.


Legal, proportionate and justified

“Put simply, any organisations wanting to use facial recognition technology must comply with the law – and they must do so in a fair, transparent and accountable way. They must have documented how and why they believe their use of the technology is legal, proportionate and justified.

“We support keeping people safe but new technologies and new uses of sensitive personal data must always be balanced against people’s legal rights.”

Facial recognition technology is increasingly being used by charities and other organisations. Last year the Dyslexia Association used such software to better understand people with dyslexia.

> See also: Dyslexia Association digital street campaign goes live

Also last year, the ICO offered grants of between £20,000 and £100,000 to promote independent and innovative research on privacy and data protection issues. These included biometric and facial recognition technology, age verification and artificial intelligence.

Argent added: “King’s Cross is working collaboratively with the Information Commissioner’s Office on the inquiry it has announced, and will comment further in due course.”
