How is facial recognition technology being used?

Facial recognition technology is available to everyone with a Facebook account and it is being used globally.

Raabia Fazil | 24th Oct 19

Facebook is extending facial recognition to every user. That means 360 million people will be able to scan their friends, family and everyone else. You’ll get a notification asking you to enable it; the feature is turned off for everyone apart from those who were in the Tag Suggestions trial. “We’ve continued to engage with privacy experts, academics, regulators and people on Facebook about how we use face recognition and the options you have to control it,” the company said. So you can manage your privacy and control whether you get scanned while scrolling through your newsfeed.

Facial recognition has been used everywhere from Hong Kong to King’s Cross. However, the tech has its limitations. Most computer programmes follow instructions written specifically to complete a task, but facial recognition operates differently: the system is trained on millions of people’s faces to learn which facial features and details indicate a likeness. A human then assesses the matches and the code is amended to fix errors, but precision comes slowly and incorrect matches happen regularly.
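To see why matching is probabilistic rather than exact, here is a minimal sketch of the comparison step. It assumes (as systems of this kind commonly do) that a trained model has already converted each face photo into a numeric vector, so that photos of the same person land close together; the vectors, the 0.6 tolerance and the names below are all illustrative, not any vendor’s actual values.

```python
import numpy as np

def face_distance(known_encoding, candidate_encoding):
    """Euclidean distance between two face embeddings.

    Real systems map each face to a vector (often ~128 dimensions)
    learned from millions of example faces; this function only
    compares the resulting vectors.
    """
    return float(np.linalg.norm(known_encoding - candidate_encoding))

def is_match(known_encoding, candidate_encoding, tolerance=0.6):
    """Declare a match when the distance falls below a tuned threshold.

    The tolerance is illustrative; in practice it is adjusted after
    humans review the false matches the system produces.
    """
    return face_distance(known_encoding, candidate_encoding) < tolerance

# Toy 4-dimensional "embeddings" standing in for real high-dimensional ones.
alice_reference = np.array([0.1, 0.8, 0.3, 0.5])
alice_new_photo = np.array([0.12, 0.79, 0.31, 0.48])  # same person, slight variation
bob_photo = np.array([0.9, 0.1, 0.7, 0.2])            # a different person

print(is_match(alice_reference, alice_new_photo))  # True
print(is_match(alice_reference, bob_photo))        # False
```

Because every decision reduces to “is this distance small enough?”, there is always a grey zone where two different people sit closer together than two photos of the same person, which is why human review of matches remains essential.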

> See also: Artificial Intelligence: the future of the charity sector

You are already using facial recognition tech on your smartphone: apps can pick out familiar faces from well-lit photos that clearly show facial features. It’s essentially the same tech you use to add flower crowns to selfies. The task becomes much harder when surveying moving crowds on CCTV footage, where the camera may only catch part of someone’s face. Matches can still be made, but not always accurately: the tech can mistake women with darker skin for men and can fail to accurately identify BAME individuals.

The Metropolitan Police has spent over £200,000 on facial recognition technology. The London Policing Ethics Panel has examined the use of live facial recognition and warns that the tendency to misidentify people, particularly women and BAME individuals, could undermine police work. The panel is also concerned about the tech restricting individual freedoms, and about the repercussions people may face simply for being in a database. To be reliable and ethical, the technology needs to deliver significant, demonstrable advantages.

With cameras installed in public places, concerns have been raised about civil liberties. The campaign group Big Brother Watch states that the current use of facial recognition tech by the police breaches human rights. Its director Silkie Carlo says: “Police have used live facial recognition lawlessly, misidentifying people over 90 per cent of the time, with many of those being wrongly stopped and subject to further police intrusion.” With the tech now widely available, how it is used depends largely on the discretion of those deploying it.

> See also: How charities are innovating with digital outdoor advertising

But you don’t need to be alarmed by cameras zooming in on you. The Information Commissioner, Elizabeth Denham, has begun an inquiry into the use of facial recognition technology in response to worries about people being recorded at King’s Cross. The site’s developer told the Financial Times that the technology contains processes to maintain the privacy of individuals.
Denham said: “Scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all… As well as requiring detailed information from the relevant organisations about how the technology is used, we will also inspect the system and its operation on-site to assess whether or not it complies with data protection law.” With accountability for how facial recognition technology is used, it can be regulated like any other process involving the public.

You could even harness facial recognition technology to help your charity: last year, the Dyslexia Association used it to raise understanding of what life with dyslexia is like. The ICO has also offered grants of between £20,000 and £100,000 to promote independent research on privacy and data protection issues, covering biometrics and facial recognition, age verification and artificial intelligence.