Earlier this month, The Intercept published a leaked document outlining the worries of US agencies that face masks, now commonplace due to the COVID-19 pandemic, could ‘break’ facial recognition. The document, released as part of the BlueLeaks hack, emphasises the possibility of ‘violent adversaries’ using protective masks to evade biometric identification algorithms. Commentators on both sides of the debate have since raised concerns about how the wearing of masks could affect the accuracy of the technology.
This is not the first time I have encountered the subject of facial recognition technology. At the close of 2019, I was invited to the annual Longford Lecture, ‘Where Next for Policing and Criminal Justice’, held in Westminster, London. The lecture was given by Ian Blair, Commissioner of the Metropolitan Police until 2008 and a member of the House of Lords. Lord Blair expressed concerns about a ‘tattered’ justice system, and spoke about police use of body cameras and tasers, but, somewhat surprisingly, made no mention of the emerging use of facial recognition technology.

Facial recognition, a form of biometric identification, was developed in the 1960s by Panoramic Research, an American company allegedly involved in the Central Intelligence Agency’s ‘LSD mind-control’ project. The technology scans the faces of citizens, creating unique biometric maps, and compares the results against images of suspected criminals. The software analyses distinguishable landmarks on a person’s face, in particular pupillary distance and cheekbone shape. In January this year, after years of trials in London and South Wales, the Metropolitan Police officially deployed the technology in locations ‘where serious offenders would most likely be located’. The force has repeatedly defended its use of this surveillance technology, with high-profile trials taking place at Westfield Shopping Centre, Stratford, and at Notting Hill Carnival, one of London’s most ethnically diverse events. Whether the general public supports the initiative remains unclear, with new and conflicting survey results emerging regularly. To explore contrasting positions on the matter, I spoke with three informed parties: Ch. Supt. Paul Griffiths, President of the Police Superintendents’ Association; Richard Lewis, a recently retired recipient of the Queen’s Police Medal; and Big Brother Watch, a civil liberties campaign group chaired by Lord Strasburger of Langridge.
‘The public needs to be kept safe,’ Ch. Supt. Griffiths told me firmly. ‘And that is achieved through the use of CCTV, ANPR, speed cameras, and other surveillance technologies.’ I mentioned that, according to Duke University, effective facial recognition technology can prevent false arrests by quickly and accurately identifying faces. ‘It certainly isn’t the only method we rely on,’ he replied, ‘and we need to be satisfied that the use of any data can support the police in their goals.’ Early detection of wanted individuals allows officers to scramble resources to protect themselves and the public, possibly saving lives; officers can therefore spend their time maintaining order on the streets instead of searching aimlessly for wanted suspects. Ch. Supt. Griffiths explained that police forces should embrace developments in technology, but only where their use is necessary and proportionate, and emphasised that the police operate under scrutiny, accountability and oversight when handling personal data.
Richard Lewis QPM, the recently retired Deputy Chief Constable of South Wales Police, took a similar view to the PSA President. ‘Facial recognition can be a powerful technology for crime detection and prevention,’ he told me, adding, ‘when used appropriately.’ To support his position, Mr Lewis sent me a ‘factsheet’ produced by South Wales Police, according to which there is no evidence of gender or racial bias when the technology is used. In 2019, facial recognition led to twenty-two arrests and disposals at Welsh music and sporting events.
According to a study by Monash University, Australia, police in the United Kingdom are using the technology in a ‘legal vacuum’, and the study describes it as ‘particularly intrusive’. Academics from across the Commonwealth have raised concerns about police use of facial recognition, with a recent Northumbria University press release declaring ‘an urgent need for reflection on the potential social harms that emerge from the use of live facial recognition’. I spoke briefly with a representative of Big Brother Watch, which claims the Metropolitan Police is the largest police force outside of China to use what it calls the ‘authoritarian mass surveillance tool’ of facial recognition. Public spaces are being turned into biometric surveillance zones without any clear legal basis or authority, the representative added, and the technology can be used in a biased manner, targeting people of a particular ethnicity or demographic.
Opponents of the technology argue that it puts citizens’ privacy at risk, and note that those accused of crimes still have rights. In a free society, individuals are supposed to have a choice in matters of consent; with facial recognition, however, permission can be simply non-existent. On the other hand, the Law Society in Britain has observed that facial recognition has already been used to staggeringly positive effect in India, where 3,000 missing children were located in just four days after photos were provided by parents. In England and Wales, the police’s technology is still in a developmental stage, with leading experts from three universities working with the Home Office to address poor recognition accuracy. It is estimated that by 2024, the global facial recognition market will generate £5.5 billion in revenue.