Facial recognition technology that has been used hundreds of times a day by British police forces is biased against minority groups and women, the Home Office has acknowledged.
Official research showed that the technology identified the wrong person about 100 times more often for Asian and Black people than for white people, and more than twice as often for women as for men.
The findings have raised concerns about the government’s desire to expand the use of facial recognition technology, which policing minister Sarah Jones on Thursday called “the biggest breakthrough for catching criminals since DNA matching”.
Ministers also want to deploy related technologies “designed to identify specific motions or emotions” that would “help police spot behaviour associated with criminal activity”.
Retrospective facial recognition searches, where police match images from CCTV or mobile phones against watch lists on their national database, are now carried out more than 25,000 times a month.
However, documents published by the Home Office this week acknowledged that the technology currently in use “is more likely to incorrectly include some demographic groups in its search results”.
Testing at the National Physical Laboratory found the software correctly identified Asian suspects 98 per cent of the time, falling to 91 per cent for white suspects and 87 per cent for Black suspects.
However, the false positive rate, where people were wrongly flagged as matches to suspects, was far lower for white people, at 0.04 per cent. This rose to 4 per cent for Asian people and 5.5 per cent for Black people.
For men the false positive rate was 1.5 per cent, compared to 3.8 per cent for women. False positives were highest for Black women, at 9.9 per cent, but close to zero for white men.
The algorithm was also far better at correctly identifying people over the age of 41, with a 0.1 per cent false positive rate, compared to 5.2 per cent for people in their 20s.
Retrospective facial recognition is widely used by police forces across Britain and is described by the Home Office as a “common feature of police investigations”.
Live facial recognition, where camera feeds are matched against a database in real time, is currently used by 13 police forces in England and Wales, with Police Scotland considering its introduction.
The Association of Police and Crime Commissioners said the report “sheds light on a concerning inbuilt bias” and that “it seems clear that technology has been deployed into operational policing without adequate safeguards in place”.
It added that the lack of any clear harm resulting so far was “more by luck than design” and criticised ministers for not making the problems public sooner.
The civil rights group Liberty said the findings suggested that thousands of Black and Asian people had been wrongly flagged during the seven years in which facial recognition technology has been used in Britain.
The Home Office said a new algorithm had been developed that showed no bias and would be introduced next year “subject to evaluation”.
It added that guidance had been issued to officers reminding them not to rely on facial recognition alone in making decisions about matters such as arrests.
“The Home Office takes the findings of the report seriously and we have already taken action,” a spokesperson said.
“There is human involvement in every step of the process and no further action would be taken without trained officers carefully reviewing results.”
