Key Takeaways
- Facial recognition technology used by British police forces is biased against minority groups and women
- The technology wrongly identifies Asian and Black people 100 times more often than white people
- Women are twice as likely as men to be misidentified
- The false positive rate for Black women is 9.9%, compared to near zero for white men
- The government plans to expand the use of facial recognition technology despite concerns about its accuracy and bias
Introduction to Facial Recognition Technology
Facial recognition technology is now in routine use by British police forces, which carry out more than 25,000 retrospective facial recognition searches every month. The technology has been hailed as a breakthrough in catching criminals, with policing minister Sarah Jones comparing it to DNA matching. However, a recent Home Office report has revealed that the technology is biased against minority groups and women, raising concerns about its accuracy and potential for misuse.
Bias in Facial Recognition Technology
The report found that the technology used by police forces is more likely to wrongly identify people from certain demographic groups, in particular Asian and Black people and women. The false positive rate, the proportion of people wrongly flagged as matches to suspects, is significantly higher for these groups: 9.9% for Black women, compared with near zero for white men.
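To make those figures concrete: a false positive rate is the share of comparisons involving people who are not actually the sought suspect that the system nonetheless flags as matches. The short Python sketch below shows how such a rate would be computed per demographic group; the counts are invented for illustration and are not taken from the Home Office report.

```python
# Hypothetical illustration of per-group false positive rates.
# The counts below are invented for demonstration purposes; they
# are NOT figures from the Home Office report.

# For each group: (comparisons against non-matching people,
#                  of which the system wrongly flagged as matches)
comparisons = {
    "white men":   (10_000, 2),    # near-zero false positive rate
    "Black women": (10_000, 990),  # roughly the 9.9% rate cited
}

for group, (non_matches, false_flags) in comparisons.items():
    fpr = false_flags / non_matches
    print(f"{group}: false positive rate = {fpr:.1%}")
```

Disparities like these compound at scale: at tens of thousands of searches a month, even a false positive rate of a few percent for one group translates into large numbers of innocent people being flagged.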
Concerns about the Use of Facial Recognition Technology
The findings have raised concerns about the government’s plans to expand the use of facial recognition technology. The Association of Police and Crime Commissioners has criticized the lack of safeguards, saying that the technology was deployed into operational policing without adequate consideration of its potential biases. The civil rights group Liberty has warned that, over the seven years the technology has been in use in Britain, thousands of Black and Asian people may have been wrongly flagged.
Response to the Report
The Home Office has acknowledged the findings of the report and has said that it takes the issue seriously. A new algorithm has been developed which is said to show no bias and will be introduced next year, subject to evaluation. The Home Office has also issued guidance to officers reminding them not to rely on facial recognition alone in making decisions about matters such as arrests. However, critics argue that more needs to be done to address the issue of bias in facial recognition technology and to ensure that its use is transparent and accountable.
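The report does not spell out how the new algorithm will be evaluated, but one plausible acceptance check, sketched below with entirely hypothetical thresholds and function names, would require that false positive rates measured on a test set be both low overall and roughly equal across demographic groups.

```python
# Hypothetical evaluation gate. The threshold values, function name,
# and example measurements are illustrative assumptions, not anything
# specified by the Home Office.

def passes_bias_check(fpr_by_group: dict[str, float],
                      max_fpr: float = 0.001,
                      max_disparity: float = 1.5) -> bool:
    """True if every group's false positive rate is below max_fpr and
    the worst-off group's rate is at most max_disparity times the
    best-off group's rate."""
    rates = list(fpr_by_group.values())
    floor = 1e-9  # guard against division by zero when a rate is 0
    disparity = max(rates) / max(min(rates), floor)
    return max(rates) <= max_fpr and disparity <= max_disparity

# Invented measurements for demonstration:
print(passes_bias_check({"group A": 0.0004, "group B": 0.0005}))  # True
print(passes_bias_check({"group A": 0.0004, "group B": 0.0990}))  # False
```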
The Future of Facial Recognition Technology
The use of facial recognition technology is likely to remain contentious in the coming years. While it has the potential to be a powerful tool for catching criminals, its accuracy and demographic bias must be weighed carefully. The government must ensure that the technology is used in a way that is fair, transparent, and accountable, with safeguards in place to prevent misuse. That may require further research into the biases of face-matching algorithms, as well as new algorithms and clear operational guidelines.
Conclusion
The Home Office report underscores the need for caution and careful scrutiny in the use of facial recognition technology. Its promise as a crime-fighting tool does not excuse demographic disparities on the scale documented here, and the burden now falls on the government to show that the replacement algorithm, and the safeguards around it, can deliver a system that promotes justice and equality rather than perpetuating existing biases and inequalities.