Key Takeaways:
- The UK Home Office has admitted that facial recognition technology is more likely to incorrectly identify black and Asian people than their white counterparts.
- The technology has been found to have a higher false positive identification rate for black and Asian subjects, particularly black women.
- The Association of Police and Crime Commissioners has raised concerns about the inbuilt bias in the technology and the lack of adequate safeguards.
- The government has announced a 10-week public consultation on the use of facial recognition technology, including its potential expansion to access other databases.
- Civil liberties groups and politicians are calling for stronger safeguards and more transparency in the use of facial recognition technology.
Introduction to Facial Recognition Technology
Facial recognition technology has been hailed as a breakthrough in law enforcement, allowing police to quickly and easily identify suspects and track down criminals. However, a recent report by the National Physical Laboratory (NPL) has raised concerns about the technology’s accuracy and potential bias. The report found that the technology is more likely to incorrectly identify black and Asian people than their white counterparts, particularly in certain settings. This has led to calls for stronger safeguards and more transparency in the use of facial recognition technology.
The NPL Report
The NPL report found that the false positive identification rate (FPIR) was significantly lower for white subjects than for Asian and black subjects: 0.04% for white subjects, compared with 4.0% for Asian subjects and 5.5% for black subjects. The rate for black women was particularly high, at 9.9%. These findings have raised concerns that the technology could operate in a discriminatory manner.
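To make the quoted percentages concrete, a minimal sketch of what they imply in practice (this is an illustration using the article's figures, not the NPL report's own methodology; the crowd size is hypothetical). FPIR is the fraction of people not on a watchlist who are nonetheless flagged as a match:

```python
# False positive identification rates as quoted from the NPL report.
fpir = {
    "white subjects": 0.0004,  # 0.04%
    "Asian subjects": 0.040,   # 4.0%
    "black subjects": 0.055,   # 5.5%
    "black women": 0.099,      # 9.9%
}

crowd_size = 10_000  # hypothetical number of innocent passers-by scanned

# Expected number of innocent people wrongly flagged, per group, at the
# reported rates.
for group, rate in fpir.items():
    expected_false_matches = rate * crowd_size
    print(f"{group}: ~{expected_false_matches:.0f} false matches "
          f"per {crowd_size:,} people scanned")
```

At these rates, a deployment scanning 10,000 people would be expected to wrongly flag roughly 4 white subjects but around 550 black subjects, which is the disparity driving the concerns described below.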
Concerns About Bias and Safeguards
The Association of Police and Crime Commissioners has expressed concern about inbuilt bias in the technology and the lack of adequate safeguards. It has questioned why the findings of the NPL report were not released earlier and why they were not shared with black and Asian communities, and it has called for greater transparency and oversight in the use of facial recognition technology. Civil liberties groups such as Liberty have also warned that the technology could be used to discriminate against certain groups.
Government Response
The government has announced a 10-week public consultation on the use of facial recognition technology, including its potential expansion to other databases. The consultation will ask the public whether police should be able to search additional databases, such as passport and driving licence images, to track down criminals. The government has said it takes the findings of the NPL report seriously and has already acted on them: a new algorithm, which independent testing found to have no statistically significant bias, has been procured and will undergo further testing and evaluation early next year.
Calls for Stronger Safeguards
Despite the government’s response, many are calling for stronger safeguards and greater transparency in the use of facial recognition technology. The former cabinet minister David Davis has warned that the technology could be used to create a "big brother" state, and civil liberties groups want more robust protections for individual rights. The Home Office has said that manual safeguards are in place, including a requirement that every potential match be visually assessed by a trained user and an investigating officer. Critics argue, however, that these measures are insufficient to address the potential bias in the technology.
Conclusion
The use of facial recognition technology is a complex and contentious issue. While it has the potential to be a powerful tool in law enforcement, it also raises concerns about bias and discrimination. The findings of the NPL report have highlighted the need for stronger safeguards and more transparency in the use of facial recognition technology. The government’s response to these concerns will be crucial in determining the future of this technology and ensuring that it is used in a way that respects the rights of all individuals.