Facial recognition systems and error rates – is this a concern?

Biometrics Institute – Thu 24 May 2018

There have been several articles (see below) recently raising concerns over the accuracy of face recognition technology, as well as a report from Big Brother Watch questioning the reliability of facial recognition systems (https://bigbrotherwatch.org.uk/all-media/dangerous-and-inaccurate-police-facial-recognition-exposed-in-new-big-brother-watch-report/). The Institute has discussed this issue with its Expert Groups, noting the importance of education and information on the responsible use of biometrics. Here is what they explained to us.

The relative accuracy of a face recognition watchlist system can be understood in terms of the false identification rate (the chance of identifying the wrong person) and the false non-identification rate (the chance of missing the correct person). Factors such as image quality and database size, among many others, affect accuracy. The Big Brother Watch report does not contain sufficient data to assess the true accuracy of the systems, and the figures suggesting inaccuracies of 98% and 91% are likely a misunderstanding of the statistics; they are not verifiable from the data presented.
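One common source of such misunderstandings is conflating the proportion of alerts that turn out to be wrong with the system's per-comparison error rate. The sketch below uses entirely hypothetical numbers (none drawn from the Big Brother Watch report or any deployed system) to show how a system with a very low false positive rate can still produce mostly false alerts when watchlist members are rare in the scanned crowd:

```python
def alert_stats(crowd_size, watchlist_present, tpr, fpr):
    """Return (true_alerts, false_alerts, false_discovery_rate).

    crowd_size        -- total faces scanned
    watchlist_present -- how many scanned people are genuinely on the watchlist
    tpr               -- chance the system alerts on a watchlist member
    fpr               -- chance the system falsely alerts on anyone else
    """
    true_alerts = watchlist_present * tpr
    false_alerts = (crowd_size - watchlist_present) * fpr
    # Fraction of all alerts that are wrong -- the figure often
    # (mis)quoted as the system's "inaccuracy".
    fdr = false_alerts / (true_alerts + false_alerts)
    return true_alerts, false_alerts, fdr

# Hypothetical scenario: 100,000 faces scanned, 50 watchlist members
# present, 90% detection rate, 0.1% false alert rate per non-match.
true_a, false_a, fdr = alert_stats(100_000, 50, 0.90, 0.001)
print(true_a, false_a, round(fdr, 2))  # 45.0 99.95 0.69
```

Here roughly 69% of alerts are false despite a per-comparison false positive rate of only 0.1%, simply because genuine matches are so rare. This is why headline percentages computed from alert outcomes alone cannot establish a system's underlying accuracy.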

Most facial recognition system (FRS) applications will generate some level of misses, and it is therefore the responsibility of the user, and of any governing or advisory body overseeing the introduction of such technology, to conduct a proper evaluation and a privacy impact assessment before deployment. This includes understanding and testing the accuracy of the application before and during operational use. The system must also function in a way that is lawful, necessary and proportionate. This applies to the capturing and storing of images in the database as well as to the actions taken when a potential match is made by the system. Verification of a match is a crucial process in any biometric system and must be conducted rigorously to filter out false positives before further action is taken.

The Biometrics Institute is proactively working on good practice for the use of biometrics and has developed several good-practice guides for its members. For more information, please visit our website: www.biometricsinstitute.org

Some additional notes:

* Refer to the NIST FIVE (Face in Video Evaluation) report for independent accuracy assessments of live face recognition systems.

* FRS is a detection tool that supplements the essential police work of protecting the community.

* It is a tool that should be operated by trained face comparers, who will minimize false matches regardless of the performance of the system.

Articles on the issue:

* Facial recognition technology is “dangerously inaccurate”

* Welsh police facial recognition has 92% fail rate, showing dangers of early AI

* http://www.wired.co.uk/article/face-recognition-police-uk-south-wales-met-notting-hill-carnival

* The UK ICO blog on this: https://iconewsblog.org.uk/2018/05/14/facial-recognition-technology-and-law-enforcement/


Lead the debate with us on the responsible use of biometrics.