Who doesn’t have my data?
How much data do we need to collect? Who is watching us? And who is watching the police?
May 2018 – Some thought-provoking questions have crossed my desk in the last few weeks, no doubt stimulated by recent news headlines on the failure rates of facial recognition technology (FRT) used by police and the Cambridge Analytica data breach. In Australia, the Australian Capital Territory has said that proposed FRT legislation exceeds what was agreed. I’ve also just discovered that Amazon is selling real-time facial recognition services to the police. It feels like our data is everywhere and our faces have become a hot commodity, seemingly anyone’s property but our own. Where does it all stop?
I recently had a discussion with a colleague regarding privacy and the law, which posed a very important question: are authorities striking the right balance between using biometrics to counter terrorism and increase security, whilst maintaining a citizen’s right to privacy? Is it even possible to achieve both simultaneously? Generally speaking, law enforcement agencies are not restricted by the same privacy legislation as other organisations that acquire our personal data; but if this is the case, who is regulating them?
The UK actually has two important regulators in place to address this challenge: the Information Commissioner, Elizabeth Denham, and the Biometrics Commissioner, Paul Wiles. Elizabeth Denham posted an interesting blog on ‘Facial Recognition Technology and Law Enforcement’, indicating that she is making FRT used by law enforcement a priority area for her office. I recently met with Paul Wiles and we had a fascinating conversation about the challenges biometrics pose for collection, storage, retention and data sharing. It is interesting to note, however, that FRT is outside his remit, as his focus is on fingerprints and DNA.
Scotland is also addressing potential areas of concern: a report published earlier this year by an independent advisory group called for the establishment of a code of practice to cover the acquisition, retention and disposal of biometrics such as fingerprints, DNA and photographs, as well as the creation of a new biometrics commissioner.
This discussion around the strengths and weaknesses of using biometric technology and the importance of privacy and human rights forms a critical part of the Compendium (the ‘United Nations Compendium of Recommended Practices for the Responsible Use and Sharing of Biometrics in Counter-Terrorism’) that the Biometrics Institute is working on with the United Nations Counter-Terrorism Executive Directorate (UN CTED). If we are using biometrics to counter terrorism, then we need to acknowledge that biometric technologies are not perfect; there is always a risk that an innocent person may be matched with someone on a watchlist. And removing yourself from a watchlist you don’t belong on is not always an easy or quick task.
There is clearly also a need for a discussion around ‘oversight’ when it comes to using biometrics, not only in the private sector but also in the public sector. Just consider that within the UN and INTERPOL there are more than 190 countries that vary enormously in the privacy legislation they currently have in place and in their use of biometric technologies.
I’ve also been thinking about whether, without a proportionality scheme, there is a risk that FRT could become a ‘mass screening tool’, which poses its own set of questions around ethics and intended purpose. I’ve had several conversations with members and industry experts recently who were all asking whether someone in the industry needs to draw a line in terms of where, when and how much we use biometric technology. Currently there appears to be no limit. FRT was recently used to identify celebrities at Prince Harry’s wedding. Sounds harmless, but is some of this technology (which is not even being used to prevent crime) an unnecessary invasion of privacy?
And could the challenges around False Acceptance Rates (FAR) result in many innocent people being stopped (which could be quite intimidating or embarrassing if you are just having a nice day out with family or friends)? Grounds for suspicion based on dodgy algorithms rather than reasonable human intelligence sounds like a recipe for disaster.
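To see why this matters at crowd scale, here is a rough back-of-the-envelope sketch of the base-rate problem. All the figures below are illustrative assumptions, not data from any real deployment: even a seemingly low FAR can produce far more false alarms than genuine matches when almost everyone scanned is innocent.

```python
# Illustrative base-rate arithmetic for face matching at a public event.
# Every number here is an assumption chosen for the sake of the example.

crowd_size = 100_000      # faces scanned at a large event (assumed)
watchlist_present = 10    # people in the crowd actually on a watchlist (assumed)
far = 0.001               # false acceptance rate: 0.1% of innocent faces wrongly matched (assumed)
frr = 0.05                # false rejection rate: 5% of genuine matches missed (assumed)

innocent = crowd_size - watchlist_present
false_alarms = innocent * far                 # innocent people flagged for a stop
true_alarms = watchlist_present * (1 - frr)   # genuine watchlist matches caught

# Of everyone the system flags, what fraction is actually on the watchlist?
precision = true_alarms / (true_alarms + false_alarms)

print(f"False alarms: {false_alarms:.0f}")
print(f"True alarms:  {true_alarms:.1f}")
print(f"Share of stops that are genuine matches: {precision:.1%}")
```

Under these assumed figures, roughly a hundred innocent people would be flagged to find fewer than ten genuine matches, so only around one stop in eleven would involve someone actually on the watchlist. The exact numbers depend entirely on the assumed rates, but the shape of the problem is general: when the watchlist population is a tiny fraction of the crowd, false alarms dominate.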
Arun Ross, a member of our Academic Research and Innovation Expert Group, kindly shared some of his comments on FRT and privacy which can be viewed in a brief video. He talks about the risks of Presentation Attacks and the potential for data abuse in FRT. Arun’s work on enhancing the privacy of facial templates received the Best Paper Award (Silver) at the 2018 International Conference on Biometrics in the Gold Coast, Australia. The paper can be accessed here and will be added to the Biometrics Institute Academic Papers Reading list.
On the positive side, biometric identifiers can help to uniquely identify humans, and they can contribute towards identity assurance and identifying known terrorists and criminals. They are automated processes and therefore offer a convenient solution. However, as with most security technologies, they do have vulnerabilities, and the results of comparisons are probabilistic in nature, not determinative. Technology risk (the risk that the technology will not perform as expected or fulfil the tasks required of it) and human-associated risk (the risk that users will not adapt sufficiently to the new technology because of poor training, lack of support, or because they do not understand or have not experienced its benefits) can also contribute to poor results. There are mitigations available for some of the problems that will invariably occur; for example, a number of presentation attack detection systems are available. However, it is important to remember that there simply are no silver bullets. Biometric solutions are complex; they demand capacity and technical capability from those implementing them.
The Biometrics Institute will continue to stress the importance of the responsible use of biometrics and bring together the thought-leaders in this space to debate these questions.
Our Biometrics Congress week in London (15-19 October 2018) will provide an important platform for many thought-provoking discussions and we will continue to drive the questions around privacy with a virtual meeting on ‘Biometrics and GDPR’.