With January behind us, I thought it a good time to update you on the activity in my inbox over the last few weeks. I am very fortunate to regularly meet and receive emails from passionate people who support the institute in its promotion of the responsible and ethical use of biometrics. These experts from around the world help sharpen my thinking so that I can propose ideas and projects to the board, all with the aim of guiding members to get biometrics right. Far from a gentle easing into the new year, we’ve been as busy as ever!
In brief, we’ve just released the details of a fantastic members-only opportunity to workshop our ground-breaking Biometrics Institute Good Practice Framework at four locations around the world, and these limited seats are filling up. You can find out how to get involved below.
Just before Christmas, I met with the UK Information Commissioner’s Office (ICO) to discuss how this exciting new good practice tool may be able to help shape guidance for the responsible use of facial recognition software. More on this shortly.
At the same time, NIST released its third report on demographic effects in facial recognition which many of you have contacted me about. I offer some commentary on that below.
There’s still so much confusion about this so-called “bias” – even within our industry – that last week we recorded the second episode of our podcast on the subject, which I’m glad so many of you have downloaded. Three of our expert group members explain the issues from all angles, unpack the NIST report’s findings and discuss what we can collectively do to avoid a ban. Please do have a listen and share it with your colleagues or customers so we can tackle misinformation around the subject.
An important date for your diary: the always fascinating Patrick Grother, lead author of the NIST report, will be at our US Conference in Washington DC speaking about demographic effects on 24-25 March.
And the time has come for our biennial members survey – this is your chance to let us know how we’re doing so we can help you get the most out of your membership. You could win a £50 (or equivalent) Amazon voucher by taking part! Find out how below.
New Biometrics Institute good practice tool
I have noticed that not all our members are yet aware of the project we are currently working on, the Good Practice Framework. I believe this framework will be a significant milestone in the institute’s history, but most importantly it will be extremely useful for our members in their decision-making and implementation of biometrics.
Are you always certain where to start on your biometrics journey? How do you ensure you cover every required step, from policy and process to the selection of the technology? How do you decide what may not be required as part of the implementation? If I install a biometric door lock to access our office, this is a very simple implementation. However, if I am looking at border management systems that involve cross-border data exchanges, there are many complex issues to address.
We presented the first draft of the framework at the Congress in London in October and I briefed members in Canberra and Wellington in November and December respectively. We are now running a series of consultative workshops around the world. For this we need people with a range of biometrics expertise who are prepared to roll up their sleeves and get involved. This will be a hands-on experience in which you will be divided into small groups, each using a section of the framework to work through a real-life scenario. Guided by a moderator, the groups will then come back together and present their findings so that everyone can see the whole picture.
This is a fantastic opportunity not only to help the institute shape this tool but also to experience an in-depth first look at its potential.
I have also proposed to the ICO that we run a consultative workshop in London, and have approached the UK Biometrics Commissioner as well as my contacts in the Scottish government who are working to establish the office of a Scottish Biometrics Commissioner. We will also use the framework in our discussions with privacy regulators to help shape what we need to do to ensure the responsible and ethical use of biometrics, and specifically facial recognition.
NIST’s Face Recognition Vendor Test
If you haven’t read the NIST report yet, I would strongly recommend that anyone working in biometrics, particularly policymakers and software developers, make themselves familiar with its contents. It contains not only NIST’s new findings – including that the majority of face recognition algorithms exhibit demographic differentials – but also really useful explanations of where demographic effects really matter and how they might be mitigated.
One of our members from a financial institution asked me whether the focus of the NIST report is strictly on server-based face recognition or whether the discussion also covers device-based biometrics. The data used in the tests was server-side collected data such as mugshots and visa application photos. And when asked whether there was a use case where financial services companies should worry more about demographic effects, it was noted that the chances of someone falsely matching with you are higher if they are the same gender, of a similar age and of a similar ethnicity or from a similar country.
For me the most important take-away from the report is that the effects of demographic differentials are present to varying degrees in all types of face recognition systems, but the better the algorithm, the less the bias. It is really important to understand what challenges demographic differentials pose for different use cases and with different algorithms. They are not all the same. Know your algorithm!
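To make the “know your algorithm” point concrete, here is a minimal, purely illustrative sketch of how one might compare an algorithm’s false match rate across demographic cohorts at a chosen operating threshold. The cohort labels, scores and threshold below are invented for illustration and do not come from the NIST report:

```python
# Illustrative only: compare an algorithm's false match rate (FMR)
# across demographic cohorts. All data here is hypothetical.

def false_match_rate(impostor_scores, threshold):
    """Fraction of impostor comparisons that wrongly reach the threshold."""
    if not impostor_scores:
        return 0.0
    return sum(s >= threshold for s in impostor_scores) / len(impostor_scores)

# Hypothetical impostor similarity scores (0-1) grouped by cohort.
impostor_scores_by_cohort = {
    "cohort_A": [0.10, 0.22, 0.35, 0.41, 0.18, 0.55],
    "cohort_B": [0.30, 0.48, 0.52, 0.61, 0.27, 0.44],
}

THRESHOLD = 0.50  # operating point chosen for the use case

for cohort, scores in impostor_scores_by_cohort.items():
    fmr = false_match_rate(scores, THRESHOLD)
    print(f"{cohort}: FMR = {fmr:.2f}")
```

If the rates differ markedly between cohorts at the threshold you actually deploy, that particular algorithm and use case combination deserves closer scrutiny – which is exactly the per-algorithm, per-use-case analysis the report encourages.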
We are seeing an increased number of calls to ban facial recognition and in some quarters, the technology seems to have become synonymous with surveillance. But especially in light of the NIST report, isn’t it important to first understand which use cases pose the greatest challenges and which deliver the greatest benefits to health, security and counter-terrorism?
I believe working these scenarios through our framework will help us better understand where the real challenges are and which gaps need addressing. And I thank you in advance for your involvement in helping us test this tool so we can demonstrate to governments and regulators what we can do to help.
United Nations compendium phase two
We’re excited to be able to tell you that the compendium of good practices in counter-terrorism we collaborated on with the United Nations is moving into phase two. The United Nations Counter-Terrorism Committee Executive Directorate (CTED) and the Office of Counter-Terrorism (OCT) have arranged a series of regional workshops to raise awareness of the compendium. These are closed workshops for member states, but I wanted to take this opportunity to remind you that, if you haven’t already read it, the compendium contains some very sound generic information on biometric systems. Although the flavour is counter-terrorism, the general principles remain the same whatever the application. General readers in particular will find the following sections useful:
- Executive Summary
- An Introduction to Biometric Systems and Identity
- Governance and Regulation
- System Risk Management
- International Standards
- Procurement and Resource Management
- Reference Document Listings
Many of our long-standing Australian members will remember Geoff Poulton from the Australian government’s Commonwealth Scientific and Industrial Research Organisation. I received the sad news this weekend that Geoff has passed away after a long struggle with leukaemia. Geoff was our first board director and chairman of the board, and hosted our very first member meeting in Sydney. He was instrumental in helping us reach our first important milestones, including the development and publication of the Biometrics Institute Privacy Code, approved by the Australian Privacy Commissioner, and our subsequent growth overseas. We will forever be grateful for the time and energy Geoff gave the Biometrics Institute, and our thoughts are with his wife Fran and their family.
Calling all women in biometrics
And finally, tomorrow – 11 February – is the International Day of Women and Girls in Science. As we head into our busy events season, please look around your organisation and identify which of your female colleagues you can bring with you to further their knowledge, which you can encourage to approach us for speaker slots and which you can add to your membership. As you will see from this photo of the women who attended and spoke at our Congress in London last year, we have a good number of female supporters. However, we still receive criticism for a perceived lack of diversity. Science and technology struggles to get more girls engaged at school, to attract them into careers and to retain them through to senior positions – a challenge the industry is tackling. Greater diversity is crucial in biometrics as we work hard to reduce inequality and bias in the technology. We thank you for doing your bit to encourage more women into biometrics and to introduce them to the Biometrics Institute!
Are you happy with your membership?
Let us know how we’re doing so we can help you get the most out of your membership. Every two years we ask our members if they think we could do better. This is your chance to shape our future.
We will enter everyone who completes the questionnaire and gives their contact details into a draw for a £50 (or equivalent) Amazon voucher. Please help us by completing this short survey before 17 February 2020.
What’s made the news
Face recognition continues to make the news. We’re canvassing your opinions on whether there is an appetite for a more formal news digest email in the members survey you’ll receive this week, but in the meantime, I thought I’d share with you here some of the articles our expert groups have been looking at. You can find them at the end of this email, but a couple have sparked particular interest among you.
A leaked paper suggesting the European Commission was considering a temporary ban on facial recognition technology saw a flurry of activity in my inbox last month. Two weeks later, an update on the leaked document revealed the idea of a temporary ban on facial recognition technologies in public spaces had been ditched.
The new draft states that any future regulatory framework for artificial intelligence in Europe should have a foundation of trust: “The ecosystem of trust should give citizens the confidence to welcome artificial intelligence and give companies the legal certainty to innovate with artificial intelligence.”
The completed paper on Europe’s artificial intelligence strategy is due to be presented by the EU’s digital tsar, Margrethe Vestager, on 19 February.
I have engaged with some of my key privacy contacts to make sure we are part of this ongoing discussion.
Stories like this show the importance of the institute’s role as an advocate for responsible use, a trusted source of information and engaging with key decision-makers.
This article is very concerning, especially with regard to how they obtained a “database of more than 3 billion pictures that Clearview says it’s scraped off Facebook, Venmo, YouTube and other sites,” which included “over 641 million images of US citizens”.
Are there no privacy laws to protect us from this, with or without the automated recognition aspects of the product? One of our Privacy Expert Group members explained that the GDPR does appear to offer that protection. The photos Clearview hoovered up are considered personal data, and the special processing of them in order to uniquely identify an individual (i.e. generating templates to enable automated facial recognition) makes them sensitive personal data.
I hope you are as excited as I am for what lies ahead of us. By collaborating and sharing knowledge we can shape a future where biometrics are used responsibly and ethically for the greater good.
Some articles from our expert groups’ reading list