Blog: Why you can’t rely on an internet search for facts

Every week I have multiple conversations with members about misinformation around biometric technologies. I recently attended a conference where one of the speakers quoted numerous news headlines that had taken research out of context, thereby presenting misleading information about biometric performance and accuracy. Shouldn’t we all be aware by now that you cannot rely on an internet search to get your facts? As this year’s annual industry survey revealed, it’s not only the media who would benefit from a greater understanding of the technology: our members highlighted a lack of knowledge among decision makers as a factor restraining the market. We know many of you are tackling these issues day to day and finding ways to educate better, set the record straight and get the message out that you’re doing the right thing.

In the last six months, we’ve been discussing with our members how the institute can best clarify what biometric technologies can and can’t do: the difference between facial detection and facial recognition, what false rejects and false accepts mean, how to read error rates, and the difference between verification and identification. We’ve also set up a new group – our Future Direction Group – which has this project on its slate. The institute is unique in representing a large variety of stakeholders – 240 member organisations including government agencies around the world, police forces, airports and payment services as well as leading suppliers and academics. We have the experts to guide us towards a balanced viewpoint, but it takes time to bring this together.
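Since that paragraph mentions false rejects, false accepts and how to read error rates, here is a minimal, purely illustrative sketch of how those two rates fall out of a score threshold. All scores and the threshold below are invented for illustration; this is not an institute tool or any vendor’s algorithm.

```python
# Illustrative sketch only: how false accept and false reject rates are
# read off a set of comparison scores. "Genuine" scores compare a person
# to their own enrolment; "impostor" scores compare them to someone else's.

def error_rates(genuine_scores, impostor_scores, threshold):
    """Return (false_reject_rate, false_accept_rate) at a given threshold.

    A genuine comparison scoring below the threshold is a false reject;
    an impostor comparison scoring at or above it is a false accept.
    """
    false_rejects = sum(1 for s in genuine_scores if s < threshold)
    false_accepts = sum(1 for s in impostor_scores if s >= threshold)
    frr = false_rejects / len(genuine_scores)
    far = false_accepts / len(impostor_scores)
    return frr, far

# Verification (1:1): each comparison is against one claimed identity.
genuine = [0.91, 0.84, 0.55, 0.95, 0.88]   # same-person scores (invented)
impostor = [0.12, 0.35, 0.63, 0.08, 0.41]  # different-person scores (invented)

frr, far = error_rates(genuine, impostor, threshold=0.6)
print(frr, far)
```

Raising the threshold trades false accepts for false rejects, which is why a single headline accuracy figure rarely tells the whole story. And in identification (1:N), the same per-comparison trade-off is multiplied across a whole gallery of enrolled people, so small per-comparison error rates can still produce noticeable numbers of errors.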

Facial recognition technology has undergone an “industrial revolution”, according to the US National Institute of Standards and Technology (NIST), with algorithms performing 20 times better than they did four years ago. The technology is far more powerful and can do things we didn’t quite expect. But how do I feel about my local café introducing biometrics to recognise me as I enter and present me with my favourite coffee as I reach the counter? How do I feel about my friends’ children paying for their lunch with just a scan of a finger? And in the last year we have also seen use cases that are really pushing at the limits – or perhaps going beyond them – of what the technology can be used for, which makes many of us feel uncomfortable and fearful of what might come next. As a biometrics community we have to ask: just because we can, should we?

Our 2001 notice in The Australian

The Biometrics Institute was founded 18 years ago this month, and I’ve been with it for over 17 of those years. It was established to promote the responsible and ethical use of biometrics, and our founder Ted Dunstone commented last week, “We are now at a critical turning point where the institute’s voice is vital to the discussion around responsible use” – and he’s not wrong.

We know that good practice is paramount and this year so far we’ve released our Ethical Principles for Biometrics, Updated Privacy Guidelines and a new guiding document to provide clarification around presentation attack detection (PAD) and liveness. At the end of this month, we’re launching an exciting new framework which outlines the stages of the strategic planning, procurement and operation of a biometric system or network and will provide a pathway through the factors that may influence a biometric application. More on that shortly.

Our packed meeting in Singapore

This year we’ve facilitated multiple open, inclusive, off-the-record discussions among our members all over the world. This month, we returned to Singapore for the first time since 2012 and, following its success, will be back to continue the conversation with more of you in May next year. Fifty key government and aviation stakeholders packed the meeting we held to explore good practice perspectives in Asia. We apply the Chatham House Rule to all our meetings: you can discuss outside the room what was said, but not attribute it to any particular speaker. Several delegates told me how surprised they were at how open the meeting was and at the range of viewpoints aired. Those comments made me realise how important an open and inclusive conversation is.

A diverse range of views was aired at our Washington DC meeting, hosted by the Department of Homeland Security

Our September Washington DC meeting came shortly after the banning of facial recognition technology in San Francisco, the theft of traveller photos and license plate images from a US Customs and Border Protection subcontractor, and the letter twenty US House Democrats wrote to the Department of Homeland Security asking for clarification of the laws granting it the authority to use facial recognition on American citizens. Our US members felt that public trust in what they were doing had been challenged. One of our privacy advocates gave a great presentation on what regulation of biometric technology might look like, which sparked a lively discussion. It boiled down to one question: what exactly do campaigners want to ban? Not facial recognition technology itself, but the bad use cases. Transparency is fundamental to improving public trust. If you can’t talk about what you’re doing, or don’t feel comfortable doing so, you probably shouldn’t be doing it. This meeting made me realise how important a collaborative approach is – one where diverse stakeholders hear and respect each other’s views and seek compromise.

Interestingly, in the midst of all this, the US-based Pew Research Center released a survey of public opinion suggesting that a majority (56%) of US adults trust law enforcement agencies to use facial recognition responsibly. At the same time, on the other side of the pond, a national survey in the UK by the Ada Lovelace Institute showed that although Brits accept the use of facial recognition technology when the public benefit is clear, 55% want to see restrictions on its use and almost a third were uncomfortable with the police using facial recognition under any circumstances.

It seems certain that regulation and legislation will be progressively more important for the future development of this technology, and be necessary to further reassure the public. The institute clearly has an important role in helping shape the thinking around what is really needed.

So now we’re counting down the days and weeks to the culmination of our 2019 events – our London Congress at the end of this month and Showcase Australia in November. At the Congress we’ll be distilling everything we’ve discussed over the last few months into 70 thoughtfully chosen expert speakers, six insightful interactive panel discussions and a 20-strong supplier exhibition with specialists on hand to demonstrate what they’re working on. At Showcase Australia we’ll continue the important discussion: over 20 speakers and 100 delegates talking about good practices, what Australians think of biometrics for identification, its role in law enforcement, and public perception of the technology.

We’re expecting more than 300 people at this year’s Congress, which will be our incredible 426th event since our first small meeting in Sydney in 2001. Delegates are converging on London from all over the world: Australia, Canada, Denmark, Estonia, Finland, France, Germany, Mexico, the Netherlands, Norway, Russia, Singapore, Spain and the USA. Speakers include Paul Wiles, the UK’s Biometrics Commissioner; Bob Schukai, senior vice president at Mastercard; John Boyd, assistant director at the US Office of Biometric Identity Management at the Department of Homeland Security; and John Frank, vice president at Microsoft – one of our newest members.

What is clear to me is that an informed, open, inclusive approach, where all views are heard and respected is essential in developing good practices. That’s why at the final session of Congress on 30 October, we’ll be presenting a potentially groundbreaking new tool to our members and asking for their feedback. The aim of the Good Practice Framework is to create a common understanding of biometric systems, the associated terminology and concepts. It will act as a guide through the process of establishing a biometric system, flagging potential vulnerabilities and considering R&D, emerging technologies and governance options.

So many of our members are speaking, moderating, sponsoring and attending Congress and Showcase because they know our events are a real opportunity to help shape the future of biometrics. I cannot wait to be there to facilitate this important dialogue.

Let’s work together to make the world a better and safer place through the responsible and ethical use of this technology.

Isabelle Moeller

Chief Executive

Lead the debate with us on the responsible use of biometrics