We had an interesting discussion in our Privacy Expert Group (PEG) about the collection of biometric data, which I would like to share.

According to Human Rights Watch (HRW), Chinese authorities in Xinjiang collected biometric data in 2017 from all residents of the region between the ages of 12 and 65 (read the article). DNA samples, fingerprints, iris scans and blood types were taken through a free medical examination programme, ‘Physicals for All’. It is reported that 19 million of the region’s total population of 21 million have been sampled.

There seems to be some confusion over whether the samples were taken voluntarily; one resident felt he had no choice but to allow the tests as he did not wish to be seen as a person of concern, possibly even a ‘political dissident’. Reports suggest that officials were told to convince residents to give their consent and apply pressure if necessary.

Press articles claimed that participants had been diagnosed with illnesses they would not have known about had they not allowed the samples to be taken. However, one resident claims that he wasn’t even told the results of his tests. In addition, participants have apparently not been made aware of how their personal data will be used or told how long it will be kept for. Going forward, it seems that personal data will be collected from anyone in the region wanting a passport or to access public services such as enrolling in schools.

There is also a suggestion that biometric data could be used for surveillance of people of interest; Xinjiang is home to a vast proportion of Uyghurs and other Muslim ethnic minorities who have allegedly already had their freedoms restricted by the government.

If these claims made by Human Rights Watch are true, it could be seen as a gross violation of human rights. However, obtaining biometric data is not necessarily the problem in itself; the key issue here is the difference between responsible use and abuse:

  • Has someone obtained the data rightfully or have they simply bent the rules to suit a different agenda?
  • For what specific and limited purpose were the vast array of biometrics taken?
  • Were individuals informed of the purpose and assured that the information gleaned from them would not be used for anything other than the stated purpose? 
  • Are the biometrics stored? When, where, why and for how long? 
  • Did individuals explicitly consent to onward/future use for unspecified purposes?
  • Were all the biometrics necessary for the purpose they were told they would be used for?

The critical issue is ‘informed consent’ and a clear message about the purposes to which the collected biometrics will be put. If the data is collected for stated health purposes, then that should be its only use. To then use it for social rewards or privileges, or to identify critics of a regime or organisation, would be a significant change of purpose which the Biometrics Institute does not support in its Privacy Guidelines.

I will borrow a metaphor from a colleague who articulated this much better: the car has brought great progress for all of us. Some people will never respect a stop sign or a red light. That is very bad, but it does not mean that the rules of the road, or driving cars, are bad.

The Biometrics Institute’s mission is to promote the responsible use of biometrics, and one important tool we have developed is our Privacy Guidelines, which set out key recommendations on biometrics and privacy. These Guidelines are available to our members here, and we will continue the discussion on this issue at the US Conference (20-21 March 2018), where HRW is planning to join us.

Lead the debate with us on the responsible use of biometrics.