Sandra Leaton Gray: Biometrics in schools – adoption and privacy concerns
Twenty-first-century schools can be complex, difficult places to manage and attend, not least because schools have grown substantially in size over the last couple of generations, creating the need for multiple systems of control and regulation. One of the primary issues for governing bodies and local education authorities is reconciling the need for bureaucratic efficiency with their duty to act in loco parentis – ensuring that the children in their care are where they should be, engaged in appropriate activities at the right time, and fed at appropriate intervals. Within this space, developers have sought to support schools (and to monetise solutions to their management problems) by providing a wide range of digital products to streamline and enhance school processes, for example attendance monitoring, assessment and accounting/audit. It is within this commercial framework that we find biometrics proliferating, and with it, associated privacy concerns.
Biometrics started to be adopted by mainstream schools around fifteen years ago, mainly in countries such as the US, the UK, the Netherlands, Belgium and France, with products aimed at monitoring expenditure on school meals and access to library books, or occasionally at building access control. Early adoption figures for these functions are hard to come by, but it is estimated that by 2014 at least 40% of the UK school population had been fingerprinted for such purposes, registered for palm vein readers or registered for facial recognition systems (Darroch, 2011; Big Brother Watch, 2014). Some countries and jurisdictions banned the use of biometrics in schools early on, for example US states such as Arizona, Illinois, Iowa, Maryland, Michigan and Florida, and countries such as Germany. Elsewhere there were protests: as early as 2005, the French ‘Group Against Biometrics’ went so far as to smash palm readers in schools. The adoption of biometrics in schools has therefore always met with controversy (Andrejevic and Selwyn, 2020).
Privacy concerns are generally twofold. Basic systems used for school meals, library loans and room access raise concerns centred on the misuse of personal data, the likelihood of mission creep, in which data collected for one purpose end up being used for another, and the danger of mosaic identification, in which databases are illicitly cross-referenced to identify individuals. The latter is technically possible but difficult, as so few data points are collected for fingerprint or facial recognition purposes (although this number has increased in recent years). The limited data points are also one reason why biometric systems prove so unreliable in schools: many children’s accounts are confused, with meal credits and debits frequently applied to the wrong transactions, because schools are poorly informed about the importance of setting the system’s False Accept and False Reject Rates properly, both to ensure fairness and to avoid the same 10-20 children being regularly confused (much to their chagrin).
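The trade-off behind those False Accept and False Reject Rates can be illustrated with a minimal Python sketch. All scores below are invented for illustration (real systems calibrate on vendor and population data): lowering the match threshold admits more impostor matches (raising the False Accept Rate), while raising it rejects more genuine matches (raising the False Reject Rate), and the ‘equal error rate’ threshold is the point where the two balance.

```python
# Illustrative sketch of the FAR/FRR threshold trade-off in a biometric
# matcher. Higher similarity score = better match. All numbers are
# invented example data, not drawn from any real school system.

def far_frr(genuine_scores, impostor_scores, threshold):
    """False Accept Rate and False Reject Rate at a given threshold."""
    false_rejects = sum(1 for s in genuine_scores if s < threshold)
    false_accepts = sum(1 for s in impostor_scores if s >= threshold)
    frr = false_rejects / len(genuine_scores)
    far = false_accepts / len(impostor_scores)
    return far, frr

def equal_error_threshold(genuine_scores, impostor_scores, thresholds):
    """Threshold where FAR and FRR are closest (the equal error rate point)."""
    return min(
        thresholds,
        key=lambda t: abs(far_frr(genuine_scores, impostor_scores, t)[0]
                          - far_frr(genuine_scores, impostor_scores, t)[1]),
    )

# Invented similarity scores: 'genuine' compares a child with their own
# enrolled template, 'impostor' compares them with another child's.
genuine = [0.91, 0.62, 0.88, 0.95, 0.70, 0.84, 0.90, 0.58, 0.89, 0.77]
impostor = [0.41, 0.72, 0.55, 0.62, 0.38, 0.66, 0.45, 0.58, 0.80, 0.50]

t = equal_error_threshold(genuine, impostor, [i / 100 for i in range(100)])
far, frr = far_frr(genuine, impostor, t)
print(f"threshold={t:.2f}  FAR={far:.2f}  FRR={frr:.2f}")
```

With overlapping score distributions like these, neither error rate can reach zero; a school (or its supplier) must choose which error it prefers. Setting the threshold below the equal-error point trades false rejections at the lunch queue for more misapplied credits, which is precisely the kind of fairness decision the text suggests schools are rarely trained to make.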
The second form privacy concerns take relates to higher-stakes applications, where a failure of a biometric system is likely to have serious personal consequences, usually disciplinary. This has been greatly amplified by the introduction of behavioural biometrics into educational settings in recent years. One example is virtual proctoring software for remotely monitoring pupils sitting exams at home, which came to public attention during the COVID-19 pandemic. These systems track the most minute eye movements, amongst other things, and use a proprietary algorithm to diagnose ‘cheating’. With their opaque analysis, trained on limited populations, they are routinely perceived as oppressive and unfair by many examinees. Another example of high-stakes biometrics is the use of ‘emotional’ biometrics for behaviour management in the classroom, usually in an experimental capacity, as with a system tested in a Chinese middle school by Hikvision Digital Technology. This system was designed to assess whether pupils were paying attention in class by classifying their facial expressions as happy, sad, angry, surprised or neutral. It met with resistance from parents and was quickly withdrawn. In the academic literature, the introduction of such technologies, focused on the audit of the physical body, has been described by Swauger (2020) as representative of a ‘punitive pedagogy’, with power and control at the centre of the product design, rather than, say, the growth of knowledge or human flourishing.
In the light of recent developments, the biometrics industry now stands at something of a crossroads. It can continue to develop and test products on what is effectively a captive population in schools, with minimal attention paid to the social consequences of their long-term use. Alternatively, it can involve stakeholders much more closely in product development, through collaborative approaches entailing significantly less commercial secrecy, so that proper scrutiny can take place and products can be revised and adapted as appropriate. Stakeholders here should mean pupils, teachers and parents, rather than the finance departments or senior management teams involved in high-level procurement. Training populations also need to be more extensive, in order to take into account diverse cultural and racial backgrounds, as well as special educational needs. This builds on the Biometrics Institute’s policy of appropriate use, providing for an ethical approach to technological tools that are having increasingly profound social consequences.
Andrejevic, M. and Selwyn, N. (2020) Facial recognition technology in schools: critical questions and concerns. Learning, Media and Technology, 45(2), 115-128. DOI: 10.1080/17439884.2020.1686014
Big Brother Watch (2014) Biometrics in schools: the extent of biometrics in English secondary schools and academies. London: Big Brother Watch.
Darroch, A. (2011) Freedom and biometrics in UK schools. Biometric Technology Today, 2011(7), 5-7.
Swauger, S. (2020) Our bodies encoded: algorithmic test proctoring in higher education. Chapter 6 in Stommel, J., Friend, C. and Morris, S. (eds) Critical Digital Pedagogy. Denver: Hybrid Pedagogy Inc.
Sandra Leaton Gray
University College London Institute of Education
Member of the Privacy Expert Group, Biometrics Institute