This project by the International Network of Civil Liberties Organizations (INCLO) focuses on the use of facial recognition technology (FRT) by police.

Our project builds on the 2021 INCLO report In Focus: Facial Recognition Tech Stories and Rights Harms From Around the World.[1] That report was a compilation of stories demonstrating the then-emerging harmful effects of FRT, which can be used to map, analyse and attempt to establish the identity of a face in a photograph or video, thereby giving users the capability to track or surveil a person in real time or retrospectively without their knowledge or consent.[2] It outlined how FRT, a powerful but flawed technology, impacts citizens’ rights and daily lives across 13 countries in the Americas, Africa, Europe, Asia and Australia. It looked at how FRT poses risks by enabling surveillance that can track individuals during protests, religious events, medical visits and everyday activities, and how FRT can misidentify people – especially people of colour – for crimes they did not commit. Our report made a strong case for an open, public, democratic debate about the use of this technology.

Three years later, the use of this transformative technology is increasingly normalized and ubiquitous. The FRT industry, currently worth around US$5 billion, is estimated to grow to US$50 billion by 2030.[3] A growing number of state and private actors[4] across the globe are moving to introduce[5] or expand[6] the use of FRT. Legislators are passing laws that permit FRT use with inadequate guardrails for fundamental rights,[7] and courts are increasingly tasked with understanding and adjudicating on its risks.[8] This is the context within which we return to this subject with urgency.

In Eyes on the Watchers: Challenging the Rise of Police Facial Recognition, we repeat our call for a thorough re-evaluation and reconsideration of FRT’s application in law enforcement. This call is underscored by FRT’s potential for misuse, the growing interconnectedness of state surveillance systems and the technology’s ongoing impact on individual freedoms. Given the growing deployment of FRT across INCLO member states, we have returned to this subject and developed a set of principles grounded in our documented explanation of the technology, its applications, and its harms and risks, together with human rights standards and legal analysis.

Our principles are focused on the use of FRT by police for the purpose of identification; they provide a foundation for understanding the risks of FRT and serve as a tool for assessment and advocacy. We believe they are valuable to policy makers, civil society, legislators, the public, media, courts and law enforcement.

The risks of FRT in a policing context cannot currently be guarded against through legislation, and the technology cannot be safely deployed; therefore, police should be banned from using FRT. Our principles do not promote the use of FRT in policing; rather, they map existing minimum standards for accountability and harm mitigation. They serve as a tool to build consensus around the serious problems posed by FRT and the need for stringent restrictions and bans.

We advocate for adopting even higher standards tailored to the specific circumstances of each jurisdiction to ensure the protection of human rights and the integrity of law enforcement practices.

Endnotes

1. In Focus: Facial Recognition Tech Stories and Rights Harms From Around the World, International Network of Civil Liberties Organizations, 2021, https://inclo.net/publications/in-focus-facial-recognition-tech-stories-and-rights-harms-from-around-the-world/.
2. Akbari, A, “Facial Recognition Technologies 101: Technical Insights” in The Cambridge Handbook of Facial Recognition in the Modern State, Cambridge University Press, 2024, pp. 29–43, https://www.cambridge.org/core/books/cambridge-handbook-of-facial-recognition-in-the-modern-state/facial-recognition-technologies-101/8B3039F97B11F43B78E52BBEB73E8479.
3. Matulionyte, R & Zalnieriute, M, “Facial Recognition Technology in Context: Technical and Legal Challenges” in The Cambridge Handbook of Facial Recognition in the Modern State, Cambridge University Press, 2024, https://www.cambridge.org/core/books/cambridge-handbook-of-facial-recognition-in-the-modern-state/facial-recognition-technology-in-context/A4F5E2C52EF9CFD27E8F04D0DD60074D.
4. Lyons, J, “NFL to begin using face scanning tech across all of its stadiums”, The Register, 2 August 2024, https://www.theregister.com/2024/08/06/nfl_face_scanning_tech/. See also Baker, T, “Home Office eyeing expansion of ‘Orwellian’ facial recognition”, Sky News, 30 August 2023, https://news.sky.com/story/facial-recognition-technology-labelled-orwellian-as-government-eyes-wider-use-by-police-and-security-agencies-12950942.
5. Desmarais, A, “Ireland’s new police facial recognition bill has ‘fundamental defects’, experts say”, Euronews, 1 March 2024, https://www.euronews.com/next/2024/03/01/irish-police-facial-recognition-bill-has-fundamental-defects-experts-say#:~:text=The%20Irish%20Parliament%20passed%20the,to%20this%20law%20in%20December.
6. Sabbagh, D, “Starmer’s live facial recognition plan would usher in national ID, campaigners say”, The Guardian, 2 August 2024, https://www.theguardian.com/technology/article/2024/aug/02/starmer-live-facial-recognition-plan-would-usher-in-national-id-campaigners-warn.
7. Volpicelli, G, “EU set to allow draconian use of facial recognition tech, say lawmakers”, Politico, 16 January 2024, https://www.politico.eu/article/eu-ai-facial-recognition-tech-act-late-tweaks-attack-civil-rights-key-lawmaker-hahn-warns/.
8. Glukhin v Russia, App. No. 11519/20, https://hudoc.echr.coe.int/eng?i=001-225655; see also “New Jersey Appellate Division One of First Courts in Country to Rule on Constitutional Rights Related to FRTs”, ACLU, June 2023, https://www.aclu-nj.org/en/press-releases/new-jersey-appellate-division-one-first-courts-country-rule-constitutional-rights.