Real and potential impact on human rights
To hinge the decision of whether a police force should use an FRT system on its purported “accuracy” is, wilfully or unwittingly, to misunderstand the contextual complexity of this biometric technology and its potentially far-reaching implications for fundamental human rights.
Even if all policing FRT systems were accurate 100 percent of the time, the risks to people’s fundamental human rights would multiply significantly. FRT systems risk stripping people of their anonymity, reducing them to walking licence plates[1] and tilting the power dynamic inherent in police–civilian interactions further towards police.
To ensure this powerful surveillance technology is not misused, abused or normalized, any assessment of the potential human rights risks associated with FRT must consider the entire life cycle of an FRT system: its connection to other surveillance systems; the use, storage and destruction of facial biometric identifiers; and the technical and organizational safeguards in place (or not) to protect those identifiers.
In the interests of democracy and the right to a fair trial, we must know which surveillance tools may lead to our being arrested or accused. As such, consideration must also be given to the mechanisms for transparency and oversight of each component of FRT and of each step in police use of FRT; to the independence and efficacy, or lack thereof, of those mechanisms; and to the question of how to hold accountable ever-changing policing FRT systems and their developers, manufacturers and users.
Fundamental rights at risk with police use of FRT
The following rights are engaged by police use of FRT. The level of impact on these rights depends, as with every individual use case of an FRT system by police (whether retrospective, live or operator-initiated), on many factors. These include the architecture of the system, including its subsystems, the algorithms at its heart and the dataset those algorithms have been trained on; the purpose of the FRT use; whom the technology is used against; and the consequences of its use.
None of the following fundamental rights are absolute, and it is acknowledged that states may interfere with fundamental rights in pursuit of legitimate public interest objectives, provided the interferences are proportionate, are limited to what is necessary in a democratic society and employ the least intrusive means available. A balance must be struck between ensuring that a state has effective and legitimate tools at its disposal to fulfil its functions and protecting fundamental rights and freedoms. It should also be noted that, when we consider the use of FRT by police, different jurisdictions respect and uphold the following rights differently and to differing degrees, giving further context to the human rights implications of FRT use.
i. Right to dignity
As stated in the Universal Declaration of Human Rights, all human beings are born free and equal in dignity and rights.[2] By virtue of being human, all people deserve respect.
A person’s facial biometric data is permanently and irrevocably linked to their identity. The processing of biometric data in itself constitutes, under all circumstances, a serious interference with several rights, including privacy, regardless of the outcome of the identification attempt (correct or incorrect).[3] This intrusiveness is one of the reasons a person’s biometric data is given extra legal protection in certain states.[4]
This serious interference is linked with the right to dignity: the right to be valued, respected and treated ethically, not as a commodity.[5] Should a person feel they are under surveillance, constant or otherwise, as a consequence of FRT, they may change their behaviour in order to avoid locations, social scenarios or cultural events where FRT is deployed, severely impacting their ability to live a dignified life.[6]
As the European Data Protection Board warns, “Human dignity requires that individuals are not treated as mere objects. FRT calculates existential and highly personal characteristics, the facial features, into a machine-readable form with the purpose of using it as a human license plate or ID card, thereby objectifying the face.”
ii. Right to privacy
The right to privacy is an expression of human dignity and is linked to the protection of autonomy and personal identity.[7] It includes a reasonable expectation of privacy while in public and is recognized as a “gateway” right, given that it enables the realization of other rights.
Should a policing FRT system enable members of the public to be identified in public spaces, and/or their movements, interests and associations to be tracked, either in real time or in retrospect, they are at risk of losing not only their privacy rights but also the associated rights built upon privacy. These include the right to protest, to freely associate with others and to express one’s sexuality, religious belief and/or political affiliation.
The manner in which FRT engages the right to privacy can be exacerbated when the system is used live from a distance, or retrospectively, without the person’s consent, active involvement or knowledge. This is a point of critical importance when we consider the use of FRT by police, as some uses of FRT could amount to covert and/or sustained mass surveillance. In addition, FRT watchlists and reference databases, and the scanning of multiple people in real time or in retrospect, unavoidably involve the processing of facial data belonging to people who have nothing to do with the crimes in question but who potentially remain in a virtual line-up,[8] underscoring how seriously FRT interferes with people’s right to privacy.
iii. Rights to freedom of expression, freedom of peaceful assembly and association
The rights to freedom of opinion and expression are indispensable conditions for the full development of the person, provide for the exchange and development of opinions in society and together constitute the foundation stone of every free and democratic society.[9] The fundamental human right of peaceful assembly and association enables and protects individuals’ ability to express themselves individually and collectively.[10]
If a police force uses FRT, in a live, operator-initiated or retrospective manner, to monitor and/or seek to identify people who are freely gathering, attending a protest in a public space or congregating in a place of worship, the technology could reveal the political leanings of individuals and/or their religious beliefs. Even if police were seeking to find a specific individual at a protest whom they had included on a watchlist via a legal mechanism, some uses of FRT could result in every person attending the demonstration – the majority of whom would be of no interest to police – having their biometric data processed, and possibly stored, in real time or in retrospect, without their knowledge, active involvement or consent.
The mere knowledge that police are using FRT in such a way severely affects people’s reasonable expectation of anonymity in a public space, and could have a chilling effect on citizens’ ability or willingness to gather, express their opinions, freely exchange information and engage in behaviour that is necessary and vital for a healthy democracy, thereby impairing political participation.[11] Experts have warned that the long-term chilling effects of FRT on democratic societies have not been fully examined by the courts or the police.[12]

iv. Right to protection of personal data
Everyone has the right to the protection of their personal data. Police use of intrusive technologies, such as FRT, can pose significant risks to data protection rights, as it involves processing sensitive personal data and can lead to discriminatory and biased outcomes for individuals. It also raises questions concerning consent.
The fact that a person is aware they have been photographed or recorded by CCTV in a public space does not mean they have agreed to make their biometric data public, or consented to this data being extracted from an image, processed to create a biometric template and stored or used for identification purposes by police, whether in real time or at some point in the future. Legal safeguards governing the retrieval of biometric data and its use, retention and destruction vary between states, and in some states do not exist at all.
Depending on the use case, FRT’s interference with the right to protection of personal data is heightened considerably if a person is subjected to any manner of “profiling” or automated processing. Such processing might see a person’s facial biometric data used to evaluate certain of their personal aspects, and/or to analyse or erroneously predict aspects of their performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.
v. Right to equality and non-discrimination
Everyone is equal before the law and is entitled without any discrimination to equal protection of the law.[13]
When used to try to identify a person, policing FRT systems built from different algorithms, trained on different datasets and deployed under differing conditions can produce very different error rates. But while error rates vary with the many factors that affect an FRT system’s performance, these errors do not affect all individuals equally.
Studies of FRT have clearly demonstrated racial and gender biases,[14] meaning women and people of colour are more likely to be misidentified by FRT and therefore potentially more likely than light-skinned men to be wrongfully accused by police who use it.
In addition, some authorities are more likely to apply FRT to marginalized communities which are already over-surveilled, over-policed and over-incarcerated,[15] meaning FRT can be used as a tool to create or deepen structural inequalities and discrimination. These biases are deeply compounded by law enforcement authorities who fail even to acknowledge that these technological biases occur, let alone take steps to understand why, or who fail to ask robust questions about how the technology they purchase and deploy performs for the demographics of the people they use it against. Misunderstandings about so-called “accuracy” figures[16] further exacerbate these issues, which threaten people’s right to equal protection against discrimination.
vi. Rights of people with disabilities
The former UN Special Rapporteur on the rights of persons with disabilities has documented that some FRT algorithms have inherent biases against people with disabilities, especially people with conditions such as Down syndrome, achondroplasia, cleft lip or palate, or other conditions that result in facial differences.
He reported that these issues have resulted in some people with disabilities being “judged untrustworthy” because their faces did not conform to the standard programmed into the respective FRT system. The Special Rapporteur has called on states to consider imposing a moratorium on the sale and use of FRT until a full audit of its effects, involving representative organizations of people with disabilities, can be carried out.[17]
vii. Right to presumption of innocence
A fundamental element of fair trials and the rule of law is that every human being is innocent until proven guilty.[18]
When a law enforcement authority uses FRT retrospectively, it runs a biometric template against a reference database of biometric templates; when it uses FRT live, it extracts biometric templates from a live video feed and compares them against a “watchlist” of templates. By their nature, these processes effectively necessitate the generation of multiple false matches. Ultimately, because of these systems, innocent people will always end up on the lists law enforcement uses when seeking to find and/or identify a suspect or person of interest.
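To make those mechanics concrete, the sketch below shows the one-to-many search that both retrospective and live FRT reduce to. It is a minimal illustration only: representing templates as embedding vectors compared by cosine similarity is a common design, but the gallery size, the deliberately tiny 16-dimension templates, the 0.6 threshold and all names are invented for this example and describe no real system.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 16  # deliberately tiny so chance collisions are visible in the output

def make_template() -> np.ndarray:
    """Stand-in for the embedding a face-analysis model would extract."""
    v = rng.normal(size=DIM)
    return v / np.linalg.norm(v)

# Reference database (retrospective use) or watchlist (live use). Every
# enrolled person's biometric data is compared on every single search,
# whatever the outcome of that search.
gallery = {f"person_{i:04d}": make_template() for i in range(1000)}

def search(probe: np.ndarray, threshold: float = 0.6):
    """Compare one probe template against every enrolled template and
    return all candidates whose similarity clears the threshold."""
    scores = ((name, float(probe @ tpl)) for name, tpl in gallery.items())
    hits = [(name, s) for name, s in scores if s >= threshold]
    return sorted(hits, key=lambda h: h[1], reverse=True)

# A probe from someone who is NOT enrolled at all: the search can still
# return above-threshold "candidates", every one of them a false match.
print(search(make_template()))
```

Even this toy version shows why “the system returned a match” is not the same as “the system found the person”: any sufficiently large gallery will occasionally contain templates that clear the threshold by chance.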
When a person is included in a reference database, in some cases simply because they hold a driver’s licence or passport,[19] or on a watchlist, on the basis of a usually unknown or entirely opaque[20] set of criteria established by a law enforcement authority, their facial biometric data will be subject to searches by police seeking either to find or to identify a person of interest. This effectively means that every person in a database or on a watchlist is treated as a potential criminal suspect or person of interest, thereby intruding on their right to the presumption of innocence. It also means that many people who have nothing to do with the specific crime being investigated, for which an FRT search is carried out, could erroneously face potentially grave consequences, as has happened in cases of misidentification.
It is becoming increasingly clear that issues around misidentification are further compounded by how a person’s image is later used in a photographic line-up.[21]
If a person’s biometric template is kept in a reference database or on a watchlist routinely used or accessed by a law enforcement authority, they could be said to be trapped in a perpetual virtual line-up, even when they have no link whatsoever to a specific crime.[22]
viii. Right to effective remedy
Everyone has the right to an effective remedy for acts violating their fundamental rights.[23]
Whether a police force uses FRT in a live, retrospective or operator-initiated manner, to monitor or to seek to identify a person, and even when it does so legally, the system can, and does, get it wrong. This is because the technology is not designed to give police a single positive identification or “match”. Instead, at best, it gives the user a guess list of who the person could be: a list of potential candidates accompanied by similarity scores.
A threshold value is fixed to determine when the software will indicate that a probable match has occurred. If this value is set too low or too high, it can produce a high false positive rate or a high false negative rate respectively. No single threshold setting eliminates all errors. In addition, the length of the returned candidate list depends on the configuration set by the user.
Either way, there is no guarantee that the “true match” will be returned in the list, as the person being sought may not be in the reference database. Nor, if the person is in the database, is there any guarantee that they will be at the top of the returned candidate list; nor that the police officer will choose the “true match” from the list, if such a match even exists.
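A small simulation can make these failure modes tangible. As with the earlier sketch, the 16-dimension templates, the noise level, the population size and all names are assumptions invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
DIM = 16  # toy dimension, as in the previous sketch

def unit(v):
    return v / np.linalg.norm(v)

# Invented enrolled population.
identities = {f"person_{i:03d}": unit(rng.normal(size=DIM)) for i in range(500)}

def noisy_capture(template, noise=0.25):
    """A second image of the same face yields a similar, not identical, template."""
    return unit(template + noise * rng.normal(size=DIM))

def candidate_list(probe, gallery, k=5):
    """Rank every enrolled template by similarity and return the top k."""
    ranked = sorted(((float(probe @ t), name) for name, t in gallery.items()),
                    reverse=True)
    return ranked[:k]

# Case 1: the sought person IS enrolled, but capture noise can push
# lookalike templates above them in the ranking.
probe = noisy_capture(identities["person_042"])
print(candidate_list(probe, identities))

# Case 2: the sought person is NOT enrolled; the system still returns a
# ranked list of top-scoring candidates, all of them innocent people.
stranger = unit(rng.normal(size=DIM))
print(candidate_list(stranger, identities))
```

Depending on the noise draw, “person_042” may or may not top the first list, while the second list is populated entirely by people who were never sought at all.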
The initial stages of a live FRT operation are based entirely on automated processing, whereby the system runs images captured on a live feed against images on a watchlist and creates alerts for potential actions to be taken. As with retrospective FRT systems, whether so-called “matches” are found depends on the selected “similarity setting”. Studies show that the lower the threshold is set, the more matches will be found but the less accurate those so-called “matches” are likely to be, while a higher threshold yields fewer matches with a higher degree of confidence in their accuracy.[24]
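That trade-off can be reproduced with invented numbers. In the sketch below, the two similarity-score distributions are assumptions chosen purely to make the pattern visible, not measurements from any deployed system:

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented similarity scores. In a live deployment the overwhelming majority
# of faces scanned belong to people NOT on the watchlist ("impostor" pairs);
# genuine pairs are rare.
impostor_scores = rng.normal(loc=0.30, scale=0.12, size=10_000)
genuine_scores = rng.normal(loc=0.75, scale=0.12, size=50)

for threshold in (0.4, 0.5, 0.6, 0.7, 0.8):
    false_alerts = int((impostor_scores >= threshold).sum())
    true_alerts = int((genuine_scores >= threshold).sum())
    total = false_alerts + true_alerts
    precision = true_alerts / total if total else float("nan")
    print(f"threshold={threshold:.1f}  alerts={total:5d}  "
          f"false={false_alerts:5d}  missed={50 - true_alerts:2d}  "
          f"precision={precision:.2f}")

# A low threshold floods operators with alerts, almost all of them false; a
# high threshold suppresses false alerts but misses genuine matches. No
# setting eliminates both error types at once.
```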
These inescapable issues with FRT mean that innocent people who happen to look like others can be dragged into a net of police suspicion without just cause. When people are wrongfully subjected to police action on the basis of FRT use and are neither informed that the action was contingent on FRT[25] nor able to litigate to find out exactly how FRT was used against them,[26] serious concerns arise about their right to an effective remedy. Given that this is a relatively new technology in policing, the issue is becoming more apparent as further cases of misidentification and wrongful arrest come to light.[27]
When a person is misidentified as someone accused of a crime, the impact on their life can be detrimental, whether that impact follows immediately after the misidentification, as in a live situation,[28] or after some time, as in a retrospective one.[29]
Crucially, however, concerns about the right to an effective remedy do not pertain to misidentifications alone. When the technology gets it “right” and correctly identifies a person, this does not necessarily mean that the use of the system is legitimate, proportionate, necessary and compliant with human rights.[30] The use of FRT which “correctly” identifies people can still lead to the most egregious human rights abuses[31] and, when combined with other surveillance technology and weapons systems, even war crimes.[32]
ix. Right to fair trial and due process
The real-life impact of FRT use by law enforcement, whether the technology fails[33] or functions,[34] can be devastating. In the USA alone there are, at the time of writing, seven known cases of law enforcement wrongfully arresting and incarcerating people on the basis of error-prone FRT; six of the seven are Black.[35] But it is unknown how many people wrongfully arrested and incarcerated in the USA may have taken plea deals.[36] If law enforcement authorities fail to disclose the use of FRT to people who have been detained, questioned, arrested, charged or prosecuted following an FRT search, this is a clear infringement of their right to due process and, in cases of prosecution, to a fair trial.[37]
The opacity surrounding FRT systems and how they operate can prevent defendants from fully understanding how a case was built against them and, in some cases, deny them the means to challenge the accuracy and reliability of those systems in court. This concern is even greater when a defendant belongs to a demographic that suffers disproportionately from FRT’s bias problems and may need greater technical expertise to challenge the system’s results. All of the above results in an imbalance of access to information, and of power, in the police–civilian dynamic that is incompatible with due process.
Notes
1. Guidelines 05/2022 on the use of facial recognition technology in the area of law enforcement, European Data Protection Board, adopted 26 April 2023, p.15, https://edpb.europa.eu/system/files/2023-05/edpb_guidelines_202304_frtlawenforcement_v2_en.pdf.
2. Article 1, Universal Declaration of Human Rights, https://www.un.org/sites/un2.un.org/files/2021/03/udhr.pdf.
3. Guidelines 05/2022 on the use of facial recognition technology in the area of law enforcement, European Data Protection Board, adopted 26 April 2023, p.5, https://edpb.europa.eu/system/files/2023-05/edpb_guidelines_202304_frtlawenforcement_v2_en.pdf.
4. Article 4(14) of the EU General Data Protection Regulation defines “biometric data” as personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data. Under Article 9 of the GDPR the processing of biometric data is prohibited, save for certain circumstances. https://eur-lex.europa.eu/eli/reg/2016/679/oj. See also Biometric Information Privacy Act, Illinois State Legislature, 2008, https://www.ilga.gov/legislation/ilcs/ilcs3.asp?ActID=3004&ChapterID=57.
5. Guidelines 05/2022 on the use of facial recognition technology in the area of law enforcement, European Data Protection Board, adopted 26 April 2023, p.15, https://edpb.europa.eu/system/files/2023-05/edpb_guidelines_202304_frtlawenforcement_v2_en.pdf.
6. Facial recognition technology: fundamental rights considerations in the context of law enforcement, European Union Agency for Fundamental Rights, 2020, p.20, https://fra.europa.eu/sites/default/files/fra_uploads/fra-2019-facial-recognition-technology-focus-paper-1_en.pdf.
7. A/HRC/55/46: Legal safeguards for personal data protection and privacy in the digital age, Office of the High Commissioner for Human Rights, 18 January 2024, https://www.ohchr.org/en/documents/thematic-reports/ahrc5546-legal-safeguards-personal-data-protection-and-privacy-digital.
8. Garvie, C, Bedoya, A & Frankle, J, The Perpetual Line-Up: Unregulated Police Face Recognition in America, Georgetown Law Center on Privacy and Technology, October 2016, https://www.perpetuallineup.org/risk-framework.
9. UN Human Rights Committee, General comment No. 34, CCPR/C/GC/34, para. 2, https://documents.un.org/doc/undoc/gen/g11/453/31/pdf/g1145331.pdf.
10. UN Human Rights Committee, General comment No. 37, CCPR/C/GC/37, para. 1, https://documents.un.org/doc/undoc/gen/g20/232/15/pdf/g2023215.pdf.
11. Murray, D et al., “The Chilling Effect of Surveillance and Human Rights: Insights from Qualitative Research in Uganda and Zimbabwe”, Journal of Human Rights Practice, Volume 16, Issue 1, February 2024, pp. 397–412, https://doi.org/10.1093/jhuman/huad020.
12. Murray, D, “Police Use of Retrospective Facial Recognition Technology: A Step Change in Surveillance Capability Necessitating an Evolution of the Human Rights Law Framework”, Modern Law Review, December 2023, https://doi.org/10.1111/1468-2230.12862.
13. Article 7, Universal Declaration of Human Rights, https://www.un.org/sites/un2.un.org/files/2021/03/udhr.pdf.
14. Buolamwini, J & Gebru, T, “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification”, Proceedings of the 1st Conference on Fairness, Accountability and Transparency, 2018, http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf. See also Buolamwini, J, Response: Racial and Gender bias in Amazon Rekognition – Commercial AI System for Analyzing Faces, Medium, 25 January 2019, https://medium.com/@Joy.Buolamwini/response-racial-and-gender-bias-in-amazon-rekognition-commercial-ai-system-for-analyzing-faces-a289222eeced. See also Raji, I D & Buolamwini, J, “Actionable Auditing: Investigating the Impact of Publicly Naming Biased Performance Results of Commercial AI Products”, Proceedings of the 2019 AAAI/ACM Conference on AI, Ethics, and Society, https://dl.acm.org/doi/10.1145/3306618.3314244. See also Cook, C, Howard, J, Sirotin, Y, Tipton, J & Vemury, A, “Demographic Effects in Facial Recognition and Their Dependence on Image Acquisition: An Evaluation of Eleven Commercial Systems”, IEEE Transactions on Biometrics, Behavior, and Identity Science, 2019, https://ieeexplore.ieee.org/document/8636231. See also NIST Study Evaluates Effects of Race, Age, Sex on Face Recognition Software, National Institute of Standards and Technology, 19 December 2019, https://www.nist.gov/news-events/news/2019/12/nist-study-evaluates-effects-race-age-sex-face-recognition-software. NIST wrote: “How accurately do face recognition software tools identify people of varied sex, age and racial background? According to a new study by the National Institute of Standards and Technology (NIST), the answer depends on the algorithm at the heart of the system, the application that uses it and the data it’s fed – but the majority of face recognition algorithms exhibit demographic differentials. A differential means that an algorithm’s ability to match two images of the same person varies from one demographic group to another.”
15. Israel/OPT: Israeli authorities are using facial recognition technology to entrench apartheid, Amnesty International, May 2023, https://www.amnesty.org/en/latest/news/2023/05/israel-opt-israeli-authorities-are-using-facial-recognition-technology-to-entrench-apartheid/.
16. Gerchick, M & Cagle, M, When it Comes to Facial Recognition, There is No Such Thing as a Magic Number, American Civil Liberties Union, 7 February 2024, https://www.aclu.org/news/privacy-technology/when-it-comes-to-facial-recognition-there-is-no-such-thing-as-a-magic-number.
17. Report of the Special Rapporteur on the rights of persons with disabilities on Artificial Intelligence and the rights of persons with disabilities, December 2021, https://www.ohchr.org/en/calls-for-input/2021/report-special-rapporteur-rights-persons-disabilities-artificial-intelligence.
18. Article 11, Universal Declaration of Human Rights, https://www.un.org/sites/un2.un.org/files/2021/03/udhr.pdf.
19. Boffey, D, “Police to be able to run face recognition searches on 50m driving licence holders”, The Guardian, 20 December 2023, https://www.theguardian.com/technology/2023/dec/20/police-to-be-able-to-run-face-recognition-searches-on-50m-driving-licence-holders.
20. Big Brother Watch: complaint against private sector facial recognition, AWO Agency, https://www.awo.agency/blog/big-brother-watch-complaint-against-private-sector-facial-recognition/. See also https://www.awo.agency/files/2022-07-25-Facewatch-Coop-ICO-Complaint.pdf.
21. Hill, K, “Facial Recognition Led to Wrongful Arrests. So Detroit Is Making Changes”, New York Times, 29 June 2024, https://www.nytimes.com/2024/06/29/technology/detroit-facial-recognition-false-arrests.html.
22. Garvie, C, Bedoya, A & Frankle, J, The Perpetual Line-Up: Unregulated Police Face Recognition in America, Georgetown Law Center on Privacy and Technology, October 2016, https://www.perpetuallineup.org/.
23. Article 8, Universal Declaration of Human Rights, https://www.un.org/sites/un2.un.org/files/2021/03/udhr.pdf.
24. Fussey, P & Murray, D, Independent Report on the London Metropolitan Police Service’s Trial of Live Facial Recognition Technology, Human Rights Centre, University of Essex, July 2019, p.107, https://repository.essex.ac.uk/24946/1/London-Met-Police-Trial-of-Facial-Recognition-Tech-Report-2.pdf.
25. MacMillan, D et al., “Police seldom disclose use of facial recognition despite false arrests”, Washington Post, 6 October 2024, https://www.washingtonpost.com/business/2024/10/06/police-facial-recognition-secret-false-arrest/.
26. Williams, R, “I Was Wrongfully Arrested Because of Facial Recognition Technology. It Shouldn’t Happen to Anyone Else”, Time, 29 June 2024, https://time.com/6991818/wrongfully-arrested-facial-recognition-technology-essay/.
27. Sanford, A, Artificial Intelligence Is Putting Innocent People at Risk of Being Incarcerated, Innocence Project, 14 February 2024, https://innocenceproject.org/artificial-intelligence-is-putting-innocent-people-at-risk-of-being-incarcerated/.
28. Clayton, J, “I was misidentified as shoplifter by facial recognition tech”, BBC, 26 May 2024, https://www.bbc.com/news/technology-69055945.
29. Bhuiyan, J, “Facial recognition used after Sunglass Hut robbery led to man’s wrongful jailing, says suit”, The Guardian, 23 January 2024, https://www.theguardian.com/technology/2024/jan/22/sunglass-hut-facial-recognition-wrongful-arrest-lawsuit.
30. Glukhin v Russia, App. No. 11519/20, https://hudoc.echr.coe.int/eng?i=001-225655.
31. Gan-Mor, G & Pinchuk, A, In Focus: Facial Recognition Tech Stories and Rights Harms from Around the World: Surveillance in the West Bank/Occupied Palestinian Territories, International Network of Civil Liberties Organizations, January 2021, p.11, https://www.inclo.net/pdf/in-focus-facial-recognition-tech-stories.pdf. See also Israel and Occupied Palestinian Territories: Automated Apartheid: How facial recognition fragments, segregates and controls Palestinians in the OPT, Amnesty International, May 2023, https://www.amnesty.org/en/documents/mde15/6701/2023/en/. See also Frenkel, S, “Israel Deploys Expansive Facial Recognition Program in Gaza”, New York Times, 27 March 2024, https://www.nytimes.com/2024/03/27/technology/israel-facial-recognition-gaza.html.
32. Fatafta, M & Leufer, D, Artificial Genocidal Intelligence: how Israel is automating human rights abuses and war crimes, Access Now, 9 May 2024, https://www.accessnow.org/publication/artificial-genocidal-intelligence-israel-gaza/.
33. Williams, R, I Did Nothing Wrong. I Was Arrested Anyway, American Civil Liberties Union, 15 July 2021, https://www.aclu.org/news/privacy-technology/i-did-nothing-wrong-i-was-arrested-anyway.
34. Glukhin v Russia, App. No. 11519/20, https://hudoc.echr.coe.int/eng?i=001-225655.
35. ACLU calls on Detroit Police Department to end use of faulty facial recognition technology following yet another wrongful arrest, American Civil Liberties Union, 7 August 2023, https://www.aclumich.org/en/press-releases/aclu-calls-detroit-police-department-end-use-faulty-facial-recognition-technology.
36. Press, E, “Does A.I. Lead Police to Ignore Contradictory Evidence?”, The New Yorker, 13 November 2023, https://www.newyorker.com/magazine/2023/11/20/does-a-i-lead-police-to-ignore-contradictory-evidence.
37. MacMillan, D et al., “Police seldom disclose use of facial recognition despite false arrests”, Washington Post, 6 October 2024, https://www.washingtonpost.com/business/2024/10/06/police-facial-recognition-secret-false-arrest/.