Eyes on the Watchers: Challenging the Rise of Police Facial Recognition

Principles to reduce the human rights harms of FRT

About this Project

Governments worldwide are racing to either introduce or expand their use of policing facial recognition technology (FRT), a tool that can enable mass surveillance, with little regard for the risks involved. In response, INCLO has created these FRT Principles to provide a foundation for understanding the risks associated with FRT and serve as a tool for assessment and advocacy for those seeking to challenge its use.

Our principles are focused on the use of FRT by police for the purpose of identification. We believe they are valuable to civil society, policymakers, legislators, the public, media, courts, and law enforcement.

The principles

1

Law enforcement authorities must not use FRT without a specific legal basis.

2

Mandatory fundamental rights impact assessments.

3

Fundamental rights impact assessment must be independent of vendor assessment.

4

No acquisition or deployment of any new FRT without a guarantee of future independence from the vendor.

5

All versions of all assessments must be made public before the FRT deployment.

6

Obligation of public consultation

7

Authorities must inform the public how probe images are used in an FRT operation.

8

The technical specifications of any FRT system must be made public before deployment.

9

Live FRT is prohibited.

10

Mandatory prior judicial authorization.

11

Authorities must document each retrospective or operator-initiated FRT search.

12

An FRT result alone is not a sufficient basis for questioning, arrest or detention.

13

Mandatory disclosure of the details of the FRT operation applied against individuals.

14

Any FRT misidentification of a person must be reported.

15

Mandatory annual reporting by authorities of misidentifications.

16

An independent FRT oversight body must be established before any deployment of FRT.

17

Independent FRT oversight body must publish annual reports.

18

Impact assessments must be made available to the oversight body before the system is deployed.

What is FRT?

FRT is a form of biometric artificial intelligence that police use to attempt to identify or characterize a person on the basis of their facial features.

The current $5 billion FRT industry is estimated to grow to $50 billion by 2030.

What is FRT used for?

Verification

Identification

Facial attribute classification, estimation and detection

Facial emotion recognition

Steps of FRT

Facial recognition is a very complex process involving various factors. However, we can simplify it to the following four steps.

  1. Face detection: Localization of a face or faces in an image or video and, if any, return of the coordinates of the boxes bounding each of them.
  2. Face alignment: Modification of the face input (such as scaling or cropping), based on its geometric features, to adapt it to a canonical form in order to allow it to be compared against a database or watchlist of facial images.
  3. Face representation: Transformation of the pixels in the image into inputs that are useful for computer comparison. This may be a set of templates or, depending on the technique, features or shapes.
  4. Face matching: Comparison of the input obtained in the previous step against the reference database to assess, with a probability score, the verification, identification or categorization of the face.
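The face-matching step can be sketched in code. This is a minimal illustration only, assuming faces have already been reduced to numeric embedding vectors (the 'templates' mentioned above) and using cosine similarity with an arbitrary threshold; real police systems rely on proprietary models and scoring methods that are rarely disclosed.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def match_probe(probe, database, threshold=0.9):
    """Return (name, score) pairs from the reference database whose similarity
    to the probe embedding meets the threshold, best match first. Note the
    result is a probability-like score, not a certain identification."""
    scores = [(name, cosine_similarity(probe, emb)) for name, emb in database.items()]
    candidates = [(n, s) for n, s in scores if s >= threshold]
    return sorted(candidates, key=lambda t: t[1], reverse=True)

# Toy 3-dimensional embeddings; real systems use hundreds of dimensions.
database = {
    "person_a": [0.9, 0.1, 0.2],
    "person_b": [0.1, 0.8, 0.3],
}
probe = [0.88, 0.15, 0.25]
print(match_probe(probe, database))  # person_a scores highest
```

The threshold is the critical policy lever: lowering it returns more candidate matches but also more false positives, which is why a match score alone should never be treated as an identification.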


Types of facial recognition technology

For the purposes of these principles, we refer to live facial recognition, retrospective facial recognition and operator-initiated facial recognition.

The main inputs of police FRT systems are images of unidentified people's faces, which serve as 'probe images'. Probe images can come from a variety of sources, such as social media, CCTV or bodycam footage. Their quality varies with the pose of the person in the image and with the lighting, angle, and pixelation of the image. All of these factors affect the reliability of an FRT search.

Our principles do not promote the use of policing FRT, but rather map existing minimum accountability and harm-mitigation standards. They serve as a tool to build consensus around the significant problems posed by FRT and the need for significant restrictions and bans.

FRT systems have the ability to strip people of their anonymity, reducing them to walking licence plates. This inevitably tilts the balance of powers in police–civilian interactions further towards police.

Why is FRT unreliable and discriminatory?

Accuracy figures put forward by FRT advocates are often based on pristine laboratory conditions that do not reflect the real-life conditions in which this technology is used. In addition, scientific studies have demonstrated racial and gender biases in FRT, meaning women and people of colour are more likely to be misidentified than light-skinned men. But even if all policing FRT systems were accurate 100 percent of the time, the risks to people’s fundamental human rights would multiply significantly.
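The gap between laboratory accuracy and street-level reliability can be illustrated with simple base-rate arithmetic. The figures below (crowd size, false positive rate) are assumptions chosen for illustration, not numbers from this report:

```python
def expected_false_matches(crowd_size, fraction_not_on_watchlist, false_positive_rate):
    """Expected number of innocent people wrongly flagged when scanning a crowd."""
    innocent = crowd_size * fraction_not_on_watchlist
    return innocent * false_positive_rate

# Illustrative assumptions: a crowd of 10,000 people, none of whom are on the
# watchlist, scanned by a system with a 1% false positive rate.
flagged = expected_false_matches(10_000, 1.0, 0.01)
print(flagged)  # 100.0 innocent people wrongly flagged
```

Even a seemingly small error rate, applied to thousands of faces, produces a steady stream of misidentifications, each one a potential wrongful stop or arrest.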

Significant transparency issues with policing FRT include the public not being informed that authorities in their jurisdiction are using the technology, and the non-disclosure of information about the use of FRT to individuals detained, questioned, arrested, charged, or prosecuted based on the results of an FRT search.

Risks and harms

FRT use by police and law enforcement authorities impacts human rights on many different levels. There are at least nine rights that are infringed by the use of FRT.

Impact assessment

Below are questions for policymakers, law enforcement authorities and all other stakeholders involved in the acquisition and deployment of FRT to consider as a bare minimum for harm reduction.

According to international human rights law, our rights can only be restricted or limited as long as the restriction is provided by law; is not arbitrary; pursues a legitimate aim; is strictly necessary in a democratic society; and is proportional to the legitimate aim.

There are several steps involved in the use of FRT, and each stage presents risks of error and misidentification, as well as of discriminatory, disproportionate and unnecessary surveillance. Given that FRT has clearly demonstrated racial and gender biases, these biases must always be kept in mind when analyzing each of the following considerations.

Humans do not always redress FRT errors. Michael Oliver has a face tattoo. He was wrongfully arrested and detained for almost three days in Detroit after an FRT search returned him as a suspect and an eyewitness picked him out of a photo line-up, all despite the photo of the suspect displaying no face tattoo.

While the US Federal Bureau of Investigation (FBI) has carried out tens of thousands of FRT searches over recent years, just 5 percent of its 200 agents who use the technology have taken the FBI’s own course on how to use it. This is not an isolated case.

The very communities disproportionately affected by error-prone technology are not consulted in a transparent manner about the tech, how it works and how it impacts the criminal justice system and people’s lives and fundamental rights. Here are some questions to consider.

In the UK, when live FRT is being used the police are supposed to alert the public to its use. However, this often happens on social media, which is not a sufficient way of alerting the public because not everyone is on these platforms.

Legal defence teams often face barriers in accessing any information about how FRT systems work, their propensity for error or bias and even the name of the system used against their client and that led to the arrest.

How to use these principles

In the face of the ever-expanding use of biometric recognition technologies by police forces across the world, the aim of these principles is both to help reduce the harms of FRT on our personal rights and to empower civil society and the general population.

We hope this research can serve to inform policy makers and voices opposed to the deployment of FRT. Specifically, we believe the principles can be used in the following four ways.