Facial Recognition Has Its Eye on the U.K.

In the United Kingdom, there is an eye in the sky surveilling people on the street, and soon it may know their names. Human rights organizations such as Big Brother Watch and Liberty, as well as British parliamentarians, are challenging what appears to be a silent rollout of facial surveillance across the United Kingdom. For Americans concerned about a developing Big Brother at home, recent judicial and regulatory developments on facial recognition technology in the U.K. may provide a glimpse into a potential future.

As the U.S. government and the U.S. public consider the potential future use and regulation of facial surveillance, the debate in the U.K. can help to inform the U.S. discussion, particularly in terms of how law enforcement may use, and can abuse, the technology. It can also offer a window into the types of legal arguments (albeit in the British context) that might be used to challenge police usage and preview potential models for the regulation of facial surveillance.

State-operated surveillance is hardly a novel phenomenon in the U.K. The first closed-circuit TV (CCTV) system in the United Kingdom was set up in 1953 in London, for the Queen's coronation. By the 1960s, permanent CCTV began to cover certain London streets. Since then, the reach of CCTV surveillance has expanded in sporadic bursts, with many cameras installed in response to the 1990s IRA attacks and then again after 9/11 and the London Underground bombing. Now, there are more than 6 million CCTV cameras in the United Kingdom, more per citizen than in any country except China.

The British government argues that CCTV serves four purposes: the detection of crime and emergency incidents, the recording of events for investigations and evidence, direct surveillance of suspects, and the deterrence of crime. However, critics argue there is little evidence to support the proposition that its use has reduced levels of crime. An internal report by London's Metropolitan Police noted that only one camera out of every 1,000 had been involved in solving a crime.

While CCTV has traditionally consisted of fixed-point, video-recording capabilities, in recent years, new technologies have greatly expanded the capabilities of surveillance. Automatic license plate readers, police body cameras and drone surveillance have created a more flexible, mobile and intelligent surveillance apparatus in the U.K.

In recent years, however, facial surveillance, or automated facial recognition (AFR), has emerged as one of the most desired surveillance tools for law enforcement. Facial recognition offers a solution to problems that have plagued police use of CCTV. In the past, successful use of CCTV had been limited because the police did not have the systems or staff to review and utilize footage. With facial recognition technology, algorithms can automatically identify and notify police of certain individuals in footage. Police could use this capability to aid traditional policing, like identifying an individual before or after an arrest or tracking the historical location of a criminal suspect. But police can also use facial recognition for more novel surveillance tactics, like real-time observation of suspects.
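
To make that capability concrete, the core matching step in such systems is typically a nearest-neighbor comparison between a face detected in live footage and a database of watchlist faces. Below is a minimal, illustrative Python sketch of that logic; the embedding size, threshold, and all names here are assumptions for illustration, as the actual models and thresholds used by U.K. police forces are not public.

```python
from typing import Optional

import numpy as np

# Hypothetical watchlist: each entry maps a person to a face "embedding,"
# a numeric vector produced offline by a face-recognition model.
WATCHLIST = {
    "person_a": np.random.rand(128),
    "person_b": np.random.rand(128),
}

# Illustrative threshold; in real deployments this value tunes the
# trade-off between false positives and missed matches.
MATCH_THRESHOLD = 0.6


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two embeddings, from -1 (opposite) to 1 (identical)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_face(probe: np.ndarray) -> Optional[str]:
    """Return the watchlist identity most similar to the probe face,
    if the similarity clears the threshold; otherwise None (no alert)."""
    best_name, best_score = None, -1.0
    for name, reference in WATCHLIST.items():
        score = cosine_similarity(probe, reference)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= MATCH_THRESHOLD else None


# A camera frame would yield one probe embedding per detected face;
# any non-None result triggers an officer notification.
alert = check_face(np.random.rand(128))
```

Each frame of CCTV footage can yield many probe faces, so even a small per-comparison error rate compounds quickly, which is one reason live deployments produce so many false positives.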

Facial recognition technology has been used by police in the U.K. since 1998, but its effectiveness in controlled environments has increased significantly in the past few years, thanks to the growing availability of labeled facial images from social media and a new generation of computers with greater processing power. That improved recognition ability has led the Metropolitan Police and the South Wales Police to run several tests of real-time use of facial recognition within CCTV, or AFR.

Yet, in practice, facial recognition is a deeply flawed tool for policing. Despite the apparent accuracy of facial recognition in a laboratory setting, police tests in the U.K. appear to indicate that, in a live setting, the technology is anything but accurate. In 2018 and 2019, the civil liberties organization Big Brother Watch submitted a series of freedom of information requests to both the Metropolitan Police and the South Wales Police. By Big Brother Watch's analysis, the Metropolitan Police's use of AFR has a false-positive rate of 98 percent. Out of the 104 times the police system matched a person to an image of a wanted criminal, 102 of the matches identified the wrong person. Only two people were identified correctly: One of the two had been erroneously placed on the wanted criminal list, and the other was on a mental-health-related watchlist.

The information provided by the South Wales Police painted a similarly stark picture of the inaccuracies of police use of facial recognition. The South Wales Police system had a false-positive rate of 91 percent. The system made 2,451 incorrect identifications and only 234 correct ones out of the 2,685 times the system matched a face to a name on the watchlist. On the basis of those false positives, South Wales Police staged interventions for 31 innocent citizens, in which they stopped individuals and asked them to provide proof of their identity.

An independent study undertaken by the University of Essex, commissioned by the Metropolitan Police, paints a slightly rosier picture, if only barely. By the report's accounting of the Metropolitan Police's tests, the system made 42 matches. Across all tests, the facial recognition matches were verifiably correct only eight times, representing 19 percent of all matches. Despite this low accuracy, and the groundless police stops it triggered, the Metropolitan Police characterized these tests as legal and successful in finding wanted offenders and said that they would continue to implement trials. Due to a lack of data on how frequently the Metropolitan Police currently undertake police stops without reasonable suspicion, it is difficult to determine whether AFR increases the rate of suspicionless stops by police. Activists are concerned that, when AFR is fully operationalized, such high rates of false positives will prompt police to undertake more stops and searches of citizens without any reasonable suspicion.
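
As a quick sanity check on the figures above, here is a minimal Python sketch that reproduces the reported false-positive rates, where the rate is the share of system matches that identified the wrong person (the numbers come from the article; the label strings are merely descriptive):

```python
# (matches, false_positives) as reported in the freedom of information
# responses and the University of Essex study cited above.
deployments = {
    "Metropolitan Police (Big Brother Watch)": (104, 102),
    "South Wales Police (Big Brother Watch)": (2685, 2451),
    "Metropolitan Police (Essex study)": (42, 42 - 8),  # 8 verifiably correct
}

for name, (matches, false_positives) in deployments.items():
    rate = false_positives / matches
    print(f"{name}: {rate:.0%} of matches were false positives")

# Output:
# Metropolitan Police (Big Brother Watch): 98% of matches were false positives
# South Wales Police (Big Brother Watch): 91% of matches were false positives
# Metropolitan Police (Essex study): 81% of matches were false positives
```

Note that the Essex study's 19 percent accuracy corresponds to an 81 percent false-positive rate on the same definition.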

Activists also worry that facial surveillance could become an instrument of police abuse. In East London, the Metropolitan Police tested facial surveillance on citizens in a public square by attaching facial recognition-enabled cameras to an unmarked van on the street. Citizens largely passed by the cameras without remark, either not noticing or not caring. One man, however, after seeing Big Brother Watch placards about the covert test, pulled the opening of his sweater over his mouth. As he passed by the cameras, his face partially obscured, officers detained him and began to question him about why he was covering his face. Ultimately, he was released, but not before police photographed his face and fined him $115 for disorderly conduct.

These failures in real-time and investigatory surveillance are all the more concerning due to the lack of clarity regarding which authorities are responsible for oversight. There is no legislation in the U.K. specifically authorizing or regulating the use of AFR. Instead, the regulation of facial recognition relies on a collection of bureaucratic entities tasked with monitoring different aspects of the state surveillance apparatus.

The regulation of surveillance camera systems is controlled largely by the Protection of Freedoms Act (POFA) of 2012, the Regulation of Investigatory Powers Act (RIPA) of 2000 and the Data Protection Act (DPA) of 2018. The surveillance camera commissioner, an independent official appointed by the secretary of state, counsels the secretary and other relevant authorities on proper compliance with the surveillance camera provisions of the POFA. The biometrics commissioner, another independent official appointed by the secretary of state, regulates the use and retention of biometric data by the government, including approving police applications to retain certain biometric data. Meanwhile, the RIPA is administered by the investigatory powers commissioner, an independent appointee of the secretary of state; and the DPA is administered by the information commissioner, an independent official appointed by and reporting directly to Parliament.

Unsurprisingly, in 2016, the surveillance camera commissioner expressed confusion about which commissioner was responsible for oversight of AFR. Most recently, this ambiguity has led to a confrontation between the surveillance camera commissioner and the information commissioner over whether AFR requires the government to issue a new code of conduct to police to regulate how they deploy the technology.

Since Lawfare's last coverage of AFR in the U.K., British civil liberties organizations have launched multiple campaigns to resist the police use of AFR and to press for sufficient assurances of legal and ethical protections of civil liberties.

An important recent High Court case largely ended favorably for the police. With the help of the human rights organization Liberty, Ed Bridges, a former Cardiff city councillor concerned about a facial recognition camera that surveilled him while he shopped for his lunchtime sandwich and attended a peaceful protest, levied a suit challenging the legality of police surveillance. Bridges brought his suit against the South Wales Police, the home secretary, the information commissioner and the surveillance commissioner.

Bridges brought three claims against the parties. First, he alleged that the use of AFR interfered with his privacy rights under Articles 8(1) and 8(2) of the European Convention on Human Rights (ECHR) (these ECHR rights are codified in British law by the U.K. Human Rights Act of 1998), which provide a

right to respect for private and family life, home, and correspondence [without] interference by a public authority except such as is in accordance with law and is necessary ... in the interest of national security, public safety, or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights or freedoms of others.

Second, Bridges alleged that the use of AFR did not comply with the 1998 Data Protection Act, which requires personal data be processed lawfully and fairly, and with the first data protection principle of the 2018 Data Protection Act, which requires law enforcement to acquire consent or determine that the collection of sensitive data is strictly necessary to effect a law enforcement task. The suit alleged that uses of AFR did not comply with requirements under the 2018 act to assess the potential impact on personal data when a type of processing is likely to result in a high risk to the rights and freedoms of individuals.

Third, Bridges's suit alleged that the use of AFR would be likely to disproportionately misidentify, and as a result discriminate against, women and minority individuals. Bridges pointed to studies in the United States that demonstrated that facial recognition algorithms have high error rates for identification of women and ethnic minorities because of a lack of diversity in training data. Bridges argued that facial recognition algorithms in the U.K. are likely similarly biased and would likewise disproportionately misidentify women and ethnic minorities, in violation of the public sector equality duty (Section 149) of the Equality Act 2010.

In its September 2019 judgment, however, the High Court did not find any such privacy violation. First, the court recognized that facial surveillance was not a superficial search and thus engaged Article 8(1) privacy rights, but found that AFR was sufficiently authorized and regulated by police internal policies and existing common law and legislation (including the 2018 DPA and the Surveillance Camera Code of Practice, issued pursuant to POFA Section 33). The court also held that AFR's interference with citizens' privacy rights was sufficiently justified by the objective of identifying people of interest to the South Wales Police, that its use was rationally connected to that objective, that a less intrusive measure could not be substituted, and that its use fairly balanced the rights of the individual and the interests of the community. Second, after analyzing the interaction of AFR with Article 8 rights, the court found that the personal data captured through AFR was being processed lawfully and fairly under the 1998 DPA. The court also held that the use of AFR was strictly necessary for identifying individuals on a watchlist and necessary for the common law duty of preventing and detecting crime. The court held that there was not sufficient evidence to suggest that the AFR tool demonstrated any discrimination or bias and that the South Wales Police had sufficiently complied with its Equality Act requirements. While the court suggested that internal policy guidelines for using AFR were likely not sufficient to ensure sensitive data processing compliant with the 2018 DPA, it did not hold that the guidelines failed to meet the compliance document requirements in Section 42(2) of the 2018 DPA. Instead, it recommended that the South Wales Police reconsider their guidelines with direction from the information commissioner. On the basis of these analyses, Bridges's challenge was dismissed on all grounds.

On the surface, the court appeared to address many of the legal grounds for challenging facial surveillance, suggesting that U.K. police can begin to adopt facial surveillance without fear of legal reproach. However, the court's opinion is concerned primarily with the specific kind and use of AFR that Bridges challenged. The court specified that the surveillance against Bridges was minimally intrusive because it was used for only a limited time, covered a limited space, and was engaged for targeted identification. While the case could provide a valuable precedent for police departments that plan to use AFR in a similar way, the decision does not appear to authorize the form of dragnet surveillance that critics fear pervasive AFR can provide, as the court held only that the described minimally intrusive usage of AFR to date has complied with the Human Rights Act and data protection requirements. Bridges has been granted leave to appeal the judgment.

Shortly after the court's ruling, the information commissioner declared her disagreement with the High Court's decision and staked out a harsh stance against unchecked use of AFR. In May 2018, her office opened an investigation into the use of AFR by the Metropolitan Police and the South Wales Police. Her office concluded that the government should introduce a binding statutory code of practice to guide when and how AFR will be deployed. Subsequent to the Bridges decision, the information commissioner also issued a nonbinding advisory opinion on AFR that clearly disagrees with the High Court judgment and aims to mitigate its impact, specifically stating to police departments that the High Court judgement "should not be seen as a blanket authorization for police forces to use [AFR] systems in all circumstances. When [AFR] is used, my opinion should be followed." At present, the Office of the Information Commissioner has stated that it is coordinating with the Home Office, the investigatory powers commissioner, the biometrics commissioner, the surveillance camera commissioner, and the police on developing an AFR code of conduct.

Against this contentious debate within government, Big Brother Watch has forged ahead with its resistance campaign. On June 13, 2018, Big Brother Watch and Baroness Jenny Jones sent pre-action letters to the Metropolitan Police and the home secretary. In the letters, Jones expressed her concerns that the police use of AFR could identify and thus interfere with confidential meetings with whistleblowers and campaigners with whom she meets regularly as part of her Parliamentary duties and that she would need to modify her conduct to avoid meeting certain individuals in an area where AFR would or might be used. She also expressed concern about the sources of images used to construct AFR watchlists, including whether images are sourced from police protest surveillance, the internet or social media.

Big Brother Watch and Jones argue that the use of AFR violates Articles 8, 10 and 11 of the ECHR. Article 8, as mentioned earlier, enumerates certain privacy rights. Articles 10 and 11 declare the individual rights to freedom of expression and freedom of peaceful assembly and association, which mirror the First Amendment to the U.S. Constitution. The letters allege that use of AFR interferes with these rights by retaining sensitive biometric data and location information for an indeterminate period of time (police reports have varied on that subject).

In July 2019, the Metropolitan Police announced that it had completed its trials of AFR and was considering potential implementation. Big Brother Watch chose to stay its challenge, to see if the Metropolitan Police would voluntarily end or restrict its usage. The Metropolitan Police has now chosen to begin operational usage of AFR, targeted at serious crime like "serious violence, gun and knife crime, [and] child sexual exploitation." In response, Big Brother Watch declared that it will continue its legal challenge. It is also likely to continue its work with Lord Clement-Jones, chair of the House of Lords Artificial Intelligence Committee, to propose a bill that would place a moratorium on the use of AFR in public places and require the secretary of state to undertake a review of the use of AFR in public places.

The state of facial surveillance in the United Kingdom can give Americans a glimpse of how AFR might be implemented and used in the United States. The adjudication of the Big Brother Watch challenge, much like the Bridges decision, might provide some new perspective on what kinds of uses of AFR do or don't match up to legal standards for privacy (albeit European ones). At this time, despite the U.S. Supreme Court's comments in Jones and Carpenter that individuals hold some privacy interest in the sum of their movements in public, the Fourth Amendment does not appear to regulate the use of AFR. If U.S. courts refrain from extending Fourth Amendment protections to particular uses of AFR, the regulatory regime that emerges in the United Kingdom could provide helpful inspiration for regulating the use of AFR in the United States.
