{"id":1117792,"date":"2023-09-15T10:11:31","date_gmt":"2023-09-15T14:11:31","guid":{"rendered":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/uncategorized\/facial-recognition-technology-and-false-arrests-should-black-capital-b\/"},"modified":"2023-09-15T10:11:31","modified_gmt":"2023-09-15T14:11:31","slug":"facial-recognition-technology-and-false-arrests-should-black-capital-b","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/fourth-amendment\/facial-recognition-technology-and-false-arrests-should-black-capital-b\/","title":{"rendered":"Facial Recognition Technology and False Arrests: Should Black &#8230; &#8211; Capital B"},"content":{"rendered":"<p> Technological advancements such as location tracking and DNA testing over the past 20 years have strengthened law enforcement's ability to close criminal investigations. But police use of facial recognition software in recent years has resulted in the wrongful arrests of seven Black people, foreshadowing another potential form of racial discrimination in the criminal justice system, critics say. <\/p>\n<p> \"Facial recognition is one of those things that we jumped on too quickly and it kind of just took over before we even knew it,\" said Thaddeus Johnson, an assistant professor of criminology and criminal justice at Georgia State University who, along with his colleague Natasha Johnson, published the only empirical research on facial recognition last October. <\/p>\n<p> Like any form of technology, facial recognition, a form of artificial intelligence, will likely improve as it updates and evolves. Law enforcement's use of it without thorough empirical research, however, may continue to be a threat to Black people and those with darker skin tones, because the technology is unable to accurately distinguish the facial features of different races. 
<\/p>\n<p> Johnson and Safiya Noble, director of the Center on Race and Digital Justice and author of \"Algorithms of Oppression: How Search Engines Reinforce Racism,\" help us understand why they are sounding the alarm about law enforcement's use of facial recognition. <\/p>\n<p> The first facial recognition technology was developed in the 1960s by Woodrow Wilson Bledsoe, Helen Chan Wolf, and Charles Bisson, who had the idea of programming computers to recognize faces. Bledsoe, a mathematician, received funding from the CIA to create the system, Observer reported. They entered 10 photographs of different people, most likely white, into a database and trained the computer to divide a face into features, then compare the distances between those features to identify a specific face. <\/p>\n<p> Over the next 60 years, facial recognition became more sophisticated, identifying skin textures and using 3D images. Now, the software can comb through more than 15 million profiles in the FBI's National DNA Index System, as well as databases created by facial recognition companies that scrub the internet and social media for the faces of billions of people. <\/p>\n<p> Those software upgrades also contributed to today's biometric screenings and fingerprint access to cellphone applications and ATMs. Home security systems, closed-circuit surveillance, computers: any equipment with a built-in camera can have facial recognition software installed. <\/p>\n<p> But when it comes to accurately detecting darker skin tones, the technology hasn't made significant improvements. 
<\/p>\n<p> \"The kinds of people who are often making software, making products coming out of tech corridors around the world, have limited worldviews and lack exposure to lots of different kinds of people, and we see that in every industry,\" Noble said. <\/p>\n<p> Johnson said that facial recognition software's algorithms are more than likely unconsciously biased toward recognizing features familiar to their programmer, who is most likely a white man. When the program is put to work in real life, it is most likely comparing images to databases that contain more Black and brown faces than the white ones it's trained to recognize. <\/p>\n<p> Without cultural education or exposure to different races and ethnicities, Noble said, software programmers will continue to create flawed facial recognition technology that in the long run will do more harm than good. <\/p>\n<p> Facial recognition wasn't tested in the real world until 2001, when federal and local law enforcement in Florida's Tampa Bay area used it during that year's Super Bowl. It's unclear why they decided to experiment at this event rather than at other heavily attended events such as New Year's Eve in Times Square. <\/p>\n<p> As the crowd of 71,921 fans entered the stadium, people stood still for their pictures to be taken. Without their knowledge, the photographs were entered into a database and checked for matches against criminal suspects. During the event, the system detected 19 people with outstanding warrants, but police were not prepared to make those arrests, a detective told The New York Times at the time. <\/p>\n<p> That same year, the city of Tampa accepted a free, one-year trial of the facial recognition software used during the Super Bowl. 
City officials set up face scanners in their downtown entertainment district but did not find them effective, because the program didn't have a database to compare images to and the software couldn't keep up with scanning moving images on a public street, Vice reported. <\/p>\n<p> The flaws in facial recognition technology haven't stopped law enforcement and customer-service industries from continuing to use it. <\/p>\n<p> Airports, businesses, social media, marketing, and cellphone companies use facial recognition technology for a variety of reasons that can be as insignificant as letting users of an app apply filters to photographs. <\/p>\n<p> This year, the Transportation Security Administration announced that it will expand its facial recognition program to more than 400 airports across the country in the coming years. The pilot program, currently in 25 airports, has a 97% effective facial matching algorithm across demographics, including dark skin tones, a TSA press secretary told Fast Company in June. <\/p>\n<p> Clearview AI is a facial recognition company that provides software to law enforcement and government agencies. Its collection of images amounts to a \"mega police lineup,\" critics told Business Insider in April. Clearview AI says it has collected 30 billion images from the internet, Facebook, and other social media, without permission from the social media companies. Facebook and other social media companies have sent cease-and-desist letters to Clearview AI for violating users' privacy. <\/p>\n<p> Critics are also concerned that hackers could maliciously break into facial recognition databases to steal personal information. <\/p>\n<p> But Noble said that, whether we like it or not, everybody's face is in facial recognition databases, with or without their consent, if they are on social media. 
If they have any photos of themselves up anywhere online, including photos they did not post of themselves but that others posted, those are all available to a variety of different kinds of agencies. <\/p>\n<p> Johnson said that facial recognition is a very good tool for generating a lead in solving a crime, and its law enforcement use should be restricted to case detectives and investigators. \"But the problem is we are so blindly trusting AI that generally the police just use it. That's why there needs to be regulations,\" he said. <\/p>\n<p> \"We are not sure if they're calibrating their equipment correctly. We're not sure of the training of the people who are using these technologies. What about officers who have body-worn cameras on that's doing this real-time recording but are also equipped with this mobile facial recognition? [The officers are] basically a walking and talking constitutional violation of sorts,\" Johnson said. <\/p>\n<p> The most well-known case where facial recognition was a leading contributor to accurately identifying suspects was the investigation following the Jan. 6, 2021, insurrection at the U.S. Capitol in Washington, D.C. Federal law enforcement officials were able to identify well over 1,000 mostly white people accused of breaching the U.S. Capitol and assaulting several law enforcement officers, The Washington Post reported. Investigators used facial recognition technology to match the suspects' images from that day to photographs and videos of them found on social media. In some cases, a state's Department of Motor Vehicles database of driver's license photos was used to match suspects. <\/p>\n<p> There are no reports of any of the Jan. 6 suspects filing a wrongful arrest lawsuit over the use of facial recognition. <\/p>\n<p> Legal experts saw the Super Bowl debut of facial recognition technology as a violation of privacy. 
Those Fourth Amendment concerns persist more than 20 years later, especially since no federal regulations have been proposed on how to use the technology without violating individuals' civil rights. <\/p>\n<p> The White House's Office of Science and Technology Policy released a nonbinding \"Blueprint for an AI Bill of Rights\" in October 2022 that provides five principles on the design, use, and deployment of automated systems to protect the American public in the age of artificial intelligence. <\/p>\n<p> But in a December 2022 conversation hosted by the Brookings Center for Technology Innovation, legal experts criticized the White House's initiative for leaving out guidance for law enforcement agencies' use of artificial intelligence, specifically facial recognition. <\/p>\n<p> \"Excluding law enforcement may continue the oversurveillance of certain populations, communities, and individuals under the guise of public safety and national security and will not necessarily reduce the history and manifestation of rampant discrimination against people of color and immigrants. If law enforcement were included in the Blueprint provisions and guidance, it could have offered new guardrails and agency for individuals left with little recourse when misidentified and\/or scrutinized by existing and emerging AI technologies,\" according to commentary from the Brookings Center for Technology Innovation's online event. <\/p>\n<p> The Jan. 6 investigation could imply that facial recognition works well, but if it continues to misidentify Black people or individuals with darker skin tones, it does not, critics say. 
The 2018 Gender Shades study showed that off-the-shelf facial recognition systems used by companies and law enforcement have low efficacy when it comes to detecting Black women's faces, and Black people in general, but are more reliable for white men's faces. <\/p>\n<p> \"There are already practices and policies that are inequitable and result in inequitable outcomes. Why the hell do we think that facial recognition technology will make that better? No, it only exacerbates those things,\" said Johnson, who was previously an acting police captain in Memphis, Tennessee. <\/p>\n<p> Though there aren't any reported cases of a wrongful conviction connected to the use of facial recognition, since 2018 six Black men and a Black woman have spent days in jail after a facial recognition match falsely connected them to felony-level crimes. In the years since, police departments in predominantly Black cities in Louisiana, Maryland, Michigan, and New Jersey have been accused of, and sued for, false arrests stemming from the use of facial recognition technology. <\/p>\n<p> \"The number of people who are ensnared, relative to the millions of people for whom there's no problem, means that the seven people who are falsely accused or imprisoned are just kind of like collateral damage to these companies,\" Noble said. \"And I'm sure they do their calculus on it and say, 'Well, if we have to settle some lawsuits, it's cheaper than redesigning the product.' So we become, our communities become, the collateral damage.\" <\/p>\n<p> Apple Inc. was one of the first business entities hit with a wrongful arrest lawsuit stemming from the use of facial recognition to identify a possible suspect in a string of store robberies throughout the Northeast. 
Ousmane Bah, an 18-year-old college student in New York, sued the tech company for $1 billion after he said he was falsely arrested in 2018. The New York Police Department made the arrest based on a photograph of the possible suspect that Apple turned over to police. The police allegedly agreed that the person in the picture did not look like Bah, Business Insider reported. The lawsuit was \"voluntarily dismissed, with prejudice against the defendant(s) Apple Inc.\" in 2021, according to online federal court records. <\/p>\n<p> One of three cases out of the Detroit Police Department was that of Porcha Woodruff, who was eight months pregnant at the time of her arrest and was questioned for 11 hours about robbery and carjacking accusations she knew nothing about. Woodruff, Robert Williams, and Michael Oliver had similar experiences with Detroit police and are each suing. <\/p>\n<p> \"I think we should have a moratorium on facial recognition technologies until it can be determined that they are safe and used in ways that are safe. There are many people, myself included, who think that facial recognition technologies should be made illegal because they're too consequential in the current ways that they're used. Bans on facial recognition are actually a public safety imperative,\" Noble said. <\/p>\n<p> In March, Democratic Rep. Pramila Jayapal of Washington state and Sen. Edward Markey of Massachusetts reintroduced the Facial Recognition and Biometric Technology Moratorium Act. The bill would place a moratorium on law enforcement use of facial recognition until policymakers create regulations and standards that protect constitutional rights and public safety. This is the third time the bill has been introduced. 
<\/p>\n<p> In the interim, several cities across California and Massachusetts, including San Francisco and Boston, have passed laws that ban or restrict law enforcement from using facial recognition technology. <\/p>\n<p> Virginia and New Orleans reversed their short-lived facial recognition bans. In Virginia, lawmakers used the eight-month ban to evaluate the technology and create policies, including a requirement for corroborating evidence alongside a facial recognition match before pursuing the match as a lead. <\/p>\n<p> Johnson said he is currently working on research that explores the possibility of facial recognition being used to further assist in solving crimes and perhaps put an end to the \"no-snitching\" culture. Violent crimes such as murder, sexual assault, and hate crimes tend to go unreported and unsolved in Black and brown communities because of historic distrust of the criminal justice system and fear of retaliation. <\/p>\n<p> Theoretically, Johnson said, facial recognition technology can help identify witnesses and victims of crime and amplify the work of police departments across the country, if used correctly. <\/p>\n<p> \"It [facial recognition] should be helpful, but we just don't have enough research, and I've cautioned against wildly deploying these things and doing so without even having an inkling of an idea if it has any public safety value, scientifically,\" Johnson said. <\/p>\n<p> Capital B is a nonprofit news organization dedicated to uncovering important stories, like this one, about how Black people experience America today. As more and more important information disappears behind paywalls, it's crucial that we keep our journalism accessible and free for all. But we can't publish pieces like this without your help. If you support our mission, please consider becoming a member by making a tax-deductible donation. Thank you! 
<\/p>\n<p>Read more from the original source:<br \/>\n<a target=\"_blank\" href=\"https:\/\/capitalbnews.org\/facial-recognition-wrongful-arrests\" title=\"Facial Recognition Technology and False Arrests: Should Black ... - Capital B\" rel=\"noopener\">Facial Recognition Technology and False Arrests: Should Black ... - Capital B<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p> Technological advancements such as location tracking and DNA testing over the past 20 years have strengthened law enforcement's ability to close criminal investigations. But police use of facial recognition software in recent years has resulted in the wrongful arrests of seven Black people, foreshadowing another potential form of racial discrimination in the criminal justice system, critics say <a href=\"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/fourth-amendment\/facial-recognition-technology-and-false-arrests-should-black-capital-b\/\">Continue reading <span 
class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[94879],"tags":[],"class_list":["post-1117792","post","type-post","status-publish","format-standard","hentry","category-fourth-amendment"],"_links":{"self":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts\/1117792"}],"collection":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/comments?post=1117792"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts\/1117792\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/media?parent=1117792"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/categories?post=1117792"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/tags?post=1117792"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}