The UK Knew Its Passport AI Was Racist and Used It Anyway

Sufficient

The U.K. government uses facial recognition AI to check travelers' photos when they apply for passports. It works just fine for white people, but like so many algorithms out there, it doesn't work well when presented with dark skin.

Anti-black bias in tech is nothing new, unfortunately: algorithms trained on biased data have often resulted in software that perpetuates prejudice. What's particularly troubling about this passport photo AI is that the British government knew about the problem, according to New Scientist, but decided it was okay to deploy the system anyway.
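The mechanism behind that first sentence is worth spelling out. The toy sketch below (Python with NumPy; every number and variable in it is hypothetical, and it is not the actual passport checker) shows how an acceptance threshold tuned on training data dominated by one group can end up falsely rejecting an underrepresented group at a far higher rate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "photo compliance" scores for two groups of applicants.
# Group A dominates the training data; group B is underrepresented and
# its scores skew lower (a stand-in for how lighting and contrast can
# behave differently across skin tones). None of this reflects the
# actual passport checker -- it only illustrates the general mechanism.
train_a = rng.normal(loc=0.70, scale=0.10, size=9_000)
train_b = rng.normal(loc=0.45, scale=0.10, size=1_000)

# The "AI" learns a cutoff that accepts ~95% of its training data.
# Because group A supplies 90% of that data, the cutoff is tuned
# almost entirely to group A's distribution.
threshold = np.quantile(np.concatenate([train_a, train_b]), 0.05)

# Assume every applicant's photo is actually fine, so any rejection
# below is a false rejection. Measure the rate on fresh samples.
test_a = rng.normal(0.70, 0.10, 10_000)
test_b = rng.normal(0.45, 0.10, 10_000)
print(f"group A falsely rejected: {np.mean(test_a < threshold):.1%}")  # roughly 0.5%
print(f"group B falsely rejected: {np.mean(test_b < threshold):.1%}")  # roughly 45%
```

In this simplified setup the system looks accurate overall, because the overwhelming majority of applicants belong to the well-represented group; the failures are concentrated almost entirely on the minority group, which is exactly what aggregate performance numbers can hide.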

Newly released internal documents revealed that the same racial disparities occurred during testing, resulting in the system telling darker-skinned people that their pictures didn't comply with passport guidelines, New Scientist reports.

"User research was carried out with a wide range of ethnic groups and did identify that people with very light or very dark skin found it difficult to provide an acceptable passport photograph," the documents read. "However, the overall performance was judged sufficient to deploy."

If someone applying for a passport is told that their photo isn't acceptable, they can still circumvent the AI system and submit it anyway, but they'll face warnings that it could interfere with their application.

"Even with the user being able to override the selection, it is still creating a largely racialized disparity in experience between users," University of Washington engineer Os Keyes told New Scientist.

READ MORE: UK launched passport photo checker it knew would fail with dark skin [New Scientist]

More on facial recognition: Google Contractors Tricked Homeless Black People Into Face Scans
