Posted by
Valentine Nev
FACIAL RECOGNITION SOFTWARE HAS A FLAW IN IDENTIFYING RACES OTHER THAN CAUCASIAN
African Americans and Asians were up to 100 times more likely to be misidentified than Caucasians.
Facial recognition systems exhibit bias when identifying and matching individuals of color, a new federal study shows.
The landmark study casts a harsh light on software that is increasingly being used by law enforcement agencies around the country.
African Americans, Asians up to 100 times more likely to be misidentified
The National Institute of Standards and Technology (NIST) study found African American and Asian people were up to 100 times more likely to be misidentified through facial recognition software than Caucasians, depending on the individual algorithm. Among algorithms developed in the U.S., the American Indian demographic had the highest rates of false positives.
The study also found African American women had the highest rates of false positives for one-to-many matching, which is commonly used by law enforcement to search millions of people in a database to find a suspect. That test used a single FBI database containing 1.6 million domestic mugshots.
"Differentials in false positives in one-to-many matching are particularly important because the consequences could include false accusations," the NIST said in a press release highlighting the results of the study. The NIST noted the results varied from one algorithm to another, saying the "most equitable also rank among the most accurate."
NIST looked at 189 algorithms
NIST conducted the study through its Face Recognition Vendor Test program, in which it evaluates face recognition algorithms from software companies and academic developers on their ability to perform different matching tasks. In this study, it used four collections of photographs amounting to 18.27 million images of 8.49 million individuals. All of the images came from the State Department, the Department of Homeland Security and the FBI.
NIST tested 189 software algorithms submitted by 99 developers, the majority of them businesses. It looked at how well the algorithms matched photos of the same person, a task known as "one-to-one" matching that is commonly used to unlock a smartphone or check a passport. The agency also tested each algorithm's ability to match a person in a photo to images in a database. Known as "one-to-many" matching, it can be used to identify a person of interest.
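To make the distinction concrete, here is a minimal sketch of the two tasks, not NIST's or any vendor's implementation: it assumes faces have already been encoded as fixed-length embedding vectors, and the cosine-similarity scoring and 0.6 threshold are illustrative assumptions only.

```python
# Minimal sketch of one-to-one vs. one-to-many face matching.
# Assumption: faces are pre-encoded as 128-d embedding vectors; the
# similarity measure and threshold below are illustrative, not NIST's.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def one_to_one_match(probe: np.ndarray, claimed: np.ndarray,
                     threshold: float = 0.6) -> bool:
    """Verification: does the probe photo match one claimed identity?
    This is the task behind unlocking a phone or checking a passport."""
    return cosine_similarity(probe, claimed) >= threshold

def one_to_many_match(probe: np.ndarray, gallery: dict,
                      threshold: float = 0.6) -> list:
    """Identification: search an entire gallery (e.g. a mugshot database)
    and return every enrolled identity scoring above the threshold. Each
    hit becomes a candidate for further scrutiny, so a false positive
    here puts the wrong person on that candidate list."""
    return [name for name, emb in gallery.items()
            if cosine_similarity(probe, emb) >= threshold]

# Toy usage: random vectors stand in for real embeddings.
rng = np.random.default_rng(0)
gallery = {f"person_{i}": rng.normal(size=128) for i in range(5)}
probe = gallery["person_3"] + rng.normal(scale=0.1, size=128)  # noisy re-capture
print(one_to_one_match(probe, gallery["person_3"]))  # True: same identity
print(one_to_many_match(probe, gallery))             # expected: ['person_3']
```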
“In a one-to-one search, a false negative might be merely an inconvenience — you can’t get into your phone, but the issue can usually be remediated by a second attempt,” said Patrick Grother, a NIST computer scientist and the report’s primary author. “But a false positive in a one-to-many search puts an incorrect match on a list of candidates that warrant further scrutiny.”
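As a rough illustration of the error rates at stake, the sketch below, again hypothetical rather than drawn from the study, tallies a per-group false positive rate; differences in this rate across demographic groups are the "differentials" NIST measured.

```python
# Hedged sketch of a per-group false positive rate; the trial data,
# group labels, and layout are invented stand-ins, not NIST's data.
from collections import defaultdict

# Each trial: (demographic group, truly the same person?, algorithm said match?)
trials = [
    ("group_a", True,  True),   # correct match
    ("group_a", False, False),  # correct rejection
    ("group_a", False, False),
    ("group_b", True,  False),  # false negative: right person missed
    ("group_b", False, True),   # false positive: wrong person flagged
    ("group_b", False, False),
]

impostor_pairs = defaultdict(int)   # comparisons between different people
false_positives = defaultdict(int)  # impostor pairs wrongly declared a match

for group, same_person, matched in trials:
    if not same_person:
        impostor_pairs[group] += 1
        false_positives[group] += int(matched)

# A "differential" is this rate varying across groups for the same algorithm.
for group in sorted(impostor_pairs):
    rate = false_positives[group] / impostor_pairs[group]
    print(f"{group}: false positive rate = {rate:.2f}")
```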