Artificial intelligence tools can carry many of the same biases that humans do, whether in search engines, dating apps, or even job-hiring software. The problem also exists in systems with far more dire consequences, most notably the criminal justice system.

Facial recognition software is far from perfect, and we’ve seen how it performs worse for dark-skinned individuals. Combine this with law enforcement’s increasing use of face detection software and the intersection becomes grim. Randal Reid spent a week in jail for a crime committed in a state he had never been to. Porcha Woodruff was arrested for carjacking despite being visibly pregnant and in no condition to carjack. Robert Williams, the first documented person to be wrongfully arrested due to facial recognition tech, was accused of stealing thousands of dollars’ worth of watches; at the time of the crime, he was driving home.

Facial recognition occasionally misidentifies people who are white, but it overwhelmingly misidentifies women and people of color.

Confirmation Bias Makes The Facial Recognition Problem Worse

At the heart of this technology issue are some very human problems. For one: confirmation bias. Mitha Nandagopalan, a staff attorney with the Innocence Project, notes how in the case of Porcha, a visibly pregnant woman accused of carjacking, [...]
  • RedFox · 6 months ago

    I don’t understand why misidentifying non-white people continues. Why can’t we make an algorithm that gets people right regardless of ethnicity or race?

    I know most people’s attitude is to stop altogether, but I think we’re past that. I don’t see that happening, so the least we could do is get it right.