Study finds self-driving cars less likely to detect dark-skinned people
A recent study suggests self-driving cars exhibit racial bias as well: their detection systems tend to have more trouble spotting darker skin tones than lighter ones.
Researchers from the Georgia Institute of Technology analyzed the data and found that the systems autonomous vehicles use to detect pedestrians had trouble picking out people with darker skin tones.
According to Mashable, the researchers examined footage from the Berkeley Driving Dataset, which includes video from New York, San Francisco, Berkeley and San Jose. Using these videos, the team assessed how detection systems responded to different pedestrians.
The team evaluated eight image recognition systems commonly used in self-driving vehicles and measured how well each detected pedestrians of different skin tones, classified using the Fitzpatrick skin type scale, a standard way of categorizing human skin color.
The researchers found 'uniformly poorer performance of those systems when detecting pedestrians with Fitzpatrick skin types between 4 and 6', the darker skin types. If a system fails to identify a person as a pedestrian, that person is at much greater risk of being hit, since the computer cannot predict their behavior.
However, darker skin tone cannot be blamed for poor performance on its own. Other factors, including time of day and clothing color, can also lead to inaccurate results. Still, the accuracy rate dropped by an average of 5% for pedestrians with darker skin even after controlling for time of day and obstructed view, MIT reported.
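To illustrate the kind of comparison the study describes, here is a minimal sketch of how per-group detection rates and the gap between them might be computed. All data and function names here are hypothetical for illustration; this is not the researchers' actual code or data.

```python
# Hypothetical sketch of a per-group detection-rate comparison.
# Each record is (fitzpatrick_type, was_detected): skin type 1-6 on the
# Fitzpatrick scale, and whether the system detected the pedestrian.

def detection_rate_by_group(records):
    """Return detection rates for lighter (Fitzpatrick 1-3)
    and darker (Fitzpatrick 4-6) skin-type groups."""
    stats = {"light": [0, 0], "dark": [0, 0]}  # [detected, total]
    for skin_type, detected in records:
        group = "light" if skin_type <= 3 else "dark"
        stats[group][0] += int(detected)
        stats[group][1] += 1
    return {g: det / tot for g, (det, tot) in stats.items() if tot}

# Toy data (invented numbers) mirroring the reported pattern: darker-skinned
# pedestrians are detected less often than lighter-skinned ones.
records = ([(2, True)] * 95 + [(2, False)] * 5 +
           [(5, True)] * 90 + [(5, False)] * 10)
rates = detection_rate_by_group(records)
gap = rates["light"] - rates["dark"]  # a roughly 5-point accuracy gap
```

The study's analysis was far more involved (controlling for time of day, occlusion and clothing), but the core disparity it reports is a gap of this form between the two groups' detection rates.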
Last year, tech firms including Amazon, Microsoft and IBM were similarly called out for facial recognition technology found to be biased against people with darker skin tones.