Technology

Study finds self-driving cars less likely to detect dark-skinned people

Published March 5, 2019

A recent study has found that self-driving cars exhibit racial bias too, tending to have more trouble detecting pedestrians with darker skin tones than those with lighter ones.

Researchers from the Georgia Institute of Technology analyzed the pedestrian-detection systems used by autonomous vehicles and found that they had trouble picking out people with darker skin tones.

According to Mashable, the researchers drew on the Berkeley Driving Dataset, which contains footage from New York, San Francisco, Berkeley and San Jose. Through these videos, the team was able to examine how detection systems responded to different kinds of pedestrians.


The team looked at eight image-recognition systems generally used in self-driving vehicles and measured how well each detected pedestrians of different skin tones, as classified on the Fitzpatrick skin type scale, a standard way of categorizing human skin color.

The researchers reported ‘uniformly poorer performance of those systems when detecting pedestrians with Fitzpatrick skin types between 4 and 6’, the categories that correspond to darker skin. If a system fails to identify a person as a pedestrian, that person is at a much greater risk of being hit, since the computer cannot predict their behavior.

However, darker skin tone alone does not account for the poor performance; other factors, including time of day and clothing color, can also lead to inaccurate results. Even so, the accuracy rate dropped by an average of 5% for pedestrians with darker skin even after controlling for time of day and obstructed views, MIT reported.
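
As a rough illustration of the kind of comparison the study describes, the sketch below groups hypothetical detection results by Fitzpatrick skin type and computes the accuracy gap between lighter (types 1 to 3) and darker (types 4 to 6) groups. The data structures and numbers here are made up for illustration; this is not the researchers' code or dataset.

```python
# Hypothetical sketch, not the study's actual code or data: compare pedestrian
# detection rates between lighter (Fitzpatrick 1-3) and darker (Fitzpatrick 4-6)
# skin-type groups, the comparison described in the article.
from dataclasses import dataclass

@dataclass
class Annotation:
    fitzpatrick: int  # Fitzpatrick skin type, 1 (lightest) to 6 (darkest)
    detected: bool    # whether the detector found this pedestrian

def detection_rate(samples):
    """Fraction of annotated pedestrians the detector successfully found."""
    return sum(s.detected for s in samples) / len(samples) if samples else float("nan")

def accuracy_gap(annotations):
    """Detection-rate difference between lighter and darker skin-type groups."""
    lighter = [a for a in annotations if a.fitzpatrick <= 3]
    darker = [a for a in annotations if a.fitzpatrick >= 4]
    return detection_rate(lighter) - detection_rate(darker)

# Made-up example: 95% detection for lighter skin vs 90% for darker skin,
# i.e. roughly the 5-point gap the article mentions.
data = ([Annotation(2, True)] * 95 + [Annotation(2, False)] * 5
        + [Annotation(5, True)] * 90 + [Annotation(5, False)] * 10)
print(f"accuracy gap: {accuracy_gap(data):.1%}")  # prints: accuracy gap: 5.0%
```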

Last year, tech firms including Amazon, Microsoft and IBM were also called out for facial recognition technology that was found to be biased against people with darker skin tones.

Copyright Business Recorder, 2019
