Technology

Study finds self-driving cars less likely to detect dark-skinned people

Published March 5, 2019

A recent study has found that self-driving cars share the racial bias seen in other automated systems: they tend to have more trouble detecting pedestrians with darker skin tones than those with lighter ones.

Researchers from the Georgia Institute of Technology recently analyzed detection data and discovered that the systems autonomous vehicles use to spot pedestrians had more trouble picking out people with darker skin tones.

According to Mashable, the researchers drew on the Berkeley Driving Dataset, which contains footage from New York, San Francisco, Berkeley and San Jose. The videos allowed the team to examine how detection systems responded to different types of pedestrians.


The team looked at eight image recognition systems commonly used in self-driving vehicles and measured how well each detected pedestrians across skin tones, classified on the Fitzpatrick skin type scale, a standard way of categorizing human skin color.

The researchers reported "uniformly poorer performance" of those systems when detecting pedestrians with Fitzpatrick skin types between 4 and 6, the darker end of the scale. If a system fails to identify a person as a pedestrian, that person is at far greater risk of being hit, since the computer cannot predict their behavior.

However, poor performance cannot be attributed to skin tone alone; other factors, including time of day and clothing color, can also produce inaccurate results. Even so, after controlling for time of day and obstructed views, detection accuracy was on average 5% lower for pedestrians with darker skin, MIT Technology Review reported.
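
To illustrate the kind of comparison described above, the sketch below shows how detection rates might be grouped by Fitzpatrick skin type and compared. This is an illustrative example only, not the study's actual code; the function name and sample data are hypothetical.

    from collections import defaultdict

    def detection_rate_by_group(records):
        # records: iterable of (fitzpatrick_type, detected) pairs,
        # where fitzpatrick_type is 1-6 and detected is True/False.
        hits = defaultdict(int)
        totals = defaultdict(int)
        for skin_type, detected in records:
            group = "lighter (1-3)" if skin_type <= 3 else "darker (4-6)"
            totals[group] += 1
            hits[group] += int(detected)
        return {group: hits[group] / totals[group] for group in totals}

    # Hypothetical example data, not taken from the study.
    sample = [(2, True), (3, True), (1, True), (5, False), (6, True), (4, False)]
    rates = detection_rate_by_group(sample)
    print(rates)
    print("gap:", rates["lighter (1-3)"] - rates["darker (4-6)"])

A real evaluation would, as the study did, also record conditions such as time of day and occlusion so the gap can be measured after controlling for them.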

Last year, tech firms including Amazon, Microsoft and IBM were also called out over facial recognition technology found to be biased against people with darker skin tones.

Copyright Business Recorder, 2019
