
Study finds self-driving cars less likely to detect dark-skinned people

Published March 5, 2019

A recent study has found that self-driving cars, too, may exhibit racial bias, tending to have more trouble detecting darker skin tones than lighter ones.

Researchers from the Georgia Institute of Technology analyzed the pedestrian-detection systems used by autonomous vehicles and found that these systems had more trouble picking out people with darker skin tones.

According to Mashable, the researchers examined footage from the Berkeley Driving Dataset, which contains video from New York, San Francisco, Berkeley and San Jose. The videos allowed the team to observe how detection systems responded to different types of pedestrians.

The team evaluated eight image-recognition systems commonly used in self-driving vehicles and measured how accurately each detected pedestrians of different skin tones, classified on the Fitzpatrick skin type scale, a standard way of categorizing human skin color.

The researchers reported 'uniformly poorer performance of those systems when detecting pedestrians with Fitzpatrick skin types between 4 and 6', the categories that correspond to darker skin. If a system fails to identify a person as a pedestrian, that person is at far greater risk of being hit, since the computer cannot predict their behavior.

However, darker skin tone alone does not account for the poor performance. Other factors, such as time of day or clothing color, can also produce inaccurate results. Even so, when isolated to skin color, detection accuracy dropped by an average of 5% for pedestrians with darker skin after controlling for time of day and obstructed views, MIT reported.
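The kind of comparison the study describes, splitting detection results by Fitzpatrick group and comparing hit rates, can be sketched in a few lines of Python. This is only an illustration of the idea, not the researchers' actual code; the sample records, the grouping of types 1-3 as "lighter" and 4-6 as "darker", and the function name are assumptions introduced here.

    # Illustrative sketch only, not the study's code: compare pedestrian
    # detection rates across Fitzpatrick skin-type groups.
    from collections import defaultdict

    # Hypothetical records: (fitzpatrick_type, was_detected)
    results = [
        (2, True), (2, True), (3, False), (5, False),
        (5, True), (6, False), (1, True), (4, True),
    ]

    def detection_rate_by_group(records):
        """Return detection rate for types 1-3 ('lighter') vs 4-6 ('darker')."""
        hits = defaultdict(int)
        totals = defaultdict(int)
        for skin_type, detected in records:
            group = "lighter (1-3)" if skin_type <= 3 else "darker (4-6)"
            totals[group] += 1
            hits[group] += int(detected)
        return {group: hits[group] / totals[group] for group in totals}

    print(detection_rate_by_group(results))
    # A persistent gap between the two groups (about 5 percentage points in
    # the study, after controlling for time of day and occlusion) is the
    # kind of disparity the researchers describe.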

Last year, tech firms including Amazon, Microsoft and IBM were also called out over facial recognition technology found to be biased against people with darker skin tones.

Copyright Business Recorder, 2019
