Popular digital assistants that reply in a woman's voice and are styled as female helpers are reinforcing sexist stereotypes, according to a United Nations report released on Wednesday. The vast majority of assistants such as Apple's Siri, Amazon Alexa and Microsoft's Cortana are designed to be seen as feminine, from their names to their voices and personalities, said the study.
They are programmed to be submissive and servile - including politely responding to insults - meaning they reinforce gender bias and normalise sexist harassment, said researchers from the UN scientific and cultural body UNESCO.
"Siri's submissiveness in the face of gender abuse - and the servility expressed by so many other digital assistants projected as young women - provides a powerful illustration of gender biases coded into technology products," it said.
Apple, Amazon and Microsoft were not immediately available for comment.
A spokeswoman for Microsoft has previously said the company researched voice options for Cortana and found "a female voice best supports our goal of creating a digital assistant".
Voice assistants have quickly become embedded into many people's everyday lives and they now account for nearly one-fifth of all internet searches, said the report, which argued they can have a significant cultural impact.
As voice-powered technology reaches into more communities worldwide, the feminisation of digital assistants may help gender biases to take hold and spread, they added.

Copyright Reuters, 2019