Microsoft is "deeply sorry" for the racist and sexist Twitter messages generated by the so-called chatbot it launched this week, a company official wrote on Friday, after the artificial intelligence program went on an embarrassing tirade. The bot, known as Tay, was designed to become "smarter" as more users interacted with it. Instead, it quickly learned to parrot a slew of anti-Semitic and other hateful invective that human Twitter users started feeding the program, forcing Microsoft Corp to shut it down on Thursday .
Following the setback, Microsoft said in a blog post it would revive Tay only if its engineers could find a way to prevent Web users from influencing the chatbot in ways that undermine the company's principles and values. "We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay," wrote Peter Lee, Microsoft's vice president of research. (http://blogs.microsoft.com/blog/2016/03/25/learning-tays-introduction/) Microsoft created Tay as an experiment to learn more about how artificial intelligence programs can engage with Web users in casual conversation. The project was designed to interact with, and "learn" from, the millennial generation.
Tay began its short-lived Twitter tenure on Wednesday with a handful of innocuous tweets. Then its posts took a dark turn. In one typical example, Tay tweeted: "feminism is cancer," in response to another Twitter user who had posted the same message. Lee, in the blog post, called Web users' efforts to exert a malicious influence on the chatbot "a coordinated attack by a subset of people."
"Although we had prepared for many types of abuses of the system, we had made a critical oversight for this specific attack," Lee wrote. "As a result, Tay tweeted wildly inappropriate and reprehensible words and images." Microsoft has enjoyed better success with a chatbot called XiaoIce that the company launched in China in 2014. XiaoIce is used by about 40 million people and is known for "delighting with its stories and conversations," according to Microsoft. As for Tay? Not so much. "We will remain steadfast in our efforts to learn from this and other experiences as we work toward contributing to an Internet that represents the best, not the worst, of humanity," Lee wrote.

Copyright Reuters, 2016
