Apple’s announcement that it would scan encrypted messages for evidence of child sexual abuse has revived debate on online encryption and privacy, raising fears the same technology could be used for government surveillance.

The iPhone maker said its initiative would “help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material.”

The move represents a major shift for Apple, which has until recently resisted efforts to weaken its encryption that prevents third parties from seeing private messages.

Apple argued in a technical paper that the technology developed by cryptographic experts “is secure, and is expressly designed to preserve user privacy.”

The company said it would have only limited access to the violating images, which would be flagged to the National Center for Missing and Exploited Children, a nonprofit organization.

Nonetheless, encryption and privacy specialists warned the tool could be exploited for other purposes, potentially opening a door to mass surveillance.

“This sort of tool can be a boon for finding child pornography in people’s phones. But imagine what it could do in the hands of an authoritarian government?” said a tweet from Matthew Green, a cryptographer at Johns Hopkins University.

Others warned that the move could be a first step toward weakening encryption and opening “back doors” which could be exploited by hackers or governments.

“There’s going to be enormous pressure on Apple from governments around the world to expand this capability to detect other kinds of ‘bad’ content, and significant interest by attackers across the spectrum in finding ways to exploit it,” tweeted Matt Blaze, a Georgetown University computer scientist and cryptography researcher.

Blaze said the implementation is “potentially very risky” because Apple has moved from scanning data on its services to scanning on the phone itself, where it “has potential access to all your local data.”

The new image-monitoring feature is part of a series of tools heading to Apple mobile devices, according to the company.

Apple’s texting app, Messages, will use machine learning to recognize sexually explicit photos and warn children and their parents when such images are received or sent, the company said in the statement.

“When receiving this type of content, the photo will be blurred and the child will be warned,” Apple said.

“Apple’s expanded protection for children is a game changer,” said John Clark, president of the nonprofit NCMEC.

The move follows years of standoffs between technology firms and law enforcement.

Apple notably resisted a legal effort to weaken iPhone encryption to allow authorities to read messages from a suspect in a 2015 attack in San Bernardino, California.

FBI officials have warned that so-called “end-to-end encryption,” in which only the sender and recipient can read messages, can protect criminals, terrorists and pornographers even when authorities have a legal warrant for an investigation.

Facebook, which has faced criticism that its encrypted messaging app facilitates crime, has been studying the use of artificial intelligence to analyze the content of messages without decrypting them, according to a recent report by The Information.

But WhatsApp head Will Cathcart said the popular messaging app would not follow Apple’s approach.

“I think this is the wrong approach and a setback for people’s privacy all over the world,” Cathcart tweeted.

Apple’s system “can scan all the private photos on your phone — even photos you haven’t shared with anyone. That’s not privacy,” he said.

“People have asked if we’ll adopt this system for WhatsApp. The answer is no.”

Backers of encryption argue that authorities already have multiple sources of “digital breadcrumbs” to track nefarious activity, and that any tools to break encryption could be exploited by bad actors.

James Lewis, who heads technology and public policy at the Center for Strategic and International Studies, said Apple’s latest move appears to be a positive step, noting that the company is identifying offending material while avoiding directly turning over data to law enforcement.

But he said it is unlikely to satisfy the concerns of security agencies investigating extremism and other crimes.

“Apple has done a good job of balancing public safety and privacy but it’s not enough for some of the harder security problems,” Lewis said.—AFP
