ISLAMABAD: Kaspersky, an international cybersecurity company, has warned that companies and consumers must be aware of ‘deepfake’ videos and other social engineering attacks, which will become a serious threat in the future.

Hafeez Rehman, Technical Group Manager at Kaspersky, told this scribe that the company’s research has found ‘deepfake’ creation tools and services available on darknet marketplaces. These services offer generative AI video creation for a variety of purposes, including fraud, blackmail, and stealing confidential data. According to estimates by Kaspersky experts, a ‘deepfake’ video can be purchased for as little as $300 per minute.

He stated that the widespread adoption of Artificial Intelligence (AI) and machine learning technologies in recent years is providing threat actors with sophisticated new tools for their attacks. One of these is ‘deepfakes’: generated human-like speech, or photo and video replicas of people. Kaspersky warned that companies and consumers must be aware that ‘deepfakes’ will likely become more of a concern in the future.

According to the recent Kaspersky Business Digitisation Survey, 51% of employees surveyed in the META region said they could tell a ‘deepfake’ from a real image; however, in a test only 25% could actually distinguish a real image from an AI-generated one. This puts organisations at risk, given that employees are often the primary targets of phishing and other social engineering attacks.

For example, cyber criminals can create a fake video of a CEO requesting a wire transfer or authorising a payment, which can be used to steal corporate funds. They can also create compromising videos or images of individuals and use them to extort money or information.

“Despite the technology for creating high-quality ‘deepfakes’ not being widely available yet, one of the most likely use cases that will come from this is generating voices in real time to impersonate someone. It’s important to remember that ‘deepfakes’ are a threat not only to businesses, but also to individual users: they spread misinformation, are used for scams, or to impersonate someone without consent, and are a growing cyber threat to be protected from,” Hafeez said.

Kaspersky recommended that people and businesses be aware of the key characteristics of ‘deepfake’ videos. A solution such as Kaspersky Threat Intelligence can help keep information security specialists up to date on the most recent developments in the ‘deepfake’ game. Companies should also strengthen the human firewall by ensuring their employees understand what they are seeing, Hafeez added.

Copyright Business Recorder, 2024
