
AI deepfake attacks will extend beyond videos and audio — Security firms

AI-powered deepfake scams targeting crypto wallets are rising, with experts warning of evolving threats and...

As artificial intelligence-powered deepfake scams become more prevalent, security firms warn that the attack method could extend beyond video and audio.

On Sept. 4, software firm Gen Digital reported that malicious actors using AI-powered deepfakes to defraud crypto holders had ramped up their operations in the second quarter of 2024.

The company said that a scammer group called “CryptoCore” had already stolen over $5 million in crypto using AI deepfakes.

While the amount seems low compared to other attack methods in the crypto space, security professionals believe that AI deepfake attacks can expand further, threatening the safety of digital assets.

Web3 security firm CertiK expects AI-powered deepfake scams to become more sophisticated. A CertiK spokesperson told Cointelegraph that the attack vector could also expand beyond videos and audio recordings in the future.

The spokesperson explained that the attack vector could be used to trick wallets that use facial recognition to give hackers access:

“For instance, if a wallet relies on facial recognition to secure critical information, it must evaluate the robustness of its solution against AI-driven threats.”
The spokesperson said it’s important for crypto community members to become increasingly aware of how this attack works.

Luis Corrons, a security evangelist for cybersecurity company Norton, believes that AI-powered attacks will continue to target crypto holders. Corrons noted that crypto yields significant financial rewards and lower risks for hackers. He said:

“Cryptocurrency transactions are often high in value and can be conducted anonymously, making them a more attractive target for cybercriminals, as successful attacks yield more significant financial rewards and lower risk of detection.”
Furthermore, Corrons said that crypto still lacks regulations, giving cybercriminals fewer legal consequences and more opportunities to attack.

While AI-powered attacks may be a big threat to crypto users, security professionals believe that there are ways for users to protect themselves from this type of threat. According to a CertiK spokesperson, education would be a good place to start.

A CertiK engineer explained that it is important to know the threats, as well as the tools and services available to combat them, and to be wary of unsolicited requests. They said:

“Being skeptical of unsolicited requests for money or personal information is crucial, and enabling multifactor authentication for sensitive accounts can help add an extra layer of protection against such scams.”
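The multifactor authentication the engineer recommends typically rests on one-time codes generated from a shared secret. As an illustration of the kind of “extra layer” described in the quote, here is a minimal sketch of the standard TOTP algorithm (RFC 6238, built on HOTP from RFC 4226) using only the Python standard library; the secret shown is a placeholder, not a real key:

```python
# Minimal TOTP sketch (RFC 6238). The secret below is a placeholder
# used only for illustration — real deployments use a per-user random key.
import hashlib
import hmac
import struct
import time


def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    # HMAC-SHA1 over the 8-byte big-endian counter (RFC 4226)
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation: low 4 bits of last byte
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


def totp(secret: bytes, step: int = 30, digits: int = 6) -> str:
    # Time-based counter: number of 30-second steps since the Unix epoch
    return hotp(secret, int(time.time()) // step, digits)
```

Because the code is derived from the current time and a secret the attacker does not hold, a deepfaked voice or face alone cannot reproduce it — which is why the quote pairs skepticism with enabling MFA.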
Meanwhile, Corrons believes there are “red flags” that users can try to spot to avoid AI deepfake scams. These include unnatural eye movements, facial expressions and body movements.

Furthermore, a lack of emotion could also be a big tell. “You also can spot facial morphing or image stitches if someone’s face doesn’t seem to exhibit the emotion that should go along with what they’re supposedly saying,” Corrons explained.

Apart from these, the executive said that awkward body shapes, misalignments, and inconsistencies in the audio should help users determine whether they’re looking at an AI deepfake or not.
