Do you have $25m to spare? Deepfake frauds are here

How Identity-based encryption can help

Yet another finance worker at a multinational company was recently duped into paying out $25 million after a video call with a deepfake chief financial officer (https://edition.cnn.com/2024/02/04/asia/deepfake-cfo-scam-hong-kong-intl-hnk/index.html). Not only was the CFO on the call a deepfake, but so were all the other participants, every one of them known to the finance worker. The worker was initially suspicious, but put those doubts aside because the video call was so convincing. This is not an isolated case, although previous frauds have tended to rely only on audio deepfakes.

With the growth of Artificial Intelligence (AI), impersonation attacks using deepfakes will become more prevalent and more believable. This is reinforced by the latest assessment from the National Cyber Security Centre (NCSC) and the National Crime Agency (NCA) (https://www.ncsc.gov.uk/news/global-ransomware-threat-expected-to-rise-with-ai), which reports that the growth and accessibility of AI will increase both the volume and the credibility of ransomware and other attacks. As AI gathers momentum, the barrier to entry falls, meaning that relatively unskilled threat actors such as novice cyber criminals, hackers-for-hire and hacktivists can carry out more effective attacks.

All this raises the question: what can organisations do to protect themselves from what is fast becoming a ‘wild west’?

Tackling Deepfakes and other Impersonation-based attacks

Eventually people will become better able to spot deepfakes, in the same way that most of us don’t believe every photo we see, knowing that it is all too easy to manipulate images using software such as Photoshop. However, there is an immediate need for organisations to do everything they can to protect themselves and their employees from becoming victims of this newest threat.

Increasingly, authenticating the source of news, content, and all manner of communications is critical. Being able to trust that you are communicating with the genuine person (and not an impostor) will be key to staying safe online and to any type of transaction, whether that is taking financial or legal instructions from colleagues or customers, sharing commercially sensitive information with third parties in the supply chain, or discussing matters of state with trusted advisors and co-workers.

As NCSC CEO Lindy Cameron states in the report, “The emergent use of AI in cyber attacks is evolutionary not revolutionary, meaning that it enhances existing threats like ransomware but does not transform the risk landscape in the near term.”

Identity-based Encryption will help to mitigate the risk

Technology is already available to protect sensitive business communications over voice, instant messaging and video conferencing. Secure communication solutions that use identity-based encryption, such as the NCSC’s MIKEY-SAKKE protocol (https://www.ncsc.gov.uk/information/the-development-of-mikey-sakke), help organisations verify that only approved participants can join a call group, meaning that everyone on a video conference call (for example) has been authenticated. Mass-adoption communication platforms do NOT provide this kind of assurance: very often all that is needed to set up an account is a mobile phone number or email address, and those are easily spoofed, hacked or compromised (for example by SIM-swapping).
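The trust model behind this can be illustrated with a deliberately simplified sketch. In an identity-based scheme such as MIKEY-SAKKE, a key management server (KMS) operated by the organisation holds a master secret and issues each verified user a private key bound to their identity (typically combined with a key period so keys expire). A participant can then prove they hold a key for the identity they claim, rather than merely controlling a phone number. The Python sketch below is an assumption-laden toy: it replaces the real SAKKE encryption and ECCSI signatures with HMAC-based key derivation and a challenge-response, and the class names, identity format and key-period string are purely illustrative.

```python
# Toy illustration of the identity-based keying trust model used by
# protocols such as MIKEY-SAKKE. NOT the real protocol: pairing-based
# SAKKE encryption and ECCSI signatures are replaced with HMAC here.
import hmac
import hashlib
import os
import secrets


class KMS:
    """Key management server: only it can issue identity-bound keys."""

    def __init__(self):
        self.master_secret = os.urandom(32)

    def issue_user_key(self, identity: str, key_period: str) -> bytes:
        # Keys are derived from the verified identity plus a time period,
        # so a compromised key expires automatically.
        material = f"{identity}|{key_period}".encode()
        return hmac.new(self.master_secret, material, hashlib.sha256).digest()


class Participant:
    """A caller provisioned (or not) with an identity-bound key."""

    def __init__(self, identity: str, user_key: bytes):
        self.identity = identity
        self.user_key = user_key

    def respond(self, challenge: bytes) -> bytes:
        # Prove possession of the key bound to the claimed identity.
        return hmac.new(self.user_key, challenge, hashlib.sha256).digest()


def verify_participant(kms: KMS, claimed_identity: str, key_period: str,
                       challenge: bytes, response: bytes) -> bool:
    expected_key = kms.issue_user_key(claimed_identity, key_period)
    expected = hmac.new(expected_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)


# Demo: only users the organisation's KMS has provisioned pass the check.
kms = KMS()
alice_key = kms.issue_user_key("alice@example.org", "2024-02")
alice = Participant("alice@example.org", alice_key)
challenge = secrets.token_bytes(16)

print(verify_participant(kms, "alice@example.org", "2024-02",
                         challenge, alice.respond(challenge)))    # True

impostor = Participant("alice@example.org", os.urandom(32))       # deepfake caller
print(verify_participant(kms, "alice@example.org", "2024-02",
                         challenge, impostor.respond(challenge))) # False
```

In the real protocol the check is done against the KMS’s public parameters using signatures, so verifiers do not need to contact the KMS at call time; the point of the sketch is simply that keys are issued to identities the organisation has verified, which is a guarantee an account tied to a spoofable phone number or email address cannot give.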

The Armour Secure Communications platform is purpose-built and Secure by Design to protect sensitive communications between trusted colleagues, and can be used at higher assurance levels.

Lindy Cameron goes on to say, “As the NCSC does all it can to ensure AI systems are Secure by Design, we urge organisations and individuals to follow our ransomware and cyber security hygiene advice to strengthen their defences and boost their resilience to cyber attacks.”

For more information about the NCSC’s 7 Principles of Secure Communication and how Armour meets them all, read our white paper: Replacing WhatsApp? Advice from NCSC.
