Deepfake: Everything You Need To Know To Protect Yourself

Images and videos created by AI (Artificial Intelligence) are becoming more convincing by the day. In fact, they are increasingly mistaken for genuine human content.

When it comes to cyber security, this is a worry. 

Considering that 66% of cyber security professionals surveyed in the Global Incident Response Threat Report reported seeing deepfakes used in cyber attacks, it’s clear that the rise of this form of AI is a real threat.

Deepfakes enable cyber criminals to convincingly impersonate celebrities or trusted individuals, fooling organisations into giving away personal information or clicking on dangerous links and downloads so that data can be stolen. It’s a relatively new type of cyber crime, and one that is growing incredibly quickly.

Businesses need to be aware of this new scam in order to avoid falling victim to it. 

What is a deepfake?

Deepfakes use machine learning techniques to craft remarkably lifelike yet entirely fabricated content.

In a deepfake video, a person’s face or body is altered so that they appear to be someone else. AI superimposes one person’s face or voice onto existing video or audio recordings, making the recording appear to feature someone it does not. The results are often incredibly accurate and can look completely legitimate.

Recently, deepfakes have typically been used maliciously or to spread false information, which can be classed as fraud, identity theft or defamation.

Spotting a deepfake can be nearly impossible, which is why learning what deepfakes are and how best to protect yourself against them is key, especially as the technology is developing faster than the security industry can keep up with.

How does it work? 

Making a deepfake video is easier than you may imagine. In fact, the creator only needs some basic graphic design knowledge, as the AI does all the hard work. What seems like a complicated and specialist type of cyber crime is actually very simple for a criminal to achieve.

The AI model used by the cyber criminal is ‘trained’ to recognise detailed traits of the person being impersonated, meaning it can realistically predict exactly how that individual will look, sound and behave.

All a cyber criminal needs in order to create hyper-realistic deepfakes is access to videos and audio recordings of their target. With social media at our fingertips, this material is incredibly easy to get hold of, especially for high-profile individuals. People with a consistent online presence offer criminals an abundance of material, which has fuelled the rise in cyber crime involving videos and voice clips of celebrities.

Types of deepfake scams 

Impersonating celebrities 

From the royal family and reality TV stars to radio hosts and politicians, deepfakes have been used to create fabricated videos for a wide range of public figures. These videos are then used to deceive victims into transferring funds or disclosing sensitive information.

Radio host and DJ Chris Moyles has recently spoken about how his voice has been cloned and used to tell people that they have won a competition. He explained on air that unless a message comes from an ‘official’ account, these voice notes are not from him; they are AI-generated. Anyone who believed the voice was Chris’ would likely be asked to share their bank details or to click on a link containing malicious software.

Damage to reputation

Deepfake technology has been used to create false videos of employees appearing to take part in illegal or unprofessional activities, which are then circulated with the aim of damaging the reputation of your business.

This can lead to the loss of customers and revenue.

Business fraud 

By using deepfake videos and audio to deceive customers and employees, cyber criminals can gain unauthorised access to sensitive information, which can then be used to their advantage. 

In one instance, criminals used AI to clone a director’s voice and carry out a $35 million heist. The cloned voice was so convincing that the bank manager did not suspect any criminal activity when asked to transfer the money.

Social engineering 

Social engineering, the technique of manipulating individuals into revealing confidential information, is also a popular use for deepfake videos.

In July 2023, a video circulated on social media showing financial expert Martin Lewis encouraging people to invest in an app. In the video, Martin claimed that the app was related to Elon Musk and was worth investing in. However, the video was a deepfake, and Martin Lewis had nothing to do with it.

How to protect yourself from deepfake scams 

One of the best ways to protect yourself from falling victim to a deepfake scam is to learn how to recognise the signs of deepfake video and audio.

Here are a few key things to look out for when trying to determine if a video is legitimate or fake:

  • Unnatural eye movements 
  • Video and audio that are not in sync 
  • Inconsistencies in colours and shadows as the person moves
  • Any other unnatural body movements

Hiring a reliable cyber security professional who can run deepfake detection solutions or train staff to recognise the signs of deepfake videos is also good practice.

Be prepared with SupPortal 

At SupPortal, we work with businesses to prioritise their cyber safety.

To read more about our thoughts on AI, read our blog ‘The Advantages and Disadvantages of AI’. 

Whether it’s a security breach from cyber criminals, viruses, malware or even an accidental employee breach, we can help. We provide a range of CSaaS solutions, including Managed Cyber Security subscriptions, Cyber Security Assessments, Cyber Security Awareness Training, Cyber Incident Response and Disaster Recovery, as well as help with achieving Cyber Essentials certification.

If you’d like to talk further about the importance of understanding the threats of AI, chat to us at 02380982218. 
