Video Verification and Deepfakes



Technology has introduced greater convenience for consumers around the world. With each new technological advancement, we have benefited from better, faster, and more accurate interactions.

Anyone over the age of 35 will likely remember a time before smartphones, internet banking, or one-click shopping. But each of these has been underpinned by reliable technology.

The pandemic of 2020 has forced many organisations to adopt remote working policies for their employees. But it has also made organisations reconsider how they interact with customers who have traditionally walked into an office or branch to verify their identity.

Video Verification

Enter remote and video verification techniques, which are being adopted around the world, particularly in finance, where KYC (know your customer) checks are essential. Video verification allows customers to verify their identity remotely, from the comfort of their own home, usually via a mobile app.

Depending on the organisation's requirements, this can be a video interview handled by a live operator, or it can be completely automated, with the customer taking photos of their ID and of themselves, and recording a video clip of themselves to submit as proof.

Video verification technologies are not restricted to financial institutions carrying out KYC checks; other industry verticals are also considering implementing them to enhance their ability to interact remotely with customers and partners in a secure manner.

However, each new technological innovation brings risks with it. Enter deepfakes.

Deepfakes

Deepfakes can be broadly described as fabricated media created through AI and/or deep learning methods.

The chart below shows examples of what could be considered a deepfake. Anything below the green line is not really a deepfake (by our definition above) because it is created largely through manual processes rather than by AI or neural networks. Moving along the x-axis, sophistication increases as we progress from static media, such as photos, to dynamic, complete videos.

[Chart: deepfake examples by sophistication]

Sites like thispersondoesnotexist.com produce very realistic images created purely by AI. There may be glitches in places, such as extra fingers or out-of-place hair, but overall they are good enough to fool most people on first viewing, especially as smaller images (e.g., social media profile pictures).

[Images: four AI-generated faces. Source: thispersondoesnotexist.com]
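
For the curious, pulling one of these generated faces down takes only a few lines of code. Below is a minimal sketch in Python using the requests library, assuming the site still serves a freshly generated JPEG directly from its root URL:

import requests

# thispersondoesnotexist.com returns a new GAN-generated face on every
# request (assumption: the root URL still serves the image directly).
resp = requests.get(
    "https://thispersondoesnotexist.com",
    headers={"User-Agent": "Mozilla/5.0"},  # some CDNs reject blank user agents
    timeout=10,
)
resp.raise_for_status()

with open("generated_face.jpg", "wb") as f:
    f.write(resp.content)

print(f"Saved {len(resp.content)} bytes of AI-generated face")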

Video Deepfakes

Of all the deepfake media, videos are perhaps the most interesting and worrying. 

Face swapping, or puppeting, is a technique in which the deepfake AI builds a 3D face model from the source photos it is fed. The model maps out the facial features and, when given a target video, maps the generated face over it.
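
To make the "maps out the features" step concrete, the sketch below uses Google's MediaPipe Face Mesh to extract the 3D facial landmarks that this kind of model-building typically starts from. It illustrates only the landmark-extraction stage, not how any particular deepfake tool is actually implemented, and the filename is a placeholder:

import cv2
import mediapipe as mp

# Load a source photo (placeholder filename for illustration).
image = cv2.imread("source_face.jpg")

# MediaPipe's Face Mesh returns 468 landmarks, each with normalised x/y
# coordinates and a relative depth (z) -- the raw material for a 3D model.
with mp.solutions.face_mesh.FaceMesh(static_image_mode=True, max_num_faces=1) as mesh:
    results = mesh.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

if results.multi_face_landmarks:
    landmarks = results.multi_face_landmarks[0].landmark
    print(f"Extracted {len(landmarks)} 3D landmarks")
    nose = landmarks[1]  # index 1 is approximately the nose tip
    print(f"Nose tip at x={nose.x:.3f}, y={nose.y:.3f}, depth z={nose.z:.3f}")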

Some of the most famous early examples were the President Obama deepfakes, in which the University of Washington's Graphics and Imaging Laboratory used an audio clip to synthesize a high-quality video of him speaking with accurate lip sync, composited into a target video clip.

[Image: President Obama deepfake example]

Source: http://grail.cs.washington.edu/projects/AudioToObama/ 

There are many other examples available in which famous actors have been superimposed over others in movie scenes.

Risks of Deepfakes

Like most new technologies, deepfakes come with their own risks. Some early iterations of the technology were used to digitally remove clothing from women, and others have been circulated to spread disinformation or fake news. There have also been reports of criminals using deepfake audio technology to fool an organisation into sending money to a bank account controlled by fraudsters.

One of the emerging concerns is whether deepfake videos can or will be used to bypass video verification. As of today, there do not appear to be any cases of deepfakes successfully bypassing video verification systems, but that doesn't mean it isn't possible, or won't be possible in the near future.

For the most part, bypassing video verification systems is a multi-step process, with video being only one part of it. First, there are usually documents, photos, and other checks that need to be completed successfully before the video stage is even reached. Second, video verification is still a growing market, which makes it unattractive to most criminals at the present time.
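
To illustrate that layering, here is a hypothetical sketch of such a pipeline in Python. Every function and field name below is invented for illustration; it is not a real vendor API:

from dataclasses import dataclass

@dataclass
class Submission:
    id_document: bytes
    selfie_photo: bytes
    video_clip: bytes

# Placeholder checks -- real implementations would call document-forensics,
# face-matching, and liveness-detection services.
def verify_document(doc: bytes) -> bool: ...
def match_selfie_to_document(selfie: bytes, doc: bytes) -> bool: ...
def verify_liveness_video(video: bytes) -> bool: ...

def run_kyc(sub: Submission) -> str:
    # An attacker must defeat every stage; a deepfake only helps at the last one.
    if not verify_document(sub.id_document):
        return "rejected: document check failed"
    if not match_selfie_to_document(sub.selfie_photo, sub.id_document):
        return "rejected: selfie does not match document"
    if not verify_liveness_video(sub.video_clip):
        return "rejected: video liveness check failed"
    return "accepted"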

Like any biometric security system, video verification will have false acceptances and false rejections. It will be up to each organisation to decide how to tune its system, depending on the risk it presents.
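
To show what that tuning looks like in practice, here is a minimal sketch of how moving a match-score threshold trades false acceptances against false rejections. The scores and the 0-to-1 scale are invented for illustration, not any specific vendor's metric:

# Match scores for legitimate customers vs. fraud attempts (illustrative).
genuine_scores = [0.91, 0.84, 0.88, 0.79, 0.95]
impostor_scores = [0.41, 0.55, 0.62, 0.71, 0.35]

def rates(threshold: float) -> tuple[float, float]:
    # False acceptance rate: impostors scoring at or above the threshold.
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    # False rejection rate: genuine customers scoring below it.
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

for t in (0.6, 0.7, 0.8):
    far, frr = rates(t)
    print(f"threshold={t:.1f}  FAR={far:.0%}  FRR={frr:.0%}")
# Raising the threshold lowers FAR but raises FRR -- a risk decision.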

This in itself gives criminals an opening for denial-of-service attacks: flooding video verification systems with deepfake videos they know will fail, purely to overload the system.
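
One standard mitigation for this kind of flooding, sketched below with illustrative parameters, is to rate-limit submissions per account or source address before any expensive video analysis runs. This is a general technique offered as a sketch, not something any particular verification vendor is confirmed to use:

import time

class TokenBucket:
    """Allow bursts of up to `capacity` submissions, refilling at `rate`/sec."""
    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# One bucket per source IP or account, checked before video analysis starts.
bucket = TokenBucket(rate=0.2, capacity=3)  # ~1 submission per 5s, burst of 3
print([bucket.allow() for _ in range(5)])   # first 3 pass, the rest are throttled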

Technical Defences

Broadly speaking, there are two ways to deal with the challenge of verifying videos and photos. The first is to look for modifications in an image or video. Forensic techniques are used to pick out whether any pixels or metadata seem altered. They can look for shadows or reflections that do not follow the laws of physics, for example, or check how many times a file has been compressed to determine whether it has been saved multiple times.
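
As a taste of the metadata side of those forensic checks, the sketch below uses the Python Pillow library to dump an image's EXIF tags and JPEG quantization tables. Re-saving an image typically rewrites those tables, so non-standard ones can hint at re-encoding; on their own these are weak signals, and real forensic tools combine many of them. The filename is a placeholder:

from PIL import ExifTags, Image

img = Image.open("submitted_photo.jpg")

# Dump the EXIF metadata; missing or inconsistent tags are a (weak) red flag.
for tag_id, value in img.getexif().items():
    tag = ExifTags.TAGS.get(tag_id, tag_id)
    print(f"{tag}: {value}")

# Pillow exposes the quantization tables of a JPEG; editors and re-encoders
# often use tables that differ from the original camera's.
if img.format == "JPEG":
    print(f"{len(img.quantization)} quantization table(s) present")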

The second method is to verify an image's integrity the moment it is taken. This involves performing dozens of checks to make sure the recording device's location data and time stamp are not being spoofed. Do the camera's coordinates, time zone, altitude, and nearby Wi-Fi networks all corroborate each other? Does the light in the image refract as it would for a three-dimensional scene? Or is someone taking a picture of another two-dimensional photo?
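
One such corroboration check, sketched below with invented sample values, tests whether the time zone implied by the claimed GPS coordinates agrees with the UTC offset the device reports. It uses the third-party timezonefinder package plus Python's standard zoneinfo module:

from datetime import datetime, timedelta
from zoneinfo import ZoneInfo
from timezonefinder import TimezoneFinder

# Invented sample values: the device claims to be in London...
claimed_lat, claimed_lng = 51.5074, -0.1278
capture_time = datetime(2020, 11, 3, 14, 30)
device_offset = timedelta(hours=5)  # ...yet reports a UTC+5 clock offset.

tz_name = TimezoneFinder().timezone_at(lat=claimed_lat, lng=claimed_lng)
assert tz_name is not None, "coordinates do not map to a known time zone"
expected_offset = capture_time.replace(tzinfo=ZoneInfo(tz_name)).utcoffset()

if expected_offset != device_offset:
    print(f"Mismatch: GPS implies {tz_name} ({expected_offset}), "
          f"device reports {device_offset} -- possible spoofing")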

There is an ever-growing number of organisations developing technologies to automate and streamline the process of validating videos that are submitted or streamed for verification. 

The Menlo Park-based nonprofit research group SRI International, after being awarded three contracts by the Pentagon's Defense Advanced Research Projects Agency (DARPA), has been developing tools capable of identifying when photos and videos have been meaningfully altered from their original state.

More recently, Microsoft launched its own tool, Microsoft Video Authenticator, which also provides the technology behind Reality Defender to validate the authenticity of videos.

Human Defence

Human intervention will still be required to address claims of false rejections, or where confidence in automated detection controls is low. Training staff is therefore vital, to help them understand what characteristics are common in deepfakes, how to spot them, and how to respond to them.

This is particularly important where further videos may be sent or streamed to provide additional information, as this is where criminals may use psychological lures to manipulate human operators.

In conclusion, there is little evidence to suggest that deepfakes are currently being used successfully to bypass video verification checks. But that does not mean it will not be possible in the future. At the same time, defensive techniques continue to evolve, so the onus will be on organisations to assess the risk, implement the right level of controls, and ensure staff are trained appropriately with new-school security awareness training.


Request A Demo: Security Awareness Training

New-school Security Awareness Training is critical to enabling you and your IT staff to connect with users and help them make the right security decisions all of the time. This isn't a one-and-done deal; continuous training and simulated phishing are both needed to mobilize users as your last line of defense. Request your one-on-one demo of KnowBe4's security awareness training and simulated phishing platform and see how easy it can be!

Request a Demo!

PS: Don't like to click on redirected buttons? Cut & Paste this link in your browser:

https://www.knowbe4.com/kmsat-security-awareness-training-demo
