
The disturbing truth behind The Capture and real life deepfakes

[Film still] The dystopian technology used in the plot of The Capture is not far from reality. Credit: Nick Wall

Imagine watching CCTV footage of a man kidnapping his female companion and stuffing her into his car. Would you have any reason to doubt the video is real?

That's the conundrum facing viewers of BBC One's hit surveillance thriller The Capture, which follows soldier Shaun Emery after he is accused of a crime based on damning video footage doctored by the secret services.

The Capture's plot may seem fanciful, but the technology described in the series is available today – and its potential is terrifying.  

Deepfakes have existed for years, allowing people to modify videos and photos to create footage that appears to show someone doing or saying things they never did.

Increasingly, the technology allows more realistic “deepfakes” to be created, using deep learning, a form of artificial intelligence, to produce fake video or audio that is almost indistinguishable from the real thing.

One of the first demonstrations of the power these videos could wield was a viral clip depicting former US President Barack Obama, created using simple video editing software and a face-swapping tool called FakeApp.

The video, released by actor Jordan Peele and Buzzfeed, showed the politician seemingly calling Donald Trump “a total and complete dip****”.

“Most people assume it takes some processing time and no small amount of skill to make them [deepfakes] convincing. Not so,” says Alan Woodward, professor at the University of Surrey.

“We’ve seen not just the manipulation of video using AI models applied directly to real-time streams now, but the same with audio. 

“You’d need expert examination in court to disprove it as the usual assumption is that the camera never lies – which is no longer true.”

Seeing is no longer believing

In The Capture, the deepfake video relies on three different things: a mole on the inside (in this case, the barrister who ensures the target is in the right place at the right time), the use of an actor of similar height and build to simulate the motions of the doctored video, and the passing of a bus to insert the fake footage into the CCTV system.

This scenario with actors and face-imposition, experts claim, is far easier to recreate than trying to live-doctor the entire scenario from scratch. 

“If you have an actor that is playing that role and you are looking to do some kind of face swap or minimal edits, then you are looking at less of a procedure,” argues Henry Ajder of Deeptrace. “That could be something that actually happens.”

The fact that CCTV footage is often low resolution and grainy could make the fakery less onerous. The majority of the almost 500,000 CCTV cameras in London alone, for example, still produce low resolution imagery, and many were found to still use default passwords such as 123456.

We have not yet seen examples of this kind of sabotage happening in real life, but it is not out of the realm of possibility for the future, Ajder says.  

The Capture may still sound like the realm of fiction, but it draws on technology being developed now. “Those kinds of scenarios are not possible right now – at least not by a common criminal – but could be by a state actor.”

As it turns out in the series, the bad actors were state-sponsored spies.

Shamir Allibhai, chief executive of video verification company Amber, believes that footage from body cameras and CCTV, which are increasingly used as evidence in court, could be a prime target for hackers looking to frame people for crimes that they did not commit or cast doubt on the guilt of others up for trial. 

“It is as easy as putting an Instagram filter on a photo or video. It is becoming available to everyone without the tech knowledge to do that,” he says. 

“I wouldn’t say unequivocally you should doubt everything yet in 2019, but we should be well aware that this tech exists and in the near future we should question and doubt all video footage and audio recordings especially if it hasn't been authenticated.”

Can we still trust video footage?

Spotting the fakes usually involves one of two things: either using artificial intelligence to spot the “glitches” common to other deepfakes, or using data collected about the video or audio when it was first recorded to tell whether it has since been edited.
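The second approach, verifying footage against data captured at recording time, can be sketched in a few lines. The snippet below is a minimal illustration only, not any vendor's actual system: it assumes a hypothetical recording device that holds a secret key and tags each clip with an HMAC the moment it is captured, so that any later edit to the bytes fails verification. All names here (SECRET_KEY, sign_recording, is_authentic) are invented for the example.

```python
import hashlib
import hmac

# Hypothetical device key; in practice this would live in secure
# hardware on the camera, never in the footage itself.
SECRET_KEY = b"camera-device-key"

def sign_recording(data: bytes) -> str:
    """Compute an authentication tag over the raw footage bytes at capture time."""
    return hmac.new(SECRET_KEY, data, hashlib.sha256).hexdigest()

def is_authentic(data: bytes, tag: str) -> bool:
    """Check footage presented later against the tag made at capture time."""
    return hmac.compare_digest(sign_recording(data), tag)

original = b"\x00\x01raw-footage-bytes"
tag = sign_recording(original)

assert is_authentic(original, tag)             # untouched clip verifies
assert not is_authentic(original + b"X", tag)  # any edit breaks the tag
```

The design choice worth noting is that this scheme only proves the bytes are unchanged since signing; it says nothing about whether the scene in front of the camera was staged, which is exactly the gap The Capture's plot exploits.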

It is unlikely that we will have fakes so good that systems cannot detect them on any level, Ajder argues, but audiovisual media could lose some of its power as a source of truth in the future. 

“It is naive to imply that detection will be the silver bullet,” he says. “What it can do is raise the bar so that 99pc of videos will be detected, and that provides a piece of the puzzle for building a solution set.”

However, Allibhai has previously claimed that deepfake material, including fabricated evidence, will soon become so realistic that it could land people in jail.

It would be up to courts to use expert examination to disprove such footage, since the usual assumption that the camera never lies no longer strictly holds.

“Seeing (and hearing) is no longer believing, if recordings are not authenticated, and that will need a mindset shift,” he says.  

“This deepfake technology is advancing; it is getting faster and higher quality. Each and every single day there are bad actors with malicious intent working on this.”

The final episode of The Capture will be shown on BBC One on Tuesday 8 October.