
Fact from Fiction

By Howard Feldman, Head of Marketing & People at Synthesis

It’s getting more and more difficult to separate fact from fiction. A few years ago, my mother-in-law would send me things that I knew without question were false. The date was wrong, the site or source was not legitimate, or the image was so badly altered that within seconds of looking at it, I would be able to tell her to stop worrying. I could assure her with confidence that no one was going to throw eggs on her windscreen from a bridge on the highway in order to abduct her dog for ransom. Especially as she doesn’t have one.

This is no longer so simple. I spend three hours a day on radio and write a number of columns a week. Over the last few months, I have found it increasingly difficult to verify information. The Tembisa 10 – the story of the alleged decuplet birth – is a case in point. It was accepted as fact and reported around the world. But it was fiction, and rightly condemned by SANEF, the South African National Editors’ Forum.

And that was a story. A tale that we believed. What about the deliberate attempt to mislead with the use of deepfakes? To understand this better, I asked the Intelligent Data team at Synthesis Software Technologies to assist.

First up, I checked in with Marais Neethling to get the basics. ‘Deepfaking is the process of creating a realistic and believable forgery of a photograph or video, usually depicting people.’ Archana Arakkal put it this way: ‘Deepfakes fall under the realm of deep learning, which enables specific algorithms to create fake images and videos that human beings cannot distinguish from authentic ones.’

In other words, if you see a video of Joe Biden standing on the White House lawn speaking about a recent trip to Saturn, you will only know it’s fake because no one has been there yet. The likeness and the imagery will be so real that it will be virtually impossible to tell otherwise.

But much as criminals leave DNA at a crime scene, deepfakes leave clues behind. ‘Just like AI is used to create the deepfakes, it can be used to detect the “fingerprint” of deepfake generator algorithms, left behind in the forged images,’ says Neethling.

The subject of deepfakes is important not only to providers of news but also to social media platforms. A recent Eyewitness News article reported on a Facebook announcement that its new software runs deepfakes through a network to search for imperfections left during the manufacturing process, which the scientists say alter an image’s digital “fingerprint”.

‘In digital photography, fingerprints are used to identify the digital camera used to produce an image,’ the scientists said. ‘Similar to device fingerprints, image fingerprints are unique patterns left on images that can equally be used to identify the generative model that the image came from. Our research pushes the boundaries of understanding in deepfake detection,’ they said.
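To make the fingerprint idea concrete, here is a deliberately simplified sketch. It is not Facebook’s or anyone’s actual detector: real systems extract two-dimensional noise residuals with neural networks, whereas this toy reduces an “image” to a list of pixel values, assumes a hypothetical generator that leaves a periodic additive pattern behind, and checks whether the image’s high-frequency residual correlates with that known pattern.

```python
# Toy illustration of fingerprint matching, reduced to one dimension.
# The generator, its fingerprint, and all values below are hypothetical.

def residual(pixels, window=3):
    """High-frequency residual: each pixel minus a local moving average."""
    n = len(pixels)
    out = []
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        out.append(pixels[i] - sum(pixels[lo:hi]) / (hi - lo))
    return out

def correlation(a, b):
    """Normalised correlation between two equal-length signals."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db) if da and db else 0.0

# A hypothetical generator leaves a faint spike every fourth pixel.
fingerprint = [0.5 if i % 4 == 0 else 0.0 for i in range(64)]
clean = [3.0] * 64                      # a featureless authentic image
forged = [c + f for c, f in zip(clean, fingerprint)]

# The forged image's residual echoes the known fingerprint;
# the clean image's residual does not.
print(correlation(residual(forged), fingerprint))   # close to 1
print(correlation(residual(clean), fingerprint))    # 0.0
```

The same comparison, run against a library of fingerprints from known generators, is what lets researchers say not just “this is forged” but which model likely forged it.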

Microsoft late last year unveiled software that can help spot deepfake photos or videos, adding to an arsenal of programs designed to fight the hard-to-detect images ahead of the US presidential election. The company’s Video Authenticator software analyses an image or each frame of a video, looking for evidence of manipulation that could be invisible to the naked eye.

According to Arakkal, it is important to be able to identify deepfakes given the rise in malicious use of these technologies. ‘The malicious use of deepfakes poses a general threat to security and privacy; an example of such a case is generating fake satellite imagery to confuse military analysts.’

She explained further that several studies are running concurrently, each aiming to tackle the challenge of detecting deepfakes using AI. One approach relies on specific features of a video or image that tend to give deepfakes away, such as unnatural eye blinking or inconsistent head pose.
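The eye-blinking cue can be sketched in a few lines. This is a minimal illustration, not any published detector: it assumes we already have a per-frame “eye openness” signal from a landmark tracker, and all thresholds (blink cutoff, expected blink rate) are hypothetical values chosen for the example. Early deepfake generators often failed to reproduce natural blinking, which is the weakness this exploits.

```python
# Minimal sketch: flag a video as suspicious if its subject blinks
# implausibly rarely. Thresholds are illustrative assumptions.

def count_blinks(eye_openness, closed_threshold=0.2):
    """Count blinks in a per-frame eye-openness signal (1.0 = fully open).
    A blink is a contiguous run of frames below the threshold."""
    blinks, in_blink = 0, False
    for value in eye_openness:
        if value < closed_threshold and not in_blink:
            blinks += 1
            in_blink = True
        elif value >= closed_threshold:
            in_blink = False
    return blinks

def looks_suspicious(eye_openness, fps=25, min_blinks_per_minute=5):
    """Humans blink roughly 15-20 times a minute; far fewer is a red flag."""
    minutes = len(eye_openness) / fps / 60
    if minutes == 0:
        return False
    return count_blinks(eye_openness) / minutes < min_blinks_per_minute

# Synthetic example: one minute of video at 25 fps with only two blinks.
signal = [1.0] * 1500
for start in (200, 900):
    for i in range(start, start + 5):
        signal[i] = 0.1

print(looks_suspicious(signal))  # True: two blinks a minute is too few
```

A real system would combine many such cues – blink rate, head pose consistency, lighting, the fingerprints above – rather than trusting any single one.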

In other words, it seems to take one to know one. And takes one to catch one.

We haven’t heard the last of deepfakes. They will be with us for a while and will undoubtedly need a universally recognised stamp to indicate that the message is one. If not, not only will the world become an unbearably confusing place, but in no time at all I will be receiving messages from my mother-in-law, who will be thrilled that Cyril Ramaphosa took the time to send her video birthday wishes.
