How deep fake technology makes phony videos look real: Internet Scambusters #829
As the name implies, "deep fake" videos are phony online videos that are so good most people would be taken in by them.
This is a looming threat to every one of us who uses the Internet as a source of information - about celebrities, politicians, and even friends and families.
In this week's issue, we'll explain the shocking implications of this technological scam and give some clues that may help you spot a deep fake video.
Let's get started...
Deep Fake Videos Threaten Turmoil for All Users
Worries about the way computer geeks can secretly change what we see and hear online have brought to the surface a new and potentially devastating scam, known as deep fake, or deepfake (all one word).
It's a form of advanced artificial intelligence that enables those who use it to create realistic-looking videos of political and celebrity personalities -- or anyone they care to, for that matter -- saying or doing outrageous things.
The results are so realistic that only an expert can readily detect the fraudulent videos, meaning that very soon the rest of us won't be able to tell whether what we're watching is genuine or not. That leaves us open to all manner of con tricks.
In very simple terms, deepfake technology lets a person speak while their words, lip movements, and facial expressions are instantly transferred onto an animated image of someone else's face.
In some cases, the software will also change the voice of the speaker (referred to as the "source") into an exact replica of the voice of the other person (known as the "target").
In another deep fake variation, the face of one person can undetectably be superimposed onto the body of another. This trick has been used for a widespread, though harmless, series of fakes in which the face of actor Nicolas Cage has been superimposed on characters from many other movies.
Although the explanation is somewhat technical, you can see a video of how one person's expressions can be transferred to another.
In a recent report carried by news channel CNN, Bobby Chesney, a Texas law professor and deepfake researcher, said: "The opportunity for malicious liars is going to grow by leaps and bounds."
Destroying Reputations
In September, Fortune magazine painted a doomsday scenario, declaring: "The fear is that deep fakes could unduly destroy reputations and even set off unrest."
Imagine, the magazine said, falsified videos depicting a presidential candidate misbehaving, a police chief inciting violence, or soldiers committing war crimes.
"High-profile individuals such as politicians and business leaders are especially at risk, given how many recordings of them are in the public domain."
Social media networks like Facebook and Twitter, where many of the deep fake videos turn up, are trying to tackle the problem -- but it's tough for them to keep up with the onslaught, because identifying a fake means reviewing every aspect of a video.
And, unfortunately, experts are now saying that within the next few months even they won't be able to tell the real thing from a fake, although the U.S. government's Defense Advanced Research Projects Agency (DARPA) is partway through developing technology to identify doctored videos and images.
For example, advanced software can detect a person's pulse rate in a video. In fakes, the pulse showing in a target's faked head can be significantly different from the pulse elsewhere in their real body.
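To give a rough idea of how a pulse can be read from video at all: the skin's color shifts very slightly with each heartbeat, so averaging the green channel of a face region over many frames and finding the dominant frequency in the human heart-rate band yields an estimate. The sketch below illustrates this idea on synthetic data; the function name and the exact frequency band are illustrative choices, not DARPA's actual method.

```python
import numpy as np

def estimate_pulse_bpm(green_means, fps):
    """Estimate a pulse rate (beats per minute) from the average green-channel
    intensity of a face region over time, by finding the strongest frequency
    in the plausible human heart-rate band (0.7-4 Hz, i.e. 42-240 bpm)."""
    signal = np.asarray(green_means, dtype=float)
    signal = signal - signal.mean()            # remove the constant brightness
    spectrum = np.abs(np.fft.rfft(signal))     # frequency content of the flicker
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)     # keep heart-rate frequencies only
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0                         # Hz -> beats per minute

# Synthetic demo: a faint 72 bpm (1.2 Hz) flicker sampled at 30 frames/second
rng = np.random.default_rng(0)
fps = 30
t = np.arange(0, 10, 1.0 / fps)
green = 100 + 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.1 * rng.standard_normal(len(t))
print(round(estimate_pulse_bpm(green, fps)))   # prints 72
```

A real detector would compare estimates from the face and from another visible skin region (a hand, say) and flag a video where the two disagree.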
Seven Tips
Until a solution arrives, it's up to us to do our best to spot other tell-tale signs. Here are a few tips to follow:
1. If the video shows someone saying or doing something that seems outrageous and out-of-character -- swearing for example -- that might indicate a deep fake.
For instance, a fake video of President Barack Obama, which has been viewed on YouTube almost 5 million times, has the ex-President seeming to use obscenities, which he hasn't been known to do in public.
2. Check the blinks. We all blink far more often than we realize, so if the person in the video doesn't blink at all, or goes 30 seconds or more without blinking, that could suggest a fake.
3. Look too for unusual, jerky movements of the person's head.
4. Sometimes, the voice drifts out of synchronization with the individual's lip movements, then snaps back in. This wouldn't happen with a regular video, where the whole recording would be consistently either in or out of sync.
5. Is it very short? Deep fakes are generally only a few seconds or minutes long at most because of the complexity of creating them.
6. At the very least, be skeptical and try to confirm controversial statements or other behaviors you see by checking elsewhere on the Internet.
7. Check who is "hosting" the video. If it's on a respectable news site, it's less likely to be a fake. But make sure you haven't been tricked into visiting a fake lookalike site!
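The blink check in tip 2 is also the basis of some automated detectors. One common approach computes an "eye aspect ratio" from facial landmark points: the ratio collapses when the eyelids close, so a dip below a threshold marks a blink, and a video with implausibly few dips is suspect. The sketch below works on hypothetical landmark coordinates (in practice they would come from a face-tracking library); the threshold value is an illustrative assumption.

```python
import math

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmark points around one eye. The ratio of the eye's
    vertical openings to its horizontal width drops sharply when it closes."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

def count_blinks(ear_series, threshold=0.2):
    """Count the frames where the eye aspect ratio first dips below the
    threshold (each dip-and-recover cycle counts as one blink)."""
    blinks, closed = 0, False
    for ear in ear_series:
        if ear < threshold and not closed:
            blinks, closed = blinks + 1, True
        elif ear >= threshold:
            closed = False
    return blinks

# Synthetic demo: an open eye, two frames of a closed eye, then open again
open_eye   = [(0, 0), (1, -2), (3, -2), (4, 0), (3, 2), (1, 2)]
closed_eye = [(0, 0), (1, -0.2), (3, -0.2), (4, 0), (3, 0.2), (1, 0.2)]
ears = ([eye_aspect_ratio(open_eye)] * 5 + [eye_aspect_ratio(closed_eye)] * 2
        + [eye_aspect_ratio(open_eye)] * 5)
print(count_blinks(ears))  # prints 1
```

A human spot-check works the same way, just without the arithmetic: watch the eyes and count.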
The worry however goes much deeper. Deepfakes threaten to undermine public trust in all news sources.
As CNN said: "... it's not just about individual videos that will spread misinformation: it's also the possibility that videos like these will convince people that they simply can't trust anything they read, hear or see unless it supports the opinions they already hold."
Sadly, programs and apps that enable any user to create deep fakes are now available to the general public.
Though not as sophisticated as the more advanced programs used by crooks and experts, they could still be used to create malicious, compromising videos of individuals you know.
So make sure you apply the deepfake-spotting tips mentioned here and adopt healthy skepticism of things that are calculated to anger or upset you... until you know the real truth.
Alert of the Week
A couple of weeks back, we wrote about new credit freeze rules that have come into force, enabling consumers to take easier and faster action to halt access to their records with the credit reporting agencies.
The change apparently has given rise to a number of questions, which Federal Trade Commission (FTC) attorney Lisa Weintraub Schifferle answers.
That's it for today -- we hope you enjoy your week!