
Deepfake Videos Are Getting Better, And The Pentagon Is Worried

The Defense Department is funding research into technology able to automatically detect doctored media.

From a video of Queen Elizabeth II rapping to a clip of Tom Cruise announcing his intent to run for office, deepfake videos can be harmless fun — special effects that can make anyone appear to say anything.

Manipulated videos can also be dangerous, a form of propaganda or misinformation, and a national security threat when the words of world leaders can be believably put in their mouths.

The Defense Department has been funding the work of engineers, including Siwei Lyu at the University at Buffalo, in a high-tech arms race to outsmart media manipulators.

"This line of work is like a cat and mouse game," Lyu said.

The Pentagon's goal is to build a platform able to instantly and forensically analyze videos for the subtle hallmarks of a faker's handiwork.

For example, doctored videos often rely on images found online, and those rarely show someone with their eyes closed — so deepfakes can fail to blink naturally.

"Most of the time when we see someone close their eyes, it's not a great photograph," Lyu said.

Algorithms also compare slight differences in movement between the head and the altered face, as well as how light reflects in a person's eyes.

In a real image, light will bounce back the same way in each eye.

Not so for a lot of deepfakes.

Media manipulation is getting better and more believable.

Developers are also working on ways to embed authentication codes in media, serving the same reassuring function as a blue check on Twitter or a watermark on a $100 bill.

"If it's just video of me talking to you, nobody cares, right? That's not a big deal," said Nasir Memon, vice dean and professor at NYU Tandon School of Engineering. "But if it's a video captured by law enforcement using a body camera, we care. People's lives matter based on that video."

Even a low-tech forgery, a shallow fake, can be convincing.

Somebody simply slowed down video of House Speaker Nancy Pelosi to make it appear she was slurring her words.

It was not a deepfake but a so-called cheap fake, yet it was still viewed millions of times.

Lyu said that, as with the pandemic, our own behavior can help stop the spread of falsified media.

"Don't get swayed by a really sensational, interesting video and just watch it for a half a second," Lyu said. "We need to be very careful about what we are seeing, and you know, be vigilant."