Saturday, Sept. 21, 2024
The Observer

Lecture explores deepfakes, media manipulation

Matthew Turek, a program manager in the Department of Defense’s Defense Advanced Research Projects Agency (DARPA) Information Innovation Office, discussed media manipulation and strategies for detecting falsified media in a lecture Friday, part of the Ten Years Hence speaker series sponsored by the O’Brien-Smith Leadership Program.

Turek holds a doctorate in computer science from Rensselaer Polytechnic Institute and has contributed to many DARPA and Air Force Research Laboratory efforts. His primary interests include computer vision, artificial intelligence and machine learning.

“We’ve seen really rapid growth over the years in automated techniques for creating and for manipulating media,” Turek said.

In a slideshow, he illustrated this point by showing how technology for creating entirely synthetic, computer-generated images of human faces, now nearly indistinguishable from the real thing, has improved dramatically over the past seven years.

Turek also highlighted a media manipulation technique called “deepfaking.” In this process, the manipulator trains a pair of computer programs called neural networks on two sets of footage: footage of the person being “faked,” often a celebrity or politician, and footage of an actor whose facial expressions and movements will be mapped onto that person’s face. In simple terms, deepfaking can make it look as if the targeted person did or said something that they, in reality, did not.
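
Turek’s description is high-level; as a rough illustration of the idea, and not of any DARPA or Hollywood pipeline, the sketch below shows the autoencoder-style recipe popularized by hobbyist face-swap tools: one shared encoder and a separate decoder per person, with the decoders swapped at inference time. All shapes, data and training settings here are placeholders.

```python
# Minimal sketch of the autoencoder-style face swap commonly called a "deepfake".
# Placeholder data and shapes; real pipelines add face detection, alignment,
# adversarial losses and far larger networks.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Shared encoder: compresses a 64x64 RGB face crop into a latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Per-identity decoder: reconstructs a face from the shared latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )
    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder = Encoder()
decoder_target = Decoder()  # trained only on the person being faked
decoder_actor = Decoder()   # trained only on the actor driving the expressions

opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_target.parameters())
    + list(decoder_actor.parameters()),
    lr=1e-4,
)
loss_fn = nn.L1Loss()

# Stand-in batches; a real run would load aligned face crops of each person.
target_faces = torch.rand(8, 3, 64, 64)
actor_faces = torch.rand(8, 3, 64, 64)

for step in range(100):  # placeholder training loop
    # Each decoder learns to reconstruct only its own person from the shared code.
    loss = loss_fn(decoder_target(encoder(target_faces)), target_faces) \
         + loss_fn(decoder_actor(encoder(actor_faces)), actor_faces)
    opt.zero_grad()
    loss.backward()
    opt.step()

# The swap: encode the actor's expression, decode it as the target's face.
with torch.no_grad():
    fake_target_frame = decoder_target(encoder(actor_faces[:1]))
```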

To illustrate the danger deepfakes pose, Turek presented an example where computer scientists at Stanford were able to edit the mouth movements of the speaker in a video of a financial report to make it seem as if Apple’s stock price dropped from $191.45 to $182.25. 

“If that had been a real attack, that would reflect a $40 billion hit to Apple’s market cap,” Turek said.

Deepfakes are one of the more sophisticated forms of media manipulation Turek discussed. He said even everyday tools like Photoshop and GIMP can be used to manipulate photos and videos in extremely convincing ways.

Turek said popularity is the primary motivation for individuals or organizations to use media manipulation techniques.

“It creates media that’s more likely to go viral … and to get spread,” he said.

He gave an example of a verified Chinese government social media account that shared a digitally created image manipulated to show an Australian soldier putting a knife to the throat of a young Afghan girl.

“That’s created media,” Turek said. “And a Chinese government account is using it to amplify their message.”

It is now easier than ever for everyday people to create manipulated media. Creating something as sophisticated as a deepfake video, Turek said, requires nothing more than a high-end gaming computer with a graphics processing unit. 

“Things like deepfakes … are really making it easier for low-skilled individuals to potentially create compelling manipulation,” Turek said.

He added that billions of pieces of content are being uploaded to social media every day. This sheer amount of content combined with the ever-increasing ease of media manipulation techniques poses an immense challenge to media authentication, Turek said.

“[There are] thousands of software tools that in some way can create or manipulate media versus handfuls of tools that were available to help authenticate media,” Turek said. 

In response to the rise in media manipulation, DARPA has taken on many technologically complex projects to identify manipulated media.

Most notably, DARPA’s Media Forensics program invested in developing a quantitative integrity score for the authenticity of images and videos. 

“We framed our approaches in the Media Forensics program … around three levels of integrity,” Turek said. “Digital, physical and semantic integrity.”

He said digital integrity measures can identify irregularities in an image, such as replicated pixels and blurred edges, that indicate a portion of pixels has been copied and pasted using a media manipulation tool like Photoshop. However, Turek said skilled media manipulators use various techniques to make such digital irregularities much more difficult to detect.
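
One of the simplest digital-integrity checks Turek alludes to, spotting copied-and-pasted pixel regions, can be sketched in a few lines. The snippet below is an illustrative toy, not a DARPA tool: it hashes fixed-size blocks of a grayscale image and flags exact duplicates, whereas real copy-move detectors compare blocks in a transform or keypoint space so that recompression and blurring cannot hide the clone.

```python
# Toy "digital integrity" check: find duplicated (copy-moved) blocks in an image.
import numpy as np

def find_duplicate_blocks(gray_image: np.ndarray, block: int = 16):
    """Return pairs of top-left corners whose pixel blocks are byte-identical.

    Exact matching only, so uniform regions (e.g. clear sky) also collide;
    production forensics tools use far more robust features.
    """
    h, w = gray_image.shape
    seen = {}      # block bytes -> (row, col) where that block first appeared
    matches = []
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            key = gray_image[r:r + block, c:c + block].tobytes()
            if key in seen:
                matches.append((seen[key], (r, c)))
            else:
                seen[key] = (r, c)
    return matches

# Example: clone one region of a synthetic image onto another spot.
img = np.random.randint(0, 256, size=(128, 128), dtype=np.uint8)
img[64:80, 64:80] = img[0:16, 0:16]   # simulate a copy-paste edit
print(find_duplicate_blocks(img))     # flags the duplicated block pair
```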

“So, you need other approaches like physical integrity,” Turek said. “For physical integrity, we’re really looking for indications that the laws of physics have been violated.”

Such physical clues include objects that do not interact with their environment in a normal way, Turek said, such as a boat not leaving a wake in the water.

Turek said the third level, semantic integrity, can be tested by using an image’s timestamp and location to identify inconsistencies.

“For an image that’s taken outside,” Turek explained, “if you know approximately where and when that image was taken, you can estimate the sun angle and see if it’s consistent with that image, or you can look to see if the weather is consistent with what you know about that location and time.”
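
As a back-of-the-envelope version of that sun-angle check, not DARPA’s implementation, the sketch below estimates the solar elevation for a claimed place and time using a standard textbook approximation and compares it with the elevation implied by shadows in the photo; the coordinates, timestamp, “observed” angle and tolerance are invented for illustration.

```python
# Rough sun-angle consistency check for a photo's claimed time and place.
# Simple approximation (no equation-of-time correction): treat results as
# rough estimates, not forensic-grade evidence.
import math
from datetime import datetime, timezone

def solar_elevation_deg(lat_deg: float, lon_deg: float, when_utc: datetime) -> float:
    """Approximate solar elevation angle in degrees for a location and UTC time."""
    day_of_year = when_utc.timetuple().tm_yday
    # Approximate solar declination for this day of the year.
    decl = math.radians(-23.44) * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    # Local solar time, ignoring the equation of time (minutes-level error).
    solar_hour = when_utc.hour + when_utc.minute / 60.0 + lon_deg / 15.0
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))
    lat = math.radians(lat_deg)
    sin_elev = (math.sin(lat) * math.sin(decl)
                + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    return math.degrees(math.asin(sin_elev))

# Hypothetical claim: photo taken outdoors in South Bend, Indiana, at 18:00 UTC.
claimed = solar_elevation_deg(41.70, -86.24, datetime(2024, 9, 20, 18, 0, tzinfo=timezone.utc))
observed = 25.0  # elevation implied by shadow lengths in the image (assumed value)
print(f"claimed sun elevation ~{claimed:.1f} deg, observed ~{observed:.1f} deg")
if abs(claimed - observed) > 10.0:  # arbitrary tolerance for this sketch
    print("Inconsistent: the image may not match its claimed time and place.")
```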

Turek concluded by emphasizing DARPA’s main objective.

In the recent past, he explained, only those with access to Hollywood-grade equipment could convincingly manipulate media, but ever-improving technologies have lowered the bar to entry to anyone with a computer and access to the Internet.

“What we’re hoping to do with these defensive technologies is essentially raise that bar back up,” Turek said. “We want to take those sort of easy capabilities off the table again.”