Lockheed Martin, Google in DARPA Team for Automatic Identification of Falsified Media

  • Our Bureau
  • 07:35 AM, March 3, 2021
SemaFor DARPA project to detect falsified media

DARPA on Monday announced four research teams selected for the Semantic Forensics (SemaFor) program to develop technologies capable of automating the detection of falsified media assets.

The four research teams will focus on developing three specific types of algorithms: semantic detection, attribution, and characterization algorithms. These will help analysts understand the “what,” “who,” “why,” and “how” behind manipulations as they filter and prioritize media for review, a U.S. Defense Advanced Research Projects Agency (DARPA) release said.

The teams will be led by Kitware, Inc., Purdue University, SRI International, and the University of California, Berkeley. The semantic detection algorithms will seek to determine whether a media asset has been generated or manipulated. Attribution algorithms will aim to automate the analysis of whether media comes from where it claims to originate, and characterization algorithms seek to uncover the intent behind the content’s falsification.

Lockheed Martin Advanced Technology Laboratories will lead the research team developing technologies for automatically assembling and curating the evidence produced by the detection, attribution, and characterization algorithms, and will develop a prototype SemaFor system.

Google/Carahsoft will provide perspective on disinformation threats to large-scale internet platforms, while NVIDIA will provide media generation algorithms and insights into the potential impact of upcoming hardware acceleration technologies.

New York University provides a connection to the NYC Media Lab and a broad media ecosystem, offering insights into the evolving media landscape and how it could be exploited by malicious manipulators. In addition, Accenture Federal Services (AFS) provides evaluation, connectivity, and operational viability assessment of SemaFor as applied to the Department of State's Global Engagement Center, which has taken the lead on combating overseas disinformation.

Finally, to ensure the tools and algorithms in development have ample and relevant training data, researchers from PAR Government Systems have been selected to lead data curation and evaluation efforts on the program. The PAR team will be responsible for carrying out regular, large-scale evaluations measuring the performance of the capabilities developed on the program.

Media manipulation capabilities are advancing at a rapid pace while also becoming increasingly accessible to everyone from at-home photo editors to nation-state actors. As the technology evolves, so does the national security threat posed by compelling media manipulations.

“From a defense standpoint, SemaFor is focused on exploiting a critical weakness in automated media generators,” said Dr. Matt Turek, the DARPA program manager leading SemaFor. “Currently, it is very difficult for an automated generation algorithm to get all of the semantics correct. Ensuring everything aligns from the text of a news story, to the accompanying image, to the elements within the image itself is a very tall order. Through this program we aim to explore the failure modes where current techniques for synthesizing media break down.”
