Deepfakes

Coalition of tech and media giants including Adobe, the BBC, Intel and Microsoft releases new standard to combat deepfakes

A new standard finally aims to protect content creators and editors against deepfakes. A coalition of tech companies has released the first version of technical specifications for digital provenance. The end-to-end standard, announced by the Coalition for Content Provenance and Authenticity (C2PA), will act as a deterrent to deepfakes and other misinformation.

C2PA finds a way to tackle deepfakes

The Coalition for Content Provenance and Authenticity (C2PA), which counts Adobe, Arm, Microsoft, Intel, Truepic and the BBC among its members, has a clever solution to the growing problem of deepfakes that has affected everyone from Tom Cruise and Barack Obama to Mark Zuckerberg and US House Speaker Nancy Pelosi.

The specification allows content creators and editors to selectively disclose information about who created or changed a piece of digital content, along with a methodology for recording how that content has been altered. The C2PA says platforms will be able to specify which provenance information is associated with each type of asset, whether an image, video, audio clip or document. Platforms will also be able to define how that information is presented and stored, and how evidence of tampering can be identified.
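To make the idea concrete, here is a minimal sketch of what a provenance record of this kind might look like: an asset hash, a creator, a list of edit actions, and a signature that lets tampering be detected. The field names, the demo signing key and the helper functions are hypothetical illustrations, not the actual C2PA manifest schema or signing process, which relies on certificate-based signatures.

# Hypothetical sketch of a provenance record: who created an asset, what edits
# were made, and a signed digest that exposes tampering. Not the real C2PA format.
import hashlib
import hmac
import json

SIGNING_KEY = b"demo-secret"  # placeholder key for this sketch; real systems use certificates

def make_provenance_record(asset_bytes: bytes, creator: str, actions: list) -> dict:
    """Bundle an asset hash, its creator and a list of edit actions, then sign the bundle."""
    record = {
        "asset_sha256": hashlib.sha256(asset_bytes).hexdigest(),
        "created_by": creator,
        "actions": actions,  # e.g. [{"action": "cropped", "by": "Editor A"}]
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify_provenance_record(asset_bytes: bytes, record: dict) -> bool:
    """Check that the asset still matches the record and that the record itself was not altered."""
    claimed = dict(record)
    signature = claimed.pop("signature")
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return (
        hmac.compare_digest(signature, expected)
        and claimed["asset_sha256"] == hashlib.sha256(asset_bytes).hexdigest()
    )

if __name__ == "__main__":
    image = b"...raw image bytes..."
    rec = make_provenance_record(image, "Photographer X", [{"action": "colour-corrected", "by": "Editor Y"}])
    print(verify_provenance_record(image, rec))          # True: asset and record are intact
    print(verify_provenance_record(image + b"!", rec))   # False: the asset was modified

The point of the sketch is only the workflow the specification describes: provenance claims travel with the asset, and any later change to the asset or to the claims themselves becomes detectable.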

Andrew Jenks, C2PA Chair and Principal Program Manager, Microsoft, says, “Today marks a watershed moment in trusted media online. The C2PA specification delivers an interoperable standard for media provenance across diverse digital ecosystems, providers, and markets. It enables technical solutions to help address problems of digital trust that challenge each of us on a daily basis.”

“We have long believed that secure media provenance is the best way to relay high-integrity, authentic digital content online. An open standard in which any platform, website, app, or organisation can ingest, preserve, and publish that content to consumers will be critical to achieving trust at internet scale. At a time of unprecedented fraud & visual disinformation, we are honoured to deploy the first C2PA-compliant mobile camera,” says Jeff McGregor, CEO of Truepic.

Deepfakes on the rise

Deepfakes are on the rise and humans are having a hard time distinguishing the fake from the real. Last year, NVIDIA CEO Jensen Huang delivered a keynote standing in his kitchen. While there were some irregularities in the presentation, NVIDIA later confirmed that part of the keynote was presented by a computer-generated likeness of Huang.

The company behind some of the most powerful graphics processors confirmed that the kitchen, the CEO and even his leather jacket were entirely computer-generated. Huang has presented so many events virtually that even eagle-eyed viewers could not distinguish the real Huang from his CGI avatar. That presentation meant no harm, but there have been others with malicious intent.

In 2018, a deepfake of former US President Barack Obama did the rounds in which he was seen discussing fake news and criticising then-President Donald Trump. The video of Nancy Pelosi, the Speaker of the US House of Representatives, may not qualify as a deepfake, but the clip, slowed down by 25 per cent with its pitch altered, showed an early starting point for those with malicious intent.

When VFX artist Chris Ume released deepfake videos of Tom Cruise, nearly eight out of ten viewers failed to spot them as fake. This inability to distinguish real from fake, even when viewers were told the videos might be fake, showed the real harm that deepfakes could cause and heightened the conversation around combating them.

Facebook even partnered with Michigan State University to create a reverse-engineering research method to detect and attribute deepfakes. The system, according to Forbes, uses fingerprint estimation to predict the network architecture behind a fake. Now, the C2PA specification shows the industry can work together to mitigate the harm that deepfakes could cause to society.

However, it is not clear whether content creators will adopt the C2PA specification. Implementation will remain a challenge, and it is not immediately clear whether the general public will understand the provenance signals even if content creators incorporate them.

Leonard Rosenthol, Chair of the C2PA Technical Working Group and Senior Principal Scientist, Adobe, says, “As the C2PA pursues the implementation of open digital provenance standards, broad adoption, prototyping and communication from coalition members and other external stakeholders will be critical to establish a system of verifiable integrity on the internet.”
