The battle is on: the forces of truth against the forces of deception. With visual AI making it ever easier to fake visual content, the credibility of that content is at stake. And with it, the income of thousands upon thousands of people worldwide who depend on the credibility of visuals to thrive: newspapers, magazines, photographers, newswires, webcasters, television news, videographers, journalists, and photo agencies, among many others.
The stakes are very high. If we can no longer trust our photos and videos, we lose our primary knowledge of what is truly happening outside our immediate surroundings, and with it, the ability of democracies and societies to function properly.

Trust in accountability

We wrote in the past that the best way to beat deepfakes and manipulated images (as in, uncover their deceptive intent) is to reveal their source. If we know who created a video or photograph, we are more likely to know whether it is real. If anything, we will be alerted to its potential to deceive.

Introducing accountability via verified authorship will restore trust © Bernard-Herman/Unsplash

When we read an article, we look for the author to assess its credibility. An author with a long history of proven reliability (and a few Pulitzers to her name) will undoubtedly be trusted more than someone fresh out of college. The publication, too, will often raise or lower an author's credibility: an article written by a complete unknown but published in the New York Times will be trusted; the same article by the same author on Breitbart, much less so.

Revealing authorship also forces accountability and creates a referential history. Together, these are clear markers of trust. While deception hides in the shadows of anonymity, truth needs no filters. Enforcing authorship thus creates a higher level of credibility.

The Content Authenticity Initiative

This is precisely what the New York Times, Adobe, and Twitter set out to do with the Content Authenticity Initiative, launched in November 2019: create a framework for establishing and preserving the authorship of visual content, one that can be used by anyone from publishers to tech companies. With it, consumers will be able to identify the source of a photo or video and make an informed decision about the credibility of a file.

To establish authenticity, one has to clearly define both authorship and integrity: the file was created by this person or entity (authorship), and the record has not been tampered with (integrity). Authorship, as we have seen above, is fundamental to a file's credibility; without it, it is impossible to assess intent. Integrity shows whether and how the file was edited. It is a show of hands, publicly declaring any alteration. Any intention to deceive, like removing or replacing an object or person, can be clearly acknowledged.
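The two-part check described above can be sketched in a few lines of code. This is a toy illustration, not the CAI's actual specification: it pairs a file with a small provenance manifest (author, content hash, declared edits) and uses an HMAC as a stand-in for a real cryptographic signature. The function names and manifest fields are assumptions made for this example.

```python
import hashlib
import hmac
import json


def make_manifest(content: bytes, author: str, edits: list, key: bytes) -> dict:
    """Build a toy provenance manifest: author (authorship), a SHA-256
    hash of the file (integrity), and publicly declared edits, sealed
    with an HMAC standing in for a real signature."""
    record = {
        "author": author,
        "content_sha256": hashlib.sha256(content).hexdigest(),
        "edits": edits,  # publicly declared alterations, e.g. ["crop"]
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return record


def verify_manifest(content: bytes, manifest: dict, key: bytes) -> bool:
    """Return True only if the signature verifies (authorship claim is
    intact) AND the file hash still matches (content is untampered)."""
    claimed = dict(manifest)
    signature = claimed.pop("signature")
    payload = json.dumps(claimed, sort_keys=True).encode()
    authentic = hmac.compare_digest(
        signature, hmac.new(key, payload, hashlib.sha256).hexdigest()
    )
    untampered = claimed["content_sha256"] == hashlib.sha256(content).hexdigest()
    return authentic and untampered
```

Changing a single byte of the file breaks the hash check, and editing any manifest field (say, swapping the author's name) breaks the signature check, which is exactly the tamper-evidence the paragraph above describes.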

It seems simple enough, but there are many issues. Claiming authorship should be a choice, not an obligation. Under a totalitarian regime, authors of photos or videos will not want files associated with them, as they would risk imprisonment or death. But letting authors and creators decide if and when they declare ownership changes little about today's situation.

The same goes for integrity. Graphic designers and Photoshop artists are unlikely to publish their file-editing history, as it would amount to releasing their trade secrets. Thus, alteration history should also be an option rather than an obligation, leaving today's situation pretty much unchanged.

Authenticity should respect anonymity. Photo by Kaique Rocha from Pexels

Will it work?

For a project like the CAI to work, strong incentives need to be in place, and they could come from various sources. Publications like the NYT or Vice News could decide not to accept photos or videos that do not follow the authenticity framework. Platforms like Twitter or Facebook could down-rank posts containing non-authenticated content. Anonymous content would need to be vetted by trusted publishers.
Authenticity could also be linked to copyright and license payments, as proposed by Article 17 of the European Copyright Directive. As a framework for authorship, the CAI could be used by tech companies to redistribute revenue to creators. Even designers might change their minds if there is revenue involved.

Of our five senses, vision is by far the one we trust most for critical information. Studies show that when we receive conflicting information from our senses, for example from hearing and touch, vision is the sense we rule correct. It is our primary source of trust: if we see it, it exists; if not, it might not be real. If we can no longer believe what we see, our world will be torn apart.

Main image by Sven Lachmann/Pixabay

Author: Paul Melcher

Paul Melcher is the founder of Kaptur and Managing Director of Melcher System, a consultancy for visual technology firms. He is an entrepreneur, advisor, and consultant with a rich background in visual tech, content licensing, business strategy, and technology, and more than 20 years of experience developing world-renowned photo-based companies, including two successful exits.
