In an era of scams and fake news, so-called “deepfakes” are the latest assault on our relationship with reality.
If you’re unfamiliar, deepfakes are videos generated with help from artificial intelligence that show a recognizable figure (Barack Obama or Mark Zuckerberg, for example) saying things they’ve never actually said. By putting false words in the mouths of prominent, powerful people, deepfakes pose a perceived threat to purportedly true information.
But if there’s one thing we know about blockchain, the database technology underpinning popular cryptocurrencies like Bitcoin and Ethereum, it’s that it excels at verifying and confirming what is real. Did Alice actually send Bob $100 of crypto? That’s an easy confirmation to make with access to an open blockchain.
Now a 72-page report issued by Witness Media Lab goes in depth on the tools that stand a chance to push back against the threat of deepfakes, and blockchain is one of them. It’s not the first time this idea has been floated, but it’s certainly one of the more thoughtful explanations we’ve seen.
The general idea goes like this: images, videos, and audio can be cryptographically signed, geotagged, and timestamped to establish their origins. This kind of “verified capture” calls for applications to perform a number of checks, ensuring that transmitted data conforms with the source material. The media can be assigned a cryptographic hash based on the image or audio data it contains; comparing that source hash against the hash of a copy will quickly reveal whether the media has been manipulated.
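The hash-comparison step described above can be sketched in a few lines. This is a minimal illustration, not code from the report: it uses SHA-256 (a common choice, though the report doesn't specify an algorithm), and the byte strings stand in for real image or audio data.

```python
import hashlib

def media_hash(data: bytes) -> str:
    """Return a SHA-256 fingerprint of raw media bytes."""
    return hashlib.sha256(data).hexdigest()

# Stand-ins for real media files: the original capture, a faithful
# copy, and a version where even one byte has been altered.
original = b"pixel data from the original capture"
faithful_copy = b"pixel data from the original capture"
tampered = b"pixel data from the altered capture"

print(media_hash(original) == media_hash(faithful_copy))  # True
print(media_hash(original) == media_hash(tampered))       # False
```

Because cryptographic hashes change unpredictably with any modification to the input, even a single altered byte produces a completely different fingerprint, which is what makes the mismatch easy to detect.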
In other words, blockchain can verify source media against copycats or outright manipulations the same way it verifies crypto transactions. But this isn’t a totally bulletproof approach — it effectively calls for us to put trust in a technical system without considering its limits. The report cites media forensics expert Hany Farid saying that any finished blockchain solution for fighting deepfakes is still years away due to the complexity involved here.
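To make the tamper-evidence idea concrete, here is a hedged sketch of how a signed capture might be verified. It uses a shared-key HMAC purely for simplicity; a real verified-capture scheme of the kind the report describes would use asymmetric signatures so that anyone can verify without holding the signing key. The key and media bytes are hypothetical.

```python
import hashlib
import hmac

# Hypothetical secret held by the capture device. A real system would
# use an asymmetric key pair, not a shared secret like this.
CAPTURE_KEY = b"device-secret-key"

def sign_media(data: bytes) -> str:
    """Produce a tamper-evidence tag for raw media bytes at capture time."""
    return hmac.new(CAPTURE_KEY, data, hashlib.sha256).hexdigest()

def verify_media(data: bytes, tag: str) -> bool:
    """Check that the media matches the tag recorded at capture."""
    return hmac.compare_digest(sign_media(data), tag)

clip = b"frame bytes from the original capture"
tag = sign_media(clip)

print(verify_media(clip, tag))             # True: untampered
print(verify_media(clip + b"edit", tag))   # False: modified in transit
```

As with the supermarket packaging in the analogy below, a passing check says nothing about whether the content is *true*, only that it hasn't been altered since it was sealed.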
Blockchains are still vulnerable to sophisticated attacks against their governance structures, or the notorious 51% attacks that have caused problems for the crypto community in the past. But there’s enough promise here that people are taking note. We got in touch with Corin Faife, senior coordinator for Witness, who offered a useful analogy:
When we buy food from a supermarket, we generally expect it to be packaged in such a way that it can’t be tampered with — sealed plastic, stickers over the wrapper, and so on. That doesn’t guarantee the food will be perfect: it still might not taste good, and it can still spoil if it’s left out too long. But what you do know is that the package hasn’t been interfered with on its way to you.
Faife continued: “These authenticity measures propose something similar for video: it’s not an ultimate guarantee of truth and shouldn’t be taken as an endorsement of the content itself, but it does allow you to confirm that a media item hasn’t been tampered with on its way to you from the original point of capture.”
At a time when deepfake technology only seems to be improving, it’s good to know we still have tools in our figurative toolbelt for pushing back against it.