If Deepfakes are Goliath, Saito is David
If deception is a beast, temporal commitments are your sword.
Machine learning produces data of some kind by first processing massive amounts of similar, existing information and tuning itself against it - it is, in a way, the ultimate deceiver.
The deepfake is innate to the way the process works: the ability to mimic the vast input data emerges from that same vast and repetitive statistical machinery. Modified voice and video are nearing a point of no return, and soon everything humans are digitally exposed to will be burdened by a valid question: is this real, or is this altered?
Trust in online sources, where new identities are created at will, is already tenuous, and it will soon be strained further by altered yet convincing footage, audio, and text. If nothing is done, no content will be discernible from fakery on its own. There is an argument to be made that this enhanced ability to deceive will set knowledge distribution back to before the internet even existed: any existing video will be subject to modification that cleverly manipulates, divides, and sows distrust. Access to this technology, still in its early years, is already scaled and sold at affordable rates; for the sake of all media currently in existence, equally strong conservation technology must be employed.
Detecting deepfakes is not impossible, but the approach cannot scale - it is an expensive arms race. Rather than detecting fakes, which are numerous and shifting, the smaller body of original content must provide evidence of its own authenticity. It has been possible for decades to authenticate content with public keys, but when someone strips the original signature, modifies the content, and adds their own signature, the history becomes muddled. Not only that, but the identities bound to public keys most often rely on trust in public-key infrastructures, which have already proven attractive targets for, and submissive to, power.
What's required is a permanent record marking chronological priority on data: a universal timestamping service. A timestamp proves the data it references is at least as old as the stamp itself. Records of any kind may prove their place in history by leveraging the authority of their timestamps. Original, authentic content will always be able to produce an older timestamp than any fake or modification derived from it. Tie that to a public-key system which cannot be censored or compromised, and what comes out of the oven is a blockchain. Blockchain is the most secure digital timestamping service humans have yet invented, but a growing permanent record with a seemingly inherent compromise between volume, affordability, and censorship resistance is not suitable to protect everyone concerned about their reputation and legacy - unless that record medium is the Saito blockchain.
Saito Advantages
Saito has two main properties that make it the best option for a universally accessible timestamping service, one which endures without intervention:
First: the same money that funds security is also the money that funds scale; no compromise is made between the two, meaning the blockchain has no inherent constraint on scale or security - they grow together. Traditional blockchain consensus makes an explicit compromise here, but a timestamping service with the stakes and scale called for to preserve human history cannot falter in accessibility or trust.
And second: the Saito blockchain is capable of removing data, which lowers the cost of publishing it on-chain in the first place. This may seem like a problem, since timestamps need to be permanent, but removed blocks can be verified as having existed, in the correct order, by archiving the much smaller, constant-sized block headers rather than all the data within a block. Only the Merkle branch containing the transaction that holds the timestamp of interest need be preserved by the creator of the stamp, and only that small chain of block headers need be stored by anyone wishing to verify a stamp from any point in the chain's history. This allows massive amounts of data to be included without sustainability concerns. The verification logic is sketched below.
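As a rough illustration, here is a minimal Python sketch of that verification, assuming a simplified header layout in which each header stores the hash of its predecessor plus a Merkle root; Saito's actual header format and hashing scheme differ in detail.

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def verify_merkle_branch(tx_hash: bytes, branch: list, merkle_root: bytes) -> bool:
    """Fold a Merkle branch from leaf to root. Each step supplies the sibling
    hash and which side it sits on; the result must match the root stored in
    the tiny, permanently archived block header."""
    node = tx_hash
    for sibling, side in branch:
        node = sha256(sibling + node) if side == "left" else sha256(node + sibling)
    return node == merkle_root

def verify_header_chain(headers: list) -> bool:
    """Each archived header must commit to the hash of its predecessor, fixing
    the block (and its Merkle root) at a point in history even after the
    block's full data has been pruned."""
    return all(cur["prev_hash"] == sha256(prev["raw"])
               for prev, cur in zip(headers, headers[1:]))

# Tiny demo: a four-transaction block, proving the third transaction.
leaves = [sha256(bytes([i])) for i in range(4)]
pair01, pair23 = sha256(leaves[0] + leaves[1]), sha256(leaves[2] + leaves[3])
root = sha256(pair01 + pair23)
assert verify_merkle_branch(leaves[2], [(leaves[3], "right"), (pair01, "left")], root)
```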
Combating Deepfakes
Armed with a public key and a small Saito balance, anyone may secure data's place in history by including its hash in the data field of a transaction. Should any content be produced as an alteration directly derivative of that media, it will never be able to produce a timestamp older than the original data's. The Merkle branch holding that transaction is archived alongside the data of interest, and together they prove, against the header chain, their place in history.
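A minimal sketch of constructing such a timestamp - the payload layout here is hypothetical, not Saito's wire format; the essential point is that only the content's hash rides on-chain, never the media itself:

```python
import hashlib, json

# Hash the content to be timestamped; real use would read the original file bytes.
content = b"original media bytes"  # placeholder for the file's contents
content_hash = hashlib.sha256(content).hexdigest()

# Hypothetical transaction payload: field names are illustrative only.
timestamp_tx = {
    "to": "<recipient or self>",
    "fee": "<small Saito fee>",
    "data": {"type": "timestamp", "sha256": content_hash},
}
print(json.dumps(timestamp_tx, indent=2))
```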
In addition to the temporal authentication, the public-key aspect of blockchain provides secure identification - a key which consistently signs the earliest data tied to some real-life personhood gains authority as accurately representing that person in the digital realm - which deepfakes, again, will be unable to replicate. And of course deepfakes are only the most sophisticated method of digital deceit - when data is de-materialized it loses its worldly proof of age, and any digital data may be altered arbitrarily with no evidence left behind, not even in the original document. Saito will serve as the fossil record for history immaterial.
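The signing half of that identity is ordinary public-key cryptography; a minimal sketch using Ed25519 (an illustrative choice of signature scheme, not a claim about Saito's internals):

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# A long-lived key that consistently signs someone's earliest content
# accumulates authority as their digital identity.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

statement = b"sha256 digest of the content being claimed"  # placeholder payload
signature = private_key.sign(statement)

# Anyone holding the public key can check the claim; a forger can produce
# neither a valid signature nor (per the above) an older timestamp.
public_key.verify(signature, statement)  # raises InvalidSignature on forgery
```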
Extras:
Technical and Economic Details:
The cost of storing timestamp Merkle branches is very economical. To prove data existed in any block, one personally stores a proof exponentially smaller than the block itself and verifies it against the chain of block headers, which are the smallest representation a block can have that still verifies its order in the chain. The arithmetic follows:
The space required to store the minuscule block headers grows at a constant, linear rate as blocks are added, no matter how much data is in each block, and the space required for an individual user to hold the Merkle branch verifying their timestamp grows with log_2 of the number of transactions in that block. If one million transactions are recorded in a single block, each user need only hold 20 hashes to verify their timestamp exists within it. One trillion transactions per block requires a Merkle branch of only 40 hashes. Even if everyone on Earth timestamps hundreds of times each day, the space required to hold verification information for those timestamps long into the future remains manageable by even weak mobile devices. How fast hardware can produce and distribute blocks will be the first constraining factor, at which point similarly prolific timestamping scale may be kept by placing slightly more storage burden on users.
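That arithmetic, worked out directly (32-byte hashes are an assumption about digest size, not a Saito specification):

```python
import math

HASH_BYTES = 32  # assumed digest size

def branch_hashes(txs_per_block: int) -> int:
    """Sibling hashes needed in a Merkle branch for a block of this size."""
    return math.ceil(math.log2(txs_per_block))

for n in (1_000_000, 10**12):
    h = branch_hashes(n)
    print(f"{n:>16,} txs/block -> {h} hashes ({h * HASH_BYTES} bytes per proof)")
# 1,000,000 txs/block -> 20 hashes (640 bytes per proof)
# 1,000,000,000,000 txs/block -> 40 hashes (1280 bytes per proof)
```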
Other Timestamp Use Cases
Intellectual Priority / Artistic Provenance
Prove ownership of an idea or creation - timestamp it before publicizing it. Those attempting to take credit for replications will not be able to produce an older timestamp than yours.
Media is often reformatted as it travels across the internet. Digitally signing media is not by itself effective when that signature can be stripped, or when it is rendered invalid by any file transformation or compression. Instead, the hash of the original file can be posted to the blockchain, and the author may reference its likeness and history, as sketched below.
Though it may seem trite, this can also serve to credit the authors of the most influential and transient content on the internet: memes. Created and recreated constantly, their original authors often have no way to convincingly credit themselves (even directly after publishing) for what may become an important cultural touchstone.
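A minimal sketch of producing that fingerprint (the filename is hypothetical). Note that any recompressed or reformatted copy hashes differently - which is exactly why the author keeps the original file, to match the on-chain digest:

```python
import hashlib

def file_fingerprint(path: str) -> str:
    """Hash the original file in chunks; the hex digest is what gets timestamped."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

print(file_fingerprint("original_artwork.png"))  # hypothetical file
```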
Secret Commitments
A message not ready for this world, or a prediction to be kept secret: timestamp the data, but do not reveal it. Classified documents, rightful opinions which might ruin a reputation, or predictions with too much at stake may all be committed to, yet not revealed until the time is right. The authority of a timestamp ensures these messages existed no later than the moment they were stamped.
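The standard pattern here is a salted hash commitment - a general cryptographic construction, not a Saito-specific format. A minimal sketch:

```python
import hashlib, secrets

# Commit: publish (and timestamp) only the digest; keep message and nonce private.
message = b"my prediction, sealed until the time is right"
nonce = secrets.token_bytes(32)  # salting prevents guessing short messages from the digest
commitment = hashlib.sha256(nonce + message).hexdigest()
# ... timestamp `commitment` on-chain ...

# Reveal: publish message and nonce; anyone can recompute and match the digest.
assert hashlib.sha256(nonce + message).hexdigest() == commitment
```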
Supply Chain
Supply chains were a flagship theoretical blockchain use case for quite some time, but the idea never truly materialized. The premise: warehouses using keys tied to packages can prove they received them, and thus take provable responsibility for them until the next signer takes possession. Critics have had the last laugh, arguing that a blockchain is nothing more than a slow database, and that any use case built on one is better served by a traditional database.
Of course, a shared record between parties with misaligned interests proves that theory wrong - but the true reason blockchain never rose to the occasion was scaling. There have been numerous scaling solutions which ignore the base-layer problems and allow some convoluted increase in transaction or computation throughput - but nothing until Saito has increased scale on the base layer, or equivalently, reduced the cost of posting data on-chain without compromising on security.
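For completeness, a minimal sketch of the custody premise above - hypothetical package IDs and an illustrative Ed25519 scheme, with each signed receipt being what would be timestamped on-chain:

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Hypothetical two-party handoff: each custodian signs the package ID plus the
# previous receipt, forming a verifiable chain of custody.
warehouse_key = Ed25519PrivateKey.generate()
courier_key = Ed25519PrivateKey.generate()

package_id = b"package-7f3a"  # hypothetical identifier
receipt_1 = warehouse_key.sign(package_id)             # warehouse accepts responsibility
receipt_2 = courier_key.sign(package_id + receipt_1)   # courier accepts it next

# Anyone holding the public keys can verify who held the package, and in what order.
warehouse_key.public_key().verify(receipt_1, package_id)
courier_key.public_key().verify(receipt_2, package_id + receipt_1)
```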