There are a variety of interesting techniques for detecting altered images, including frames from videos, such as statistical analyses of discontinuities in the brightness, tone, and other pixel-level features. The problem is that any tool smart enough to identify the telltale features of a fake can probably also be used to erase those features and make a better fake. The result is an arms race between fakers and fake detectors that makes it hard to know whether an image has been maliciously tampered with. Some have predicted that efforts to identify AI-generated material by analyzing its content are doomed. This has led to a number of efforts to prove the authenticity of digital media by another route: cryptography. In particular, many of these efforts are built on a concept called "digital signatures."
If you take a digital file -- a photograph, video, book, or other piece of data -- and digitally process or "sign" it with a secret cryptographic "key," the output is a very large number that represents a digital signature. If you change a single bit in the file, the digital signature is invalidated. That is a powerful technique, because it lets you prove that two documents are identical -- or not -- down to every last one or zero, even in a file that has billions of bits, like a video.
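The "change a single bit and the signature breaks" property is easy to see in miniature. The sketch below uses a keyed hash (HMAC) from Python's standard library rather than a true public-key signature, but the tamper-evidence behavior is the same; the key value is purely illustrative.

```python
import hashlib
import hmac

# Hypothetical secret key; a real camera would hold this in secure hardware.
signing_key = b"example-secret-key"

original = b"pixel data of the original video file..."
tampered = bytearray(original)
tampered[0] ^= 0x01  # flip a single bit

def sign(data: bytes) -> str:
    # Produce a fixed-size tag that depends on every bit of the input.
    return hmac.new(signing_key, data, hashlib.sha256).hexdigest()

sig = sign(original)
print(sign(original) == sig)         # True: the file is unchanged
print(sign(bytes(tampered)) == sig)  # False: one flipped bit invalidates it
```

Because the tag depends on every bit of the input, comparing tags is equivalent to comparing the two multi-gigabyte files bit for bit.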
Under what is known as public key cryptography, the secret "signing key" used to sign the file has a mathematically linked "verification key" that the manufacturer publishes. That verification key only matches signatures made with the corresponding signing key, so if the signature is valid, the verifier knows with ironclad mathematical certainty that the file was signed with the camera manufacturer's signing key, and that not a single bit has been changed.
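The split between a secret signing key and a public verification key can be sketched with textbook RSA. This is a toy, wildly insecure without proper padding and much larger primes, but it shows the essential asymmetry: signing uses the secret exponent, verifying uses only the published one.

```python
import hashlib

# Toy textbook RSA -- illustrative only, not secure.
p = 2305843009213693951  # 2**61 - 1, a Mersenne prime
q = 2147483647           # 2**31 - 1, a Mersenne prime
n = p * q                # public modulus
e = 65537                # public verification exponent
d = pow(e, -1, (p - 1) * (q - 1))  # secret signing exponent

def digest(data: bytes) -> int:
    # Reduce the file to a number smaller than n (a real scheme pads properly).
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % n

def sign(data: bytes) -> int:
    # Requires the secret key d -- only the signer can do this.
    return pow(digest(data), d, n)

def verify(data: bytes, signature: int) -> bool:
    # Requires only the public key (e, n) -- anyone can do this.
    return pow(signature, e, n) == digest(data)

video = b"raw bytes of the recorded video"
sig = sign(video)
print(verify(video, sig))         # True: valid signature, untouched file
print(verify(video + b"!", sig))  # False: any change breaks verification
```

Anyone holding the public pair `(e, n)` can check a signature, but forging one would require recovering `d`, which in a real deployment means factoring a modulus thousands of bits long.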
Given these techniques, many people have thought that if you can just digitally sign a photo or video when it's taken (ideally in the camera itself) and store the digital signature somewhere it can't be lost or erased, like a blockchain, then you can later prove that the imagery hasn't been tampered with since it was created. Proponents want to extend these systems beyond cameras to editing software, so that if someone adjusts an image using a photo or video editor, the file's provenance is retained along with a record of whatever changes were made to the original, provided "secure" software was used to make those changes.
For example, suppose you are standing on a corner and see a police officer using force against someone. You take out your camera and begin recording. When the video is complete, the file is digitally signed using the secret signing key embedded deep within your camera's chips by its manufacturer. You then go home and, before posting the video online, use software to edit out a part that identifies you. The maker of the video editing software likewise has an embedded secret key that it uses to record the editing steps you made, embed them in the file, and digitally sign the new file. Later, under this scheme, someone who sees your video online can use the manufacturers' public verification keys to prove that the video came straight from the camera and wasn't altered in any way except for the editing steps you made. If the digital signatures were also posted in a tamper-resistant public record like a blockchain, you might additionally be able to prove that the file existed no later than the moment the signatures entered that record.
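The edit-provenance idea in the scenario above can be sketched as a record that links the file's state before and after each edit by hash. The field names and edit description here are hypothetical; real provenance schemes additionally sign each record with the editor's secret key, as described above.

```python
import hashlib

def h(data: bytes) -> str:
    # Content fingerprint: changes if any bit of the file changes.
    return hashlib.sha256(data).hexdigest()

original = b"raw camera footage"
edited = b"raw camera footage, with identifying frames removed"

# Hypothetical provenance record embedded in the edited file.
record = {
    "parent_hash": h(original),             # state before the edit
    "operation": "trim frames 1040-1100",   # hypothetical edit description
    "result_hash": h(edited),               # state after the edit
}

def chain_is_consistent(source: bytes, final: bytes, rec: dict) -> bool:
    # A verifier checks that the record links the published file
    # back to the originally signed camera output.
    return rec["parent_hash"] == h(source) and rec["result_hash"] == h(final)

print(chain_is_consistent(original, edited, record))         # True
print(chain_is_consistent(original, edited + b"x", record))  # False
```

A longer editing history is just a chain of such records, each one's `parent_hash` matching the previous record's `result_hash`, so a verifier can replay the whole history from camera output to published file.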