Seems to me that a camera like this is necessarily, at least in part, a closed system that blocks you from controlling the software or hardware on the device you supposedly own. It's hard for me to think this is a good direction. And as others have pointed out, it can't prevent attacks through the analog hole, e.g. photographing a display.
It's not feasible or desirable for our hardware devices to verify the information they record autonomously. A real solution to the problem of attribution in the age of AI must be based on reputation. People should be able to vouch for information in verifiable ways with consequences for being untrustworthy.
> camera like this is necessarily, at least in part, a closed system that blocks you from controlling the software or hardware on the device you supposedly own
Attestation systems are not inherently in conflict with repurposability. If they let you install user firmware, the device simply won't produce attestations linked to their signed builds (assuming you retain any of that functionality at all). If you want attestations under their key instead of yours, you just reinstall their signed OS; the HSM boot attests to whoever's OS signature it finds using its unique hardware key, and everything works fine (even in a dual-boot scenario).
What this does do is prevent you from altering their integrity-attested operating system to misrepresent that photos were taken by their operating system. You can, technically, mod it all you want — you just won’t have their signature on the attestation, because you had to sign it with some sort of key to boot it, and certainly that won’t be theirs.
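The measured-boot idea above can be sketched in a few lines. This is a toy model, not any real vendor's scheme: the hash-based "signature" stands in for an asymmetric signature produced inside a secure element, and every key and name here is hypothetical. The point it illustrates is that a modded OS still gets a valid device attestation, just over a different OS measurement than the vendor's signed build.

```python
import hashlib

# Toy stand-in for an HSM signature. A real device would use an
# asymmetric scheme (e.g. ECDSA) with the key locked inside hardware.
def toy_sign(key: bytes, message: bytes) -> bytes:
    return hashlib.sha256(key + message).digest()

DEVICE_KEY = b"unique-per-device-hardware-key"  # hypothetical, fused at manufacture

def boot_and_attest(os_image: bytes, photo: bytes) -> dict:
    """Measured boot: the attestation binds the photo to whatever
    OS image was booted, vendor-signed or user-signed."""
    os_measurement = hashlib.sha256(os_image).hexdigest()
    photo_hash = hashlib.sha256(photo).hexdigest()
    payload = (os_measurement + photo_hash).encode()
    return {
        "os_measurement": os_measurement,
        "photo_hash": photo_hash,
        "signature": toy_sign(DEVICE_KEY, payload).hex(),
    }

vendor_att = boot_and_attest(b"vendor-signed OS build", b"raw sensor data")
modded_att = boot_and_attest(b"my modded OS build", b"raw sensor data")

# Same photo, same device key -- but the attestations name different
# OS measurements. A verifier that only trusts the vendor's build hash
# accepts the first and rejects the second; neither is forged.
assert vendor_att["photo_hash"] == modded_att["photo_hash"]
assert vendor_att["os_measurement"] != modded_att["os_measurement"]
```

The modded build isn't locked out; it just can't claim to be the vendor's build.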
They could even release their source code under BSD, GPL, or AGPL and it would make no difference to any of this; no open source license compels disclosing the private keys you signed your build with, and any argument for reading that into a license would be radioactive for it. Can you imagine trying to explain to your legal team that you can't extract a private key from an HSM to comply with a license? So it's never going to happen: open source is about releasing code, not about letting you pass off your own work as someone else's.
> must be based on reputation
But it is already. By example:
Is this vendor trusted in a court of law? Probably; I would imagine it would stand up to the court's inspection, and given their motivations they no doubt have an excellent paper trail.
Are your personal attestations, those generated by your modded camera, trusted by a court of law? Well, that's an interesting question. Did you create a fully reproducible build pipeline so that the court can inspect your customizations and decide whether to trust them? Did you keep a record of your changes and the signatures of your builds? Are you willing to provide your source code and build process to the court?
So your desire for reputation is already satisfied, assuming they allow OS modding. If they do not, that's a voluntary business decision, not a mandatory technical one! Nothing about cryptography or reputation justifies locking users out of repurposing their devices.
> A real solution to the problem of attribution in the age of AI must be based on reputation
This is actually one of Eliezer Yudkowsky's theoretical predictions: as information becomes less and less verifiable, we're going to need to re-enter a pre-information era, where people will have to know and trust the sources of important information they encounter, in some cases needing to hear it first-hand or in person.
It's not enough that the photograph is signed and has metadata. Someone has to interpret that metadata to decide whether it's authentic. One can have an "authentic" photo of a rear-projection screen; it wouldn't be appropriate to put an "authentic" checkmark next to that photo if it claims not to be a photo of a rear-projection screen. Context matters to authenticity.
Secondly, the existence of such "authentic" photos will be used to call all non-authenticated photos into doubt.
So it doesn't really solve the problem; it creates new ones.
Yes, that might make these fake-proof cameras popular, to the point where people start putting in the necessary effort to defeat them by monkeying around with the time server, the depth sensor, and the GPS signal. Then you get a really well-supported fake image that's all the more effective because it's authenticated.
Practically, I think there are situations where it is not so black and white, like camera footage used as evidence in a court case. Signing a video with a private key (verifiable against a published public key) would give some way to verify the source and chain of custody. Why wouldn't you in that situation? At a minimum it makes tampering harder and weakens false claims that something has been tampered with.
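The chain-of-custody part can be made concrete with a hash chain: each custody record commits to the one before it, so altering any earlier entry invalidates everything after it. This is a minimal sketch, not a real evidence-management system; the handler names and record fields are invented for illustration.

```python
import hashlib
import json

def add_custody_entry(chain: list, handler: str, action: str) -> list:
    """Append a record whose hash covers the previous record's hash,
    so later tampering breaks every subsequent link."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {"handler": handler, "action": action, "prev": prev_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return chain + [record]

def verify_chain(chain: list) -> bool:
    """Recompute every link; any edited record or broken back-pointer fails."""
    prev = "0" * 64
    for record in chain:
        body = {k: v for k, v in record.items() if k != "hash"}
        if body["prev"] != prev:
            return False
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if expected != record["hash"]:
            return False
        prev = record["hash"]
    return True

log = []
log = add_custody_entry(log, "camera", "recorded video")
log = add_custody_entry(log, "officer_a", "copied to evidence server")
assert verify_chain(log)

# Rewriting history without recomputing hashes is detectable.
tampered = [dict(log[0], action="recorded different video"), log[1]]
assert not verify_chain(tampered)
```

In practice each record would also carry a signature from the handler's key, which is what ties the chain to reputations with consequences.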
I don't think reputation alone gets you that far; we already live in a world where misinformation spreads like wildfire through follower counts and page ranks.
The problem is quality takes time, and therefore loses relevance.
We need a way to break people out of their own human nature and reward delayed gratification by teaching critical thinking skills and promoting thoughtfulness.
I sadly don't see an exciting technological solution here. If anything it's tweaks to the funding models that control the interests of businesses like Instagram, Reddit, etc.
Why can't posting a verifiably true image create as much or more instant gratification as sending a fake one? It will probably be more gratifying, once everyone is sending fake ones and yours is the only real one (if people can know that).
Sure, but you were asking why truth is less gratifying.
Also, "truth" is clearly something that requires more resources. It is a lifelong endeavour of art/science/learning. You can certainly luck into it on occasion but most of us never will. And often something fictional can project truth better than evidence or analysis ever can. Almost everything turns into an abstraction.
No, that is a nihilistic belief which does not get buildings built or software written.
One may "luck" into truth by being born in a poor neighborhood or by living in a warzone, and having eyes and a camera. Or by being rich and invited to a club and having a microphone.
Truth is everywhere, but capturing it is expensive. The tax on truth is the easy spread and generation of lies. The idea that the fictional can encapsulate truth is of course true, but it doesn't mean everything is better as an abstraction. Losing a leg is more powerful as a reality than as an abstraction. Peddlers of falsehoods, then, only win when truth can be abstracted.
Moreover: People who read literature read it knowing it stands in for truth. People who watch TikTok believe it is true, and are disenchanted when shown otherwise. More power resides in a grain of truth than a mountain of falsehood; so any tool for proving veracity will always have an outsized value against tools for generating fakes.
The last redoubt of propagandists when faced with the threat of truth is to claim that no one cares anymore what's true. But that's false. In fact, that's when they begin to fool themselves. It's not that no one in China or Russia values the truth, for instance. It's just that they say what they're told to say, and don't believe a word of it.
We do not need "proof". We lived without it, and we'll live without it again.
I grew up before broadband - we survived without photographing every moment, too. It was actually kind of nice. Social media is the real fluke of our era, not image generation.
And hypothetically, if these cryptographic "non-AI really super serious real" verification systems do become in vogue, what happens if quantum computers break the underlying cryptography? What then?
You don't even need to beat all of crypto, just the signing algorithm. I'm sure that's going to happen all the time with such systems, and then none of the data can be "trusted" anyway.
I'm stretching a bit here, but this feels like "NFTs for life's moments". Designed just to appease the haters.
You aren't going to need this stuff. Life will continue.
This worked because we also used to have significantly better and more trustworthy news organisations that you could just trust did the original research and verified the facts. Now they just copy stories off Reddit and make up their own lies.
Back to the time before photographs then - the 1800s.
Crime scene photographs won't be evidence anymore. You photograph your flat (apartment) when you move in to prove that all the marks on the walls were already there, and that won't be evidence anymore. The police mistreat you, but your video of it won't be evidence either. And so on.