An interesting couple of days talking content authenticity and C2PA with news industry folks
=> More information about this toot | More toots from chrisneedham@w3c.social
@chrisneedham Would be good to catch up on that some time. We (as a team) ducked out a while back until things were further along - feels like that might be happening now(?)
=> More information about this toot | More toots from tdp_org@mastodon.social
@tdp_org Sounds good, yes. There's certainly growing interest, we had about 80 people from all over the world at the event yesterday and today
=> More information about this toot | More toots from chrisneedham@w3c.social
@chrisneedham Oh nice! Would be useful to catch up on the current state at some stage and also to get your steer on whether it's a good time for us to start thinking about how we'd integrate/enable it
=> More information about this toot | More toots from tdp_org@mastodon.social
@chrisneedham What's your impression of its usability and generality? I heard a rumor that it was being accidentally designed for only large entities to use, which would be a shame.
I imagined that it would have two modes: one where the hardware has a secure element that signs pictures (and we collectively distrust hardware that gets hacked), and a second where anyone can create a private key and assert modifications with it (and we aggregate and distrust folks who lie)?
=> More information about this toot | More toots from jyasskin@hachyderm.io
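To make the second mode concrete, here is a minimal sketch assuming self-generated Ed25519 keys and an invented JSON assertion; it is not the real C2PA wire format (which uses COSE signatures carried in a JUMBF container), just the shape of "anyone can mint a key and sign their own claims".

```python
# Illustrative only: a self-generated key asserting an edit, not the real C2PA format.
# Requires the `cryptography` package (pip install cryptography).
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Mode 2: anyone can mint a key pair and sign their own assertions.
key = Ed25519PrivateKey.generate()

assertion = json.dumps({
    "action": "cropped",               # illustrative action name
    "claimed_by": "alice@example.org", # self-asserted identity
}, sort_keys=True).encode()

signature = key.sign(assertion)

# A verifier can check the signature, but whether to trust the *key* is a
# separate governance question (see the CA discussion later in the thread).
try:
    key.public_key().verify(signature, assertion)
    print("signature verifies; trusting this key is a separate decision")
except InvalidSignature:
    print("signature invalid")
```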
@chrisneedham In (at least) the second mode, we'd need at least one governance body (kinda like a certificate authority, maybe) to help everyone decide whether to trust the signing keys.
[#]KnowledgeCommons...
=> More information about this toot | More toots from jyasskin@hachyderm.io
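A rough sketch of what such a governance body's output could look like from a verifier's point of view: an aggregated trust list keyed by public-key fingerprint. The list format, statuses, and fingerprinting scheme here are assumptions for illustration, not anything taken from the C2PA or CAWG specs.

```python
# Illustrative sketch: a verifier consulting an aggregated trust list.
# The list format and fingerprint scheme are assumptions, not from any spec.
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

# Hypothetical published list from a governance body: fingerprint -> status.
TRUST_LIST = {
    # placeholder fingerprints; a real list would carry full SHA-256 digests
    "3f5a...": "trusted",   # e.g. a vetted news organisation
    "9c0d...": "revoked",   # e.g. a key known to have signed fabricated edits
}

def fingerprint(pub: Ed25519PublicKey) -> str:
    raw = pub.public_bytes(Encoding.Raw, PublicFormat.Raw)
    return hashlib.sha256(raw).hexdigest()

def trust_status(pub: Ed25519PublicKey) -> str:
    # Unknown keys aren't "invalid"; they're just unvouched-for.
    return TRUST_LIST.get(fingerprint(pub), "unknown")
```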
@jyasskin There are two main strands to it. One is C2PA itself, which is about signed metadata that's bound to content (images, video, audio, anything with a container format), where the signing is done by the hardware or software used to capture and edit. This creates a chain of metadata recording how that content got created, what edits were made, etc. It's not about saying what's "true" or not, just verifiably how it came to be.
=> More information about this toot | More toots from chrisneedham@w3c.social
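The core of that binding can be sketched as a hash of the content bytes committed inside the signed metadata, so the claim can't be transplanted onto different content. The field names below are assumptions; real C2PA manifests are embedded in the file's container (JUMBF) and signed with COSE.

```python
# Concept sketch of a "hard binding": the signed manifest commits to a hash of
# the content bytes, so the metadata can't be moved onto other content.
import hashlib
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def make_manifest(content: bytes, actions: list[str], key: Ed25519PrivateKey):
    # Field names are invented for illustration.
    manifest = {
        "content_sha256": hashlib.sha256(content).hexdigest(),
        "actions": actions,  # e.g. ["captured", "cropped"]
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    return manifest, key.sign(payload)

def binding_intact(content: bytes, manifest: dict) -> bool:
    # Signature checking aside, the hash ties the claim to these exact bytes.
    return manifest["content_sha256"] == hashlib.sha256(content).hexdigest()
```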
@jyasskin Now this in itself brings risks that will need to be worked through, but I understand there's a redaction feature that allows sensitive details to be removed.
=> More information about this toot | More toots from chrisneedham@w3c.social
@jyasskin The second part, which a related group called the Creator Assertions WG is working on, is around "identity", i.e., who created and published the media. IPTC (the news industry standards body) is currently looking to set up a CA that can issue certs to news media orgs (presently distinct from the CA for device/editing software certs).
=> More information about this toot | More toots from chrisneedham@w3c.social
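On the verifier side, that identity check boils down to: does the signer's certificate trace back to a CA root I've chosen to trust? A minimal sketch follows, assuming an RSA CA key and hypothetical file names, with full chain building, expiry, and revocation checks omitted.

```python
# Sketch of the identity check: was this signing cert issued by the
# (hypothetical) news-media CA that the verifier trusts?
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import padding
from cryptography.exceptions import InvalidSignature

def issued_by(leaf: x509.Certificate, ca: x509.Certificate) -> bool:
    if leaf.issuer != ca.subject:
        return False
    try:
        # Assumes the CA uses an RSA key with PKCS#1 v1.5 signatures.
        ca.public_key().verify(
            leaf.signature,
            leaf.tbs_certificate_bytes,
            padding.PKCS1v15(),
            leaf.signature_hash_algorithm,
        )
        return True
    except InvalidSignature:
        return False

# Usage (file names are assumptions):
# ca = x509.load_pem_x509_certificate(open("news-media-ca.pem", "rb").read())
# leaf = x509.load_pem_x509_certificate(open("org-signing-cert.pem", "rb").read())
# print(issued_by(leaf, ca))
```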
@jyasskin The BBC use case involves the latter much more than the former. It wants to be able to show "this content genuinely comes from us", regardless of where you see it (on social media, etc), because there's lots of misleading content out there using our branding etc.
=> More information about this toot | More toots from chrisneedham@w3c.social
@chrisneedham I stumbled over "signing is done by the... software". Software can't sign anything in a trustworthy way; only the entity running the software can sign things. Is there an assumption here that only cloud services can be trusted in this ecosystem, and where does that leave people who edit photos on their own hardware?
=> More information about this toot | More toots from jyasskin@hachyderm.io
@jyasskin AIUI, with C2PA images can be signed by a camera, or maybe a phone camera app if running on "trusted" hardware / a trusted platform. I don't think it relies on limiting signing to cloud services. Similarly, a native editor app could include its signing cert. But ... reverse engineering, and how does it work with OSS tools??
=> More information about this toot | More toots from chrisneedham@w3c.social
@jyasskin The community involved seem genuinely open to hearing constructive feedback and working to address concerns.
=> More information about this toot | More toots from chrisneedham@w3c.social
@jyasskin One of the uses we see already is AI image generators using it to label their content. Also https://petapixel.com/2024/10/16/youtubes-new-content-credentials-point-toward-a-more-transparent-future-for-video-content/ (edited to include a better link)
=> More information about this toot | More toots from chrisneedham@w3c.social
@chrisneedham A native editor app cannot include its signing cert when distributed to customer machines, even if it's closed source. :) If that signing cert is trusted, and the software has restrictions that a malicious user might want to get around, the signing cert will quickly be extracted and used to sign content generated without those restrictions.
If the WG thinks they're going to safely distribute signing keys in software, we need to get them some security review ASAP.
=> More information about this toot | More toots from jyasskin@hachyderm.io
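A small demonstration of that point, with assumed names: a key extracted from an app binary produces signatures indistinguishable from the app's own, so "this software signed it" isn't a property a verifier can actually check.

```python
# Signatures only prove possession of the key, not which program produced them.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives.serialization import (
    Encoding, PrivateFormat, NoEncryption,
)

# The key an editor app would have to ship inside its binary...
app_key = Ed25519PrivateKey.generate()
key_bytes = app_key.private_bytes(Encoding.Raw, PrivateFormat.Raw, NoEncryption())

# ...is just bytes, so anyone who extracts them holds the same signing power.
extracted = Ed25519PrivateKey.from_private_bytes(key_bytes)
message = b"content the app's restrictions would never allow"
forged = extracted.sign(message)

# Verifies exactly like an "official" signature from the app.
app_key.public_key().verify(forged, message)
print("forged signature verifies under the app's trusted key")
```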
@jyasskin Right... So I don't know the details, but I fully agree with the need for review
=> More information about this toot | More toots from chrisneedham@w3c.social