Cameras don’t cryptographically sign the images they take. Even if that feature were added, there are billions of cameras already in use that don’t support signing images. Also, any editing, resizing, or re-encoding would invalidate the signature, and almost no one posts pictures to the web without some sort of editing. Embedding 10+ MB original images in a web page isn’t practical either.
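To illustrate why editing breaks a signature: a signature is computed over the exact bytes of the file, so changing even one byte makes verification fail. Here’s a minimal sketch using an HMAC as a stand-in for a camera’s signing key (real schemes like C2PA use public-key signatures, but the fragility is the same):

```python
# Sketch only, not the C2PA protocol: CAMERA_KEY is a hypothetical
# per-device secret standing in for a camera's private signing key.
import hashlib
import hmac

CAMERA_KEY = b"per-device secret"

def sign(image: bytes) -> bytes:
    # The camera signs the exact bytes it captured.
    return hmac.new(CAMERA_KEY, image, hashlib.sha256).digest()

def verify(image: bytes, sig: bytes) -> bool:
    return hmac.compare_digest(sign(image), sig)

original = b"...raw sensor data..."
sig = sign(original)
print(verify(original, sig))   # untouched bytes verify

# Any resize, crop, or re-encode changes the bytes, so the
# signature computed by the camera no longer matches.
edited = original.replace(b"raw", b"edited")
print(verify(edited, sig))     # verification fails
```

This is why provenance systems attach a new signed manifest at each editing step rather than expecting the original camera signature to survive.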
We aren’t talking about current cameras. We are talking about the proposed plan to make cameras that do cryptographically sign the images they take.
Here’s the link from the start of the thread:
https://arstechnica.com/information-technology/2024/09/google-seeks-authenticity-in-the-age-of-ai-with-new-content-labeling-system
This system is specifically mentioned in the original post (https://www.seroundtable.com/google-search-image-labels-ai-edited-38082.html) where they say “C2PA”.