A bride-to-be discovers a reality-bending mistake in Apple's computational photography

☆ Yσɠƚԋσʂ ☆@lemmy.ml to Technology@lemmy.ml – -1 points –
appleinsider.com


edit: since it wasn't obvious to readers, this is a hypothetical of a techno-dystopian future...

Imagine taking a selfie only to see an image of you holding a knife. But there are no knives in your hands. Another snap. The same image displays on the screen, but there's a person of particular importance in the background. You turn your head, but you are all alone. Nobody is around. You're starting to freak out. Are you being pranked? Maybe your phone has been hacked. Another shutter sound effect, and you see an image of yourself standing over a victim. You frantically open your camera's gallery, thinking your eyes are fooling you, but the photos are the same. And they've been sent to the cloud. Deleting isn't allowed; the AI detected felonious imagery. You've been reported to multiple agencies. You are alone. There are no knives in your hands.

What? Again, the images are real. The phone just takes around 50 shots and uses the best frames, so movement can happen in between. It's not using AI to turn the photo into what it thinks the image should be. It's just stitching together a bunch of frames captured at basically the same time (within 1-2 seconds) into one picture.
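To make the "best frame from a burst" idea concrete, here's a minimal sketch of how a camera pipeline might score frames and keep the sharpest one. This is purely illustrative, not Apple's actual algorithm; the function names and the Laplacian-variance sharpness metric are assumptions, just one common way to rank focus quality:

```python
import numpy as np

def sharpness(frame):
    # Approximate a Laplacian with finite differences and use its
    # variance as a focus score (higher = sharper, more fine detail).
    lap = (np.roll(frame, 1, 0) + np.roll(frame, -1, 0)
           + np.roll(frame, 1, 1) + np.roll(frame, -1, 1)
           - 4 * frame)
    return lap.var()

def pick_best_frame(burst):
    # Return the frame with the highest sharpness score from a burst
    # of grayscale frames (2D float arrays).
    return max(burst, key=sharpness)

# Toy burst: four flat "blurry" frames plus one detailed frame.
rng = np.random.default_rng(0)
blurry = [np.full((64, 64), 0.5) for _ in range(4)]
sharp = rng.random((64, 64))
best = pick_best_frame(blurry + [sharp])
```

A real pipeline would also align the frames and blend regions from several of them (which is exactly why a moving subject can end up looking different in different parts of one "photo"), but the selection step is the same idea: score, then keep the winner.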

I was taking the comment thread (about how dangerous this could be for photographic evidence) a step further by imagining a hypothetical techno-dystopian future where corporate-controlled AI alters photos to make them look better but, in reality, creates a back door through which incriminating evidence can be fabricated.