Photographers Push Back on Facebook's 'Made with AI' Labels Triggered by Adobe Metadata. Do you agree “‘AI was used in this image’ is completely different than ‘Made with AI’”?

parody@lemmings.world to No Stupid Questions@lemmy.world – 203 points –

I think every touch up besides color correction and cropping should be labeled as "photoshopped". And any usage of AI should be labeled as "Made with AI" because it cannot show which parts are real and which are not.

Besides, this is totally a skill issue. Removing this metadata is trivial.

Some of the more advanced color correction tools can drastically change an image. There’s a lot of gray in that line as well.

DOD Imagery guidelines state that only color correction can be applied to "make the image appear the same as it was when it was captured"; otherwise it must be labeled "DOD illustration" instead of "DOD Imagery".

Cropping can completely change the context of a photo.

Sure, but you could also achieve a similar effect in-camera by zooming in or moving closer to the subject.

A lot of photographers will take a photo with the intention of cropping it. Cropping isn’t photoshopping.

If I open an image in Photoshop and crop it, it's photoshopping.

You don’t have to open photoshop to do it. Any basic editing software will include a cropping tool.

So we agree cropping is plain and simple image editing, yes?

Yes. I think the question was whether it should be labeled as "photoshopped" (or probably "manipulated"). I don't think it should. Those labels would be meaningless if you can't even change the aspect ratio of a photo without it being called "photoshopped".

🤦‍♂️

There are absolutely different levels of image editing. Color correction, cropping, scaling, and rotation are basic enough that I would say they don't even count as alterations. They're just correcting what the camera didn't, and they're often available in the camera's built-in software. (Fun fact: what the sensor sees is not what it presents to you in a JPEG.) Then there are more deceptive levels of editing, like removing or adding objects, altering someone's appearance, or swapping faces from different shots. Those are definitely image alterations, and what most people mean when they say an image is "photoshopped" (and you know that, don't lie). Then there's AI, where you're just generating new information to put into the image. That's extreme image alteration.

These all can be done with or without any sort of nefarious intent.

Agreed. Photo editing has great applications but we can't pretend it's never used maliciously.

Film too, any trickery in the darkroom should be labeled because it cannot show which parts are real and which are not.

Better title: “Photographers complain when their use of AI is identified as such”

"It was just a so little itsy bitsy teeny weeny AI edit!!"

Please don't flag AI please!

People are complaining that an advanced fill tool that's mostly used to remove a smudge or something is automatically marking a full image as an AI creation. As-is, if someone actually wants to bypass this "check", all they have to do is strip the image's metadata before uploading it.

But they did use AI..

Right? I thought I went crazy when I got to "I just used Generative Fill!" Like, he didn't just auto adjust the exposure and black levels! C'mon!

Looks like people are finally finding out they've been using AI all along.

Seems to me that employing the use of AI to alter an image should be labeled as "made with AI". It's not made by AI, AI was merely one of the tools used.

If you don't like admitting you used AI, just strip the metadata, I guess. This feels like something you should be able to turn off in your editor's settings, but I guess Adobe hasn't implemented that.

This comment was made with AI, as my phone's keyboard uses AI to automatically complete words, in a process strikingly similar to how ChatGPT works.

I totally agree with a streamlined identification of images generated by an AI prompt. But labeling an image with "made with AI" metadata when the image is original, taken by a human, and AI tools were only used to edit it is absolutely misleading, and the language can create confusion. It is not fair to the individual who has created the original work without the use of generative AI. I simply propose revising the language to create a distinction.

The edits are what makes it made with AI. The original work obviously isn't.

If you're in-painting areas of an image with generative AI ("context aware" fill), you've used AI to create an image.

People are coming up with rather arbitrary distinctions between what is and isn't AI. Midjourney's output is clearly AI, and a drawing obviously isn't, but neither is very post-worthy. Things quickly get muddy when you start editing.

The people upset over this have been using AI for years and nobody cared. Now photographers are at risk of being replaced by an advanced version of the context aware fill they've been using themselves. This puts them in the difficult spot of wanting not to be replaced by AI (obviously) but also not wanting to have their AI use be detectable.

The debate isn't new; photo editors had this problem years ago when computers started replacing manual editing, artists had this problem when computer aided drawing (drawing tablets and such) started becoming affordable, and this is just the next step of the process.

Personally, I would love it if this feature would also be extended to "manual" editing. Add a nice little "this image has been altered" marker on any edited photographs, and call out any filters used to beautify selfies while we're at it.

I don't think the problem is that AI-edited images are being marked; the problem is that AI-generated pictures and manually edited pictures aren't.

Where I live, it's very difficult to get permits to knock down an old building and build a new one. So, builders will "renovate" by knocking down everything but a single wall and then building a new structure around it.

I can imagine people using that to get around the "made with ai" label. I just touched it up!

It’s like they’re ignoring the pixel I captured in the bottom left!

Really interesting analogy.

Also I imagine most anybody who gets a photo labeled will find a trick before making their next post. Copy the final image to a new PSD… print and scan for the less technically inclined… heh

I mean you can just remove the metadata of any image, so that doesn't really matter.
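That "trivial" claim checks out: EXIF and XMP metadata (where edit-history flags are typically embedded) live in JPEG APP1 marker segments that a few lines of code can drop. A minimal sketch in Python - `strip_app1` is a hypothetical helper, and it assumes a well-formed baseline JPEG while ignoring edge cases like metadata carried in other APP segments:

```python
def strip_app1(jpeg_bytes: bytes) -> bytes:
    """Remove APP1 (EXIF/XMP) segments from a JPEG byte stream."""
    out = bytearray(jpeg_bytes[:2])          # keep SOI marker (FF D8)
    i = 2
    while i + 4 <= len(jpeg_bytes) and jpeg_bytes[i] == 0xFF:
        marker = jpeg_bytes[i + 1]
        if marker == 0xDA:                   # SOS: entropy-coded image data follows
            out += jpeg_bytes[i:]
            return bytes(out)
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        segment = jpeg_bytes[i:i + 2 + length]
        if marker != 0xE1:                   # drop APP1, keep all other segments
            out += segment
        i += 2 + length
    out += jpeg_bytes[i:]
    return bytes(out)
```

Real-world tools like `exiftool` do the same job far more robustly; the point is just how little stands between a "Made with AI" flag and its removal.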

simply used AI tools

Therefore, made with AI.

Or generated with AI like midjourney, therefore, made with AI.

There's a huge difference between the two, yet no clear distinction when both are lumped under the label of "made with AI".

yeah, i use Lightroom AI de-noise all the time now. it's just a better version of a tool that already existed, and one that every phone does by default anyway.

And I use AI to determine the right brightness level for my phone screen (that was a feature added several android versions ago)

Artists in 2023: "There should be labels on AI modified art!!"

Artists in 2024: "Wait, not like that..."

I feel like these are two completely different sets of artists.

no, they just replaced the normal tools with ai-enhanced versions and are labeling everything like that now.

ai noise reduction should not get this tag.

I don't know where you got that from, but this post literally talks about tools such as gen fill (select a region, type what you want in it, AI image generation makes it and places it in).

No - I don't agree that they're completely different.

"Made by AI" would be completely different.

"Made with AI" actually means pretty much the exact same thing as "AI was used in this image" - it's just that the former lays it out baldly and the latter softens the impact by using indirect language.

I can certainly see how "photographers" who use AI in their images would tend to prefer the latter, but bluntly, fuck 'em. If they can't handle the shame of the fact that they did so they should stop doing it - get up off their asses and invest some time and effort into doing it all themselves. And if they can't manage that, they should stop pretending to be artists.

I think it is a bit of an unclear wording personally. "Made with", despite technically meaning what you're saying, is often colloquially used to mean "fully created by". I don't mind the AI tag, but I do see the photographers point about it implying wholesale generation instead of touchups.

The label is accurate. Quit using AI if you don’t want your images labeled as such.

or... don't use generative fill. if all you did was remove something, regular methods do more than enough. with generative fill you can just select a part and say now add a polar bear. there's no way of knowing how much has changed.

there's a lot more than generative fill.

ai denoise, ai masking, ai image recognition and sorting.

hell, every phone is using some kind of "ai enhanced" noise reduction by default these days. these are just better versions of existing tools that have been used for decades.

This would be more suited for asklemmy, this community isn't for opinion discussions

Can't wait for people to deliberately add the metadata to their image as a meme, such that a legit photograph without any AI used gets the unremovable made with ai tag

Generative fill on a dummy layer, then apply 0% opacity

Why many word when few good?

Seriously though, "AI" itself is misleading but if they want to be ignorant and whiny about it, then they should be labeled just as they are.

What they really seem to want is an automatic metadata tag that is more along the lines of "a human took this picture and then used 'AI' tools to modify it."

That may not work because by using Adobe products, the original metadata is being overwritten so Thotagram doesn't know that a photographer took the original.

A photographer could actually just type a little explanation ("I took this picture and then used Gen Fill only") in a plain text document, save it to their desktop, and copy & paste it in.

But then everyone would know that the image had been modified - which is what they're trying to avoid. They want everyone to believe that the picture they're posting is 100% their work.

We've been able to do this for years, way before the fill tool utilized AI. I don't see why it should be slapped with a label that makes it sound like the whole image was generated by AI.

This isn't really Facebook. This is Adobe not drawing a distinction between smart pattern recognition for backgrounds/textures and real image generation of primary content.

Bad photographers complaining about being called out as bad photographers.

I don't think that's fair. AI won't turn a bad photograph into a good one. It's a tool that quickly and automatically does something we've been doing by hand until now. That's kind of like saying a photoshopped picture isn't "good" or "real". They're all photoshopped. Not a single serious photographer releases unedited photos, except perhaps the ones shooting on film.

Even film photographers touch up their photos, either during development by adjusting how long they sit in one of the chemical processes, or by using different shaking/mixing methods and techniques.

If they enlarge their negatives on photo paper, they often have tools to add lightness and darkness to different areas of the paper to help with exposure, contrast and subject highlighting - a.k.a. dodging and burning, which is also available in most photo editing software today.

There are loads of things you can do to improve developed photos, and they have always been part of what photographers/developers do. People who still go with the "Don't edit photos" BS are usually not very well informed about photo history or the techniques of their photography inspirations.

The image looks like OP cherry picked some replies in the original thread. I wonder how many artists still want AI assisted art to be flagged as such.

EDIT: The source is also linked under the images. They did leave out all the comments in favour of including AI metadata, but naturally those are there in the source.

💯

Absolutely cherry picked. Let us know if you peruse the source:

Without cherry picking… imagine these will be resized to the point of illegibility:

It's unreasonable to make them illegible for no good reason; you could've included them as-is, possibly in multiple, smaller images. It's also far more common to just share a link rather than an image post, as we'll have to see the link anyway.
I didn't see the source, though, I've updated my comment for that.

Thanks for the edit. We all love that intellectual honesty!

Don’t miss this absolute roast though:

Roasted and salted 🥜


Now -

1: I should’ve been more clear… those full screen screenshots are so enormous, Lemmy has to compress them for cost and UX reasons.

2: Screenshot over link is a very intentional choice. Even if you’re positive you would’ve clicked based on the title, there are some great responses in this thread that I guarantee you we would not have been blessed with if this post had been a link instead of an image.

Everyone is busy. Lots of us work away on keyboards all day, and we hop on here just to scroll casually. Some huge forum thread? Forget it! A little screenshot that has teasers and can be digested bit by bit, with the leading post in the image helping folks decide whether they care enough to read the rest of the image and furthermore to find a source? (either by an OP or commenter’s source link, or exact match web search of an OCR’d phrase from the image) That’s the best shot we have at easing in as many people as possible into a topic. (Do feel bad for the vision impaired, hopefully the source link is a decent standin.) But for 98% of us this is prob the way. Aight maybe 95%, you got a good community response to your comment :)

Thanks for chiming in m’lord

He just won’t stop!

Aight I repeated “cherry picked” earlier… no:

“Curated.” Was happy to curate a few of the more interesting comments for our community.

If I weren’t so lazy I might’ve found another comment in favor of the labeling to bump up the screenshotted proportion of replies in support from the 25% seen in my OP. Still, think I did an aight job.

Okayyy night now haha

As a photographer I'm a bit torn on this one.

I believe AI art should definitely be labeled to minimize people being mislead about the source of the art. But at the same time the OP on the Adobe forums post did say they used it as any other tool for touching up and fixing inconsistencies.

If, for example, I were to arrange a photoshoot with a model and they happened to have a zit on their forehead that day, of course I'm gonna edit that out. Or if an assistant got into the shot and I don't want to crop in, tightening the background and feel of the photo, I would gladly remove them too. Sure, Adobe already has the patch, clone and even magic eraser tools (which also use AI, and might or might not mark photos) to do these fix-ups, but if I can use AI - that I hope is trained on data they're actually allowed to train on - I think I would prefer that, because if I'm gonna spend 10 to 30 minutes fixing blemishes, zits and whatnot, I'd much prefer to use the AI tools to get my job done quicker.

If the tools were however used to rigorously change, modify and edit the scene and subject then for sure, it might be best to add that.

Wouldn't it be better not to discourage the use of editing tools when those tools are used in a way that just makes one's job quicker? If I were to use Lightroom's quick subject selection, should the label be slapped on then? Or if I were to use an editing preset created with AI that automatically adjusts the basic settings of an image, and I further my editing from there, should the label be applied then? Or if I have a flat white background with some tapestry pattern and don't want to spend hours getting the alignment of the pattern just right as I fix a minor aspect ratio issue, or want a bit more breathing room around the subject, and I use the AI tool mentioned in the OP?

The things OP mentioned in their post and the scenarios I mentioned are all things you can do without AI anyway; it just takes a lot longer sometimes. There's no cheating in using the right tool for the right job, IMO. I don't think it's too far off from a clay sculptor using an ice cream scoop with ridges to create texture, or a Dremel to touch up and fix corners. Or a painter using different tools, brushes and scrapers to finish their painting.

Perhaps a better idea, if we want to make the labels "fair", would be a general label that the photo has been manipulated by a program, or maybe a percentage indicator showing how much of it was edited with AI specifically. Slapping an "AI" label on someone because they decided to get equal results by using another tool for normal touch-ups could be damaging to one's career and credibility when it doesn't say how much was AI or to what extent - now there's the chance someone looking for their next wedding photographer will be discouraged because of the bad rep around AI.

trained on data they're actually allowed to train on

That’s the ticket. For touchups, certainly, that’s the key: did theft help, or not?

Indeed, if the AI was trained based on theft, it's neither right on their part nor ethical on mine.

I did some searching but sadly don't have time to look into it more but there were some concerning articles that would suggest they have either used shady practices to get their training data or users having to manually check an opt out box in the app settings.

I can't form an opinion on it before looking into it more, but I believe my core argument about using AI in this manner - even if the AI were your own, trained on data you're allowed to use - still somewhat holds.

I'm not sure of the complaint - is the tag not accurate? If you use AI to make something, are you not making it with AI? Like, if I use strawberries to make a cake, would the tag "made with strawberries" be inaccurate?

I fail to see the argument. If you don't want to be labeled with something accurate, don't use it; otherwise, deal with it.

I do think it's a problem when 100% of people seeing "made with AI" will assume the entire thing is AI-generated, even if all you did was use AI for a minor touch-up. If it's really that trigger happy right now, I think it'd make sense for it to be dialled down a bit.

Would all my photos taken on a Pixel or iPhone have this label, then?

The complaint the photographer is making is that it's an actual photograph where a small portion is made or changed with AI.

They list expanding the edges of the image to change the aspect ratio, and removing flaws or unwanted objects etc.

Removing flaws and objects at least is a task that predates modern computers - people changed the actual negatives - and tools to do it have improved so much a computer can basically do it all for you.

I think people should just say how they modified the image - AI or not - since airbrushed skin, artificial slimming, and such have been common complaints before AI manipulation, and AI just makes those same problematic things easier.

The biggest use of AI in my editing flow is masking. I can spend half an hour selecting all the edges of a person as well as I can, or I can click the button to select people. Either way I do the rest of my edits as normal.

I saw this coming from a mile away. We will now have to set standards for what's considered "made by AI" and "Made with AI"

I agree pretty heartily with this metadata signing approach to sussing out AI content.

Create a cert org that verifies that a given piece of creative software properly signs work made with its tools, get eyeballs on the cert so consumers know to look for it, then watch and laugh while everyone who can't get the cert starts claiming they're being censored because nobody trusts any of their shit anymore.

Bonus points if you can get the largest social media companies to only accept content that has the signing and have it flag when signs indicate photoshopping or AI work, or removal of another artist's watermark.

That simply won't work, since you could just use a tool to recreate an AI image 1:1, or extract the signing code and sign whatever you want.

There are ways to make signatures hard to recreate, not to mention that the signature can be unique to every piece of media, meaning a fake can't be created reliably.

How are you gonna prevent recreating an AI image pixel by pixel, or just importing an AI image / taking a photo of one?

Importing and screen-capping software can also carry the certificate software and sign output with the metadata of the original file being copied. Taking a picture of the screen with a separate device, or a pixel-by-pixel recreation, could in theory get around it - but in practice, people would see at best a camera image being presented as a photoshopped or paint-made image, and at worst some loser pointing their phone at their laptop to try to pass something off dishonestly. As for pixel-by-pixel recreations: again, software can be given the metadata stamp, and if sites refuse to accept non-stamped content, going pixel by pixel in unvetted software will just leave you with a neat PNG file for your trouble. Doing it manually? If someone's hand-placing squares just to slip a single deepfake through, that person's a state actor, and that's a whole other can of worms.

ETA: you can also sign the pixel art creation as pixel art based on it being a creation of squares, so that would tip people off in the signature notes of a post.

The opposite way could work, though. A label that guarantees the image isn't [created with AI / digitally edited in specific areas / overall digitally adjusted / edited at all]. I wonder if that's cryptographically viable? Of course it would have to start at the camera itself to work properly.

Signing the photo on the camera would achieve this, but ultimately that's just rehashing the debate from back when Photoshop was new. History shows us that some will fight it, but ultimately new artistic tools create new artistic styles and niches.
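The camera-signing idea can be sketched in a few lines. This is a toy, not how real provenance systems work - C2PA-style schemes use per-device certificates and public-key signatures rather than the shared secret assumed here - but it shows why signing a digest of the raw pixel data (rather than the file container) means stripping metadata can't quietly detach the provenance claim:

```python
import hashlib
import hmac

# Hypothetical per-device secret; a real camera would hold a private key
# in secure hardware and use public-key crypto instead.
DEVICE_KEY = b"example-camera-secret"

def sign_pixels(pixel_bytes: bytes) -> str:
    """Sign a digest of the raw pixel data captured by the sensor."""
    digest = hashlib.sha256(pixel_bytes).hexdigest()
    return hmac.new(DEVICE_KEY, digest.encode(), hashlib.sha256).hexdigest()

def verify_pixels(pixel_bytes: bytes, signature: str) -> bool:
    """Any change to the pixels, AI-driven or manual, invalidates the signature."""
    return hmac.compare_digest(sign_pixels(pixel_bytes), signature)
```

Verification then only vouches for "these exact pixels left this device": any subsequent edit breaks it, which is the guarantee being asked for here.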

I imagine we’ll need specialized hardware in the future.

The president allegedly made a gaffe on film? Let’s see that chain of custody, that raw file hash on the Canon/RED/Sony servers…

Ooh, something to this end was released in 2022!

I saw a video posted by someone who claimed to have taught their cat how to skateboard, and at the bottom it was tagged "made with AI".

meta w

Did they just e.g. remove a passing car from the background*, and will tags on some images lead to untagged fake images being trusted more? Oh this fun new world we’re in.

*as someone else pointed out, if it was a minor edit, was the underlying technology using legit training data or unlicensed stuff

no it was an AI generated video of a cat using a skateboard

I disagree with their complaints. If AI was used in any way, it should be labelled as such, no matter how small the adjustments were.

What is the point of the label at all?

To appease the artists worried about "fake" art somehow replacing the "real" art, while big social somehow profits. They just didn't think leopards would eat THEIR faces...

You aren't wrong. It's entirely about status and needing to stigmatize, penalize and limit "fake" art because the artists in question are worried it will cut into the work available to them in the form of things like commissions.

Hey guys, I cheated in my exam using AI but I was the one who actually wrote down the answer. Why did I fail?

That person who makes the peanut analogy needs a slap in the head.

It’s exaggerated but it gets the point across: I too would like to know if AI tools were used to make even part of the image.

There’s a reason any editing is banned from many photography contests.

If they want to make a distinction between “made using AI” and “entirely AI generated”, sure. But “made using AI” completely accurately describes an image that used AI to generate parts of the image that were inconvenient in the original photo.
