• Skull giver@popplesburger.hilciferous.nl

    Looks like people are finally finding out they’ve been using AI all along.

    Seems to me that using AI to alter an image should be labeled as “made with AI”. It’s not made by AI; AI was merely one of the tools used.

    If you don’t like admitting you used AI, just strip the metadata, I guess. This feels like something you should be able to turn off in your editor’s settings, but I guess Adobe hasn’t implemented that.

    This comment was made with AI, as my phone’s keyboard uses AI to automatically complete words, in a process strikingly similar to how ChatGPT works.

    • Sensitivezombie@lemmy.zip

      I totally agree with a streamlined identification of images generated from an AI prompt. But labeling an image with “made with AI” metadata when the image is original, taken by a human, and AI tools were only used to edit it is misleading, and the language can create confusion. It is not fair to the individual who created the original work without the use of generative AI. I simply propose revising the language to make that distinction.

      • Cryophilia@lemmy.world

        Where I live, it is very difficult to get permits to knock down an old building and build a new one. So builders will “renovate” by knocking down everything but a single wall and then building a new structure around it.

        I can imagine people using that to get around the “made with AI” label. “I just touched it up!”

        • parody@lemmings.worldOP

          It’s like they’re ignoring the pixel I captured in the bottom left!

          Really interesting analogy.

          Also I imagine most anybody who gets a photo labeled will find a trick before making their next post. Copy the final image to a new PSD… print and scan for the less technically inclined… heh

      • Skull giver@popplesburger.hilciferous.nl

        The edits are what make it “made with AI”. The original work obviously isn’t.

        If you’re in-painting areas of an image with generative AI (“content-aware” fill), you’ve used AI to create an image.

        People are coming up with rather arbitrary distinctions between what is and isn’t AI. Midjourney’s output is clearly AI, and a drawing obviously isn’t, but neither is very post-worthy. Things quickly get muddy when you start editing.

        The people upset over this have been using AI for years and nobody cared. Now photographers are at risk of being replaced by an advanced version of the content-aware fill they’ve been using themselves. This puts them in the difficult spot of not wanting to be replaced by AI (obviously) but also not wanting their AI use to be detectable.

        The debate isn’t new; photo editors had this problem years ago when computers started replacing manual editing, artists had this problem when computer aided drawing (drawing tablets and such) started becoming affordable, and this is just the next step of the process.

        Personally, I would love it if this feature would also be extended to “manual” editing. Add a nice little “this image has been altered” marker on any edited photographs, and call out any filters used to beautify selfies while we’re at it.

        I don’t think the problem is that AI-edited images are being marked; the problem is that AI-generated pictures and manually edited pictures aren’t.

  • Björn Tantau@swg-empire.de

    I think every touch-up besides color correction and cropping should be labeled as “photoshopped”. And any usage of AI should be labeled as “Made with AI”, because it cannot show which parts are real and which are not.

    Besides, this is totally a skill issue. Removing this metadata is trivial.
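
    To illustrate how trivial: here is a minimal sketch, assuming Python with Pillow installed and a JPEG input (the filenames are made up), that drops every metadata block simply by re-saving the pixels:

    ```python
    # Minimal sketch: strip all metadata from a JPEG by re-saving only the pixels.
    # Assumes Pillow is installed; the filenames are hypothetical.
    from PIL import Image

    def strip_metadata(src: str, dst: str) -> None:
        with Image.open(src) as im:
            pixels = list(im.getdata())          # copy raw pixel values only
            clean = Image.new(im.mode, im.size)  # fresh image carries no EXIF/XMP blocks
            clean.putdata(pixels)
            clean.save(dst)                      # saved without the original metadata

    strip_metadata("edited_photo.jpg", "edited_photo_clean.jpg")
    ```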

      • hperrin@lemmy.world

        A lot of photographers will take a photo with the intention of cropping it. Cropping isn’t photoshopping.

          • hperrin@lemmy.world

            You don’t have to open photoshop to do it. Any basic editing software will include a cropping tool.

              • hperrin@lemmy.world

                There are absolutely different levels of image editing. Color correction, cropping, scaling, and rotation are basic enough that I would say they don’t even count as alterations. They’re just correcting what the camera didn’t, and they’re often available in the camera’s built-in software. (Fun fact: what the sensor sees is not what it presents to you in a JPEG.) Then there are more deceptive levels of editing, like removing or adding objects, altering someone’s appearance, or swapping in faces from different shots. Those are definitely image alterations, and they’re what most people mean when they say an image is “photoshopped” (and you know that, don’t lie). Then there’s AI, where you’re just generating new information to put into the image. That’s extreme image alteration.

                These all can be done with or without any sort of nefarious intent.

      • IIII@lemmy.world

        Sure, but you could also achieve a similar effect in-camera by zooming in or moving closer to the subject.

    • disguy_ovahea@lemmy.world

      Some of the more advanced color correction tools can drastically change an image. That line has a lot of gray area as well.

      • BigPotato@lemmy.world

        DOD imagery guidelines state that only color correction may be applied, to “make the image appear the same as it was when it was captured”; otherwise it must be labeled “DOD illustration” instead of “DOD imagery”.

    • gedaliyah@lemmy.world

      Why label it if it is trivial to avoid the label?

      Doesn’t that mean that bad actors will have additional cover for misuse of AI?

  • Hawke@lemmy.world

    Better title: “Photographers complain when their use of AI is identified as such”

    • CabbageRelish@midwest.social

      People are complaining that an advanced fill tool that’s mostly used to remove a smudge or something is automatically marking a full image as an AI creation. As-is if someone actually wants to bypass this “check” all they have to do is strip the image’s metadata before uploading it.

  • kromem@lemmy.world

    Artists in 2023: “There should be labels on AI modified art!!”

    Artists in 2024: “Wait, not like that…”

      • thedirtyknapkin@lemmy.world

        no, they just replaced the normal tools with ai-enhanced versions and are labeling everything like that now.

        ai noise reduction should not get this tag.

  • IIII@lemmy.world

    Can’t wait for people to deliberately add the metadata to their images as a meme, so that a legit photograph without any AI used gets the unremovable “made with AI” tag.
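
    If anyone actually tried that, it might look something like the sketch below. It assumes ExifTool is installed and that platforms key off the IPTC DigitalSourceType field that generative editors reportedly write; both the tag name and the detection behaviour are assumptions on my part, since the exact rules aren’t public.

    ```python
    # Hypothetical sketch: stamp an untouched photo with the "composite with trained
    # algorithmic media" source-type marker. Assumes ExifTool is on the PATH and that
    # this IPTC field is what gets detected (an assumption; the real rules aren't public).
    import subprocess

    AI_SOURCE_TYPE = (
        "http://cv.iptc.org/newscodes/digitalsourcetype/"
        "compositeWithTrainedAlgorithmicMedia"
    )

    def add_ai_marker(path: str) -> None:
        # Writes the XMP-iptcExt DigitalSourceType tag in place.
        subprocess.run(
            ["exiftool", f"-XMP-iptcExt:DigitalSourceType={AI_SOURCE_TYPE}",
             "-overwrite_original", path],
            check=True,
        )

    add_ai_marker("completely_normal_photo.jpg")
    ```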

  • TastyWheat@lemmy.world

    Hey guys, I cheated in my exam using AI but I was the one who actually wrote down the answer. Why did I fail?

  • pyre@lemmy.world

    or… don’t use generative fill. if all you did was remove something, regular methods do more than enough. with generative fill you can just select a part and say now add a polar bear. there’s no way of knowing how much has changed.

    • thedirtyknapkin@lemmy.world

      there’s a lot more than generative fill.

      ai denoise, ai masking, ai image recognition and sorting.

      hell, every phone is using some kind of “ai enhanced” noise reduction by default these days. these are just better versions of existing tools that have been used for decades.

  • WatDabney@sopuli.xyz

    No - I don’t agree that they’re completely different.

    “Made by AI” would be completely different.

    “Made with AI” actually means pretty much the exact same thing as “AI was used in this image” - it’s just that the former lays it out baldly and the latter softens the impact by using indirect language.

    I can certainly see how “photographers” who use AI in their images would tend to prefer the latter, but bluntly, fuck 'em. If they can’t handle the shame of the fact that they did so, they should stop doing it: get up off their asses and invest some time and effort into doing it all themselves. And if they can’t manage that, they should stop pretending to be artists.

    • Paradachshund@lemmy.today

      I think the wording is a bit unclear, personally. “Made with”, despite technically meaning what you’re saying, is often colloquially used to mean “fully created by”. I don’t mind the AI tag, but I do see the photographers’ point about it implying wholesale generation instead of touch-ups.

  • harrys_balzac@lemmy.dbzer0.com

    Why many word when few good?

    Seriously though, “AI” itself is misleading but if they want to be ignorant and whiny about it, then they should be labeled just as they are.

    What they really seem to want is an automatic metadata tag that is more along the lines of “a human took this picture and then used ‘AI’ tools to modify it.”

    That may not work, because Adobe products overwrite the original metadata, so Thotagram doesn’t know that a photographer took the original.

    A photographer could actually just type a little explanation (“I took this picture and then used Gen Fill only”) in a plain text document, save it to their desktop, and copy & paste it in.

    But then everyone would know that the image had been modified - which is what they’re trying to avoid. They want everyone to believe that the picture they’re posting is 100% their work.

    • BigPotato@lemmy.world

      Right? I thought I went crazy when I got to “I just used Generative Fill!” Like, he didn’t just auto adjust the exposure and black levels! C’mon!

  • Ð Greıt Þu̇mpkin@lemm.ee

    I agree pretty heartily with this metadata-signing approach to sussing out AI content:

    Create a cert org that verifies that a given piece of creative software properly signs work made with its tools, get eyeballs on the cert so consumers know to look for it, then watch and laugh while everyone who can’t get the cert starts claiming they’re being censored because nobody trusts any of their shit anymore.

    Bonus points if you can get the largest social media companies to only accept content that has the signing, and have it flag when the signature indicates photoshopping or AI work, or removal of another artist’s watermark.
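
    The sign/verify core of that scheme might look roughly like the sketch below, using Ed25519 keys from Python’s cryptography package. The key names, the single shared key pair, and the idea of shipping the signature alongside the file are illustrative assumptions, not any real cert org’s spec.

    ```python
    # Rough sketch of the sign/verify step such a cert scheme might rely on.
    # Uses the 'cryptography' package; key distribution and where the signature
    # is stored are illustrative assumptions, not a real specification.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # The certified editing tool holds the private key; platforms get the public key.
    tool_key = Ed25519PrivateKey.generate()
    platform_key = tool_key.public_key()

    def sign_image(image_bytes: bytes) -> bytes:
        """Editor side: sign the exported image bytes."""
        return tool_key.sign(image_bytes)

    def verify_image(image_bytes: bytes, signature: bytes) -> bool:
        """Platform side: accept the upload only if the signature checks out."""
        try:
            platform_key.verify(signature, image_bytes)
            return True
        except InvalidSignature:
            return False

    export = b"...bytes of the image exported by the certified editor..."
    sig = sign_image(export)
    print(verify_image(export, sig))              # True: untouched export
    print(verify_image(export + b"tamper", sig))  # False: any change breaks the signature
    ```

    Note that stripping the signature just leaves an unsigned file, which is exactly what the platform-side check would refuse.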

    • Schmeckinger@lemmy.world

      That simply won’t work, since you could just use a tool to recreate an AI image 1:1, or extract the signing code and sign whatever you want.

      • Ð Greıt Þu̇mpkin@lemm.ee

        There are ways to make signatures hard to forge, not to mention that the signature can be unique to every piece of media, meaning a fake can’t be created reliably.

        • Schmeckinger@lemmy.world

          How are you going to prevent recreating an AI image pixel by pixel, or just importing an AI image or taking a photo of one?

          • Ð Greıt Þu̇mpkin@lemm.ee

            Importing and screen-capture software can also carry the certificate software and sign the copy with the metadata of the original file. Taking a picture of the screen with a separate device, or recreating the image pixel by pixel, could in theory get around it, but in practice people will see, at best, a camera image being presented as a photoshopped or paint-made image, and at worst, some loser pointing their phone at their laptop to try to pass something off dishonestly.

            As for pixel-by-pixel recreations: again, software can be given the metadata stamp, and if sites refuse to accept non-stamped content, going pixel by pixel in unvetted software will just leave you with a neat PNG file for your trouble. And if someone is hand-placing squares just to slip a single deepfake through, that person is a state actor, and that’s a whole other can of worms.

            ETA: you could also sign a pixel-art recreation as pixel art, based on it being a creation of squares, so the signature notes on a post would tip people off.

      • Feathercrown@lemmy.world

        The opposite way could work, though. A label that guarantees the image isn’t [created with AI / digitally edited in specific areas / overall digitally adjusted / edited at all]. I wonder if that’s cryptographically viable? Of course it would have to start at the camera itself to work properly.

  • Fonzie!@ttrpg.network

    The image looks like OP cherry-picked some replies from the original thread. I wonder how many artists still want AI-assisted art to be flagged as such.

    EDIT: The source is also linked under the images. They did leave out all the comments in favour of including AI metadata, but naturally those are there in the linked source.

    • parody@lemmings.worldOP

      💯

      Absolutely cherry picked. Let us know if you peruse the source:

      Without cherry picking… imagine these will be resized to the point of illegibility: