I heard a bunch of explanations but most of them seem emotional and aggressive, and while I respect that this is an emotional subject, I can’t really understand opinions that boil down to “theft” and are aggressive about it.

While there are plenty of models that were trained on copyrighted material without consent (which is piracy, not theft, but close enough when talking about small businesses or individuals), is there an argument against models that were legally trained? And if so, is it anything beyond the claim that AI art is lifeless?

  • 🐋 Color 🍁 ♀@lemm.ee · 1 day ago

    As an artist who has had her art stolen before for use in AI output, being against any and all art theft is the default and perfectly reasonable standpoint for an artist. On some art websites, AI-generated images fall under the rule against art theft. This is because AI models scrape artists’ work without their consent, and the output of a prompt relies on an amalgamation of the aforementioned scraped artworks. I’ve personally seen some AI images in which the mangled remains of artists’ signatures are still visible.

    The best analogy I can offer to explain why this is theft: typing a prompt into an AI image generator is like commissioning an artist to draw something for you, except the artist turns out to be someone who traces other people’s art and picks stolen artwork to trace from to match the prompt, and then you go on to claim that it was you who created the image.

  • Susaga@sh.itjust.works · 1 day ago

    AI feels like a Lovecraftian horror to me. It’s trying to look authentic, but it’s wrong on a fundamental level. Nothing’s the right shape, nothing’s the right texture, nothing’s consistent, nothing belongs together… But somehow, nobody else has noticed what should be blatantly obvious! And when you try to point it out, you get a hivemind responding that it’s good actually, and you’re just a luddite.

    But let’s assume AI stops being awful in a technical sense. It’s still awful in a moral sense.

    Artists are poor. That’s a well known sentiment you see a lot and, given how many times I see commission postings, it’s pretty accurate. That artist needs to work to live, and that work is creating art.

    AI is deliberately depriving these artists of work in order to give the AI’s owner a quick, low-quality substitute. In some cases, it will copy an artist’s style, so you’re deliberately targeting a specific artist because they’re good at their job. And it’s using the artist’s work in order to replace them.

    • oce 🐆 · 1 day ago

      Isn’t this point also valid for any kind of automation? Machines removed work from manual workers, and software engineers have been removing work from manual and office workers since the profession began, way before LLMs. The point that artists actually love their work could also be made for other people whose work has been automated before.
      I think the real issue is that automation should benefit everyone equally, and not only its owners.

      • WoodScientist@lemmy.world · 22 hours ago

        The key in my mind is that this technology cannot work independently. A bucket excavator can replace the work of many people digging by hand. But the operation of the machine truly replaces the laborers. Some hand labor is still required in any excavation, but the machine itself is capable of operating just fine without the workers it is replacing.

        But LLM image generators? They are only possible from the work of artists. They are directly trained off of artists’ work. Even worse, the continued existence of LLMs requires the never-ending contribution of humans. When AI image generators are trained off the results from AI image generators, things rapidly degenerate into literal static. It’s making a copy of a copy. If all art becomes made by LLMs, then the only recent data to train future models will be the output of other LLMs, and the whole thing collapses like a snake devouring its own tail.
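
        A rough way to see this copy-of-a-copy effect (a minimal toy sketch in Python, not any real model; every number here is made up for illustration):

        ```python
        # Toy sketch of "training on your own output": fit a Gaussian to the data,
        # sample a new dataset from that fit, refit on the samples, and repeat.
        # The fitted spread follows a multiplicative random walk and, run long
        # enough, collapses toward zero: a crude analogue of generational decay.
        import numpy as np

        rng = np.random.default_rng(0)
        data = rng.normal(loc=0.0, scale=1.0, size=100)  # generation 0: "human-made" data

        for generation in range(1, 31):
            mu, sigma = data.mean(), data.std()     # "train" this generation on the current data
            data = rng.normal(mu, sigma, size=100)  # its output becomes the next training set
            if generation % 5 == 0:
                print(f"generation {generation:2d}: mean={mu:+.3f}, std={sigma:.3f}")
        ```

        It’s a caricature, of course, but the feedback loop has the same shape: once the originals drop out of the pipeline, there is nothing left to anchor the next generation.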

        This is also the crucial difference between how image generators and actual artists work. Proponents of AI will say that LLMs simply learn the same way humans do: the image generator trains off pre-existing art, and so does a human artist.

        But we can see the flaw in this: real artists do not suffer generational decay. Human artists have trained off the work of other artists, in a chain unbroken since before the rise of civilization. Yes, artists can learn technique and gain inspiration from the work of other artists, but humans are capable of true independent creation. Image generators, OTOH, are just blindly copying and summarizing the work of others. They have no actual sense of what art is, what makes it good, or what gives it soul. They don’t even have a sense of what makes an image comprehensible. They’re just playing a big blind correlation game of inputs and outputs. And so, if you train one AI off another AI’s output, it decays like making a copy of a copy.

        This is a crucial difference between AI “art” and human art. Human art is an original creation. As such, new art can be endlessly created. AI “art” can only blindly copy. So unless the AI can get continual references from actual real human art, it quickly diverges into uselessness.

        The ditch digger replaced by an excavator has no real means to legally object. They were paid for their previous jobs, and are simply no longer needed. But real human artists and AI? This software is going to be a never-ending vampire on their creative output. It has only been created by stealing their past work, and it will only remain viable if it can continue to steal their work indefinitely into the future.

        • MTK@lemmy.worldOP · 2 hours ago

          Wow, thank you, I think this is the first argument that clicked for me.

          But it does raise two questions for me:

          • If the technology ever gets to a point where it does not degenerate into static by creating its own feedback loop, would it then be more like an excavator?
          • What if this is the start of a future (understandably a bad start) where you have artists who get paid to train AI models? Kind of like an engineer who designs a factory
          • WoodScientist@lemmy.world · 2 hours ago

            • Can it ever get to the point where it wouldn’t be vulnerable to this? Maybe. But it would require an entirely different AI architecture than anything that any contemporary AI company is working on. All of these transformer-based LLMs are vulnerable to this.

            • That would be fine. That’s what they should have done to train these models in the first place; instead, they were too cheap and chose to build companies based on IP theft. If they hired their own artists to create training data, I would certainly lament the commodification and corporatization of art. But that’s something that’s been happening since long before OpenAI.

            • MTK@lemmy.worldOP · 1 hour ago

              Thank you, out of all of these replies I feel like you really hit the nail on the head for me.

      • Susaga@sh.itjust.works · 1 day ago

        Technically, yes, but I would argue that this is worse.

        An excavator saves you days of digging a single hole. An assembly line saves you from having to precisely construct a toy. A printer saves you from having to precisely duplicate a sheet of paper. All of this is monotonous and soul-destroying work that people are happy they don’t need to do.

        But you still need to decide where to dig the hole. You still need to design the toy. You still need to fill in the first sheet of paper. All of the work left over is more creatively fulfilling.

        We are now attempting to automate creativity.

    • MTK@lemmy.worldOP · 2 hours ago

      Because it means nothing to me. Sorry to disappoint, but I don’t even understand that argument. I’ve seen plenty of AI images that looked full of life to me, so what does it even mean that it is lifeless? Maybe explain it instead of just being condescending about it.

    • alcoholicorn@lemmy.ml · 1 day ago

      AI art proved beyond a doubt that death of the author was always 99% bullshit justifying media illiteracy. Now we have art without an author, and it is totally devoid of expression.

      • Susaga@sh.itjust.works · 1 day ago

        Death of the author is the idea that reader interpretation matters more than author’s intent, and it’s absolutely fair for media analysis. Sadly, too many people bundle it together with the idea that the author didn’t mean anything at all.

        Heck, “the curtains were blue” implies the authorial intent that there was no meaning behind the curtains. The death-of-the-author reading shows that the curtains had a symbolic reason to be blue.

      • Lumidaub@feddit.org · 1 day ago

        Who uses the Death of the Author to justify media illiteracy? I think you may be misunderstanding what the term means?

        When people say “the author is dead”, what they mean is that, when interpreting a piece of art, it doesn’t matter what the original artist meant to say with it - for the purpose of the interpretation they are dead and you cannot ask them what they meant.

        It’s always a personal matter what you see in art, any interpretation that makes sense to you is valid, even if it may not be what the artist intended. (That does not mean you can bullshit your way through poem analysis in school, different situation)

        • alcoholicorn@lemmy.ml · 1 day ago

          It’s always a personal matter what you see in art, any interpretation that makes sense to you is valid

          No, the thing that the author was trying to express has far greater validity than whatever the reader makes up. If that wasn’t the case, AI art, where the author lacks any intent, wouldn’t seem so lifeless.

          • Lumidaub@feddit.org · 1 day ago

            That presumes you can read the author’s mind. It’s impossible to tell with 100% certainty what an author meant to say. You can make assumptions, some can be more plausible than others, and people can agree that one interpretation seems more valid than another, but that’s it. When a work of art is released into the world, the author has no authority over its meaning.

            A good artist of course can make certain intentions very obvious and control, to a certain degree, what the recipient feels. That’s what you’re perceiving as missing in AI generated pictures.

    • makingStuffForFun@lemmy.ml · 1 day ago

      I disagree strongly with that argument. I’ve seen many examples of AI-generated images that have genuinely made me stop and shake my head in amazement.

        • makingStuffForFun@lemmy.ml · 1 day ago

          No. I recently watched a video from one of the best figure drawing tutors around, who is upset with AI. As he critiqued images, he struggled multiple times to tell whether they were AI or not. Now, if one of the top YouTube figure drawing instructors struggled at times to identify the difference, even in his attack on the tech, I’m pretty comfortable saying that it can absolutely move you.

        • Lumidaub@feddit.org · 1 day ago

          The thing, even with human-made art, is that what’s “moving” is highly personal. Maybe accept that their experience is different from yours?

          • alcoholicorn@lemmy.ml · 1 day ago

            Art is a form of communication, to hear that someone can be moved by expressionless AI slop is kinda like hearing someone had an enlightening conversation with a dog.

            Like sure I can imagine someone can interpret a dog’s barks to mean something, but it’s still a bizarre scenario that says more about the person than it does the art.

            • WoodScientist@lemmy.world · 22 hours ago

              Some people find religious rapture from seeing the Virgin Mary’s image on a grilled cheese sandwich. The human brain is a strange and wonderful thing.

            • makingStuffForFun@lemmy.ml · 22 hours ago

              When you can’t tell if a machine made it, and it moves you personally, then what invisible metric are you defining, and judging it on?

              • alcoholicorn@lemmy.ml · 21 hours ago

                Same metrics anyone judges art by: what it says to them. This is incredibly context dependent.

                Show me the art and if just showing it to someone is insufficient, explain it to me.

  • HelixDab2@lemm.ee · 1 day ago

    Art is largely about feeling and emotion, but you insist on rejecting arguments that argue from emotion.

    Interesting.

  • Lucy :3@feddit.org · 1 day ago

    From an artist’s view, it basically makes them obsolete. Sucks. Also, legally trained AI has a lot less training data and therefore worse output, so illegally trained models will always be preferred.

    From a tech view, AI does not create anything new. It remixes. If we remove artists, which will happen as AIs are simply cheaper, we won’t have anything new. From there on, you can picture it like this: an artist creates images that are 99-100% of what the goal was, whether dictated by clients or identified by tags, through logic, reason, creativity and communication. And they only get better. AI, due to technical limitations, has maybe 90% accuracy. And once a generated image with only 90% accuracy is used as training data for new images, it only gets worse.

    For example, if there are enough AI-created images with six fingers in the training data, that will become the norm.
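
    A back-of-the-envelope sketch of that compounding (the 90% figure is just the illustrative number from above, not a measured value):

    ```python
    # Compounding fidelity loss across self-trained generations (illustrative only).
    fidelity = 0.9  # assumed per-generation fidelity, taken from the example above

    for n in range(1, 6):
        print(f"after {n} generation(s): {fidelity ** n:.0%} of the original")
    # -> 90%, 81%, 73%, 66%, 59%
    ```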

    Basically, authors, artists, etc. will be obsolete for a few years, until the AI bubble mostly collapses and quality gets so bad that companies and individuals hire professionals again. Then AIs will again be used only for low-requirement things, e.g. private memes or roleplay.

    So artists are probably angry because they are being replaced by much inferior things that leeched off their own work and will be gone in a few years anyway. AI just does not make sense in most cases.

    • Valmond@lemmy.world · 1 day ago

      Artists are not becoming obsolete, that is just wrong.

      I haven’t seen an AI make a convincing oil painting yet :-)

      I think what most people think of as “artists” is actually the job they sometimes do, like layout and graphic design etc. That isn’t going obsolete either; these are just new tools to help, and maybe the demand will be lower because of it.

      • Lucy :3@feddit.org · 1 day ago

        Physical artists won’t, especially those doing plastic art. Most modern art is now digital though, contracted for various things, professionally and privately.

        And for oil paintings, AI creators are going to find a way. This is capitalism after all.

        And with new tools for design, either you’ll be just replaced entirely or you’ll get paid a lot less because “you just ask ChatGPT” or “I could do that with tool X for free”.

        • Lumidaub@feddit.org · 1 day ago

          Physical artists won’t, especially those doing plastic art.

          Why would they be safe with 3D printers being a thing?

          • Lucy :3@feddit.org · 1 day ago

            That’s kind of its own category of art: designing 3D-Printed stuff.

            I mean stuff like cutting wood or doing something out of bricks etc.

            • Lumidaub@feddit.org · 1 day ago

              What difference does the medium make? The people who think AI pictures are good enough or even better than art made by humans will be perfectly fine with generating 3D models and printing them if they want any kind of sculpture.

          • Valmond@lemmy.world · 1 day ago

            I think he meant painting and the like when saying “plastic arts”, not doing art with plastic.

            Or so I guess.

            • Lumidaub@feddit.org · 1 day ago

              Plastic arts is sculptures, three dimensional things like statues. Nothing to do with plastic, the material. It just so happens that 3D printing is a type of plastic art that uses types of plastic as its medium.

                • Lumidaub@feddit.org · 1 day ago

                  If that’s the case, it’s a language barrier thing. The equivalent to “plastic art” in my native language excludes paintings.

      • PonyOfWar@pawb.social · 1 day ago

        I haven’t seen an AI make a convincing oil painting yet :-)

        Maybe not for you, but search for oil painting prints on Amazon and you’ll find tons of AI-generated stuff. The average Joe already can’t tell the difference.

    • makingStuffForFun@lemmy.ml · 1 day ago

      This response assumes an artist wants to be a professional artist who makes a living from it. There are MANY artists who have no interest in turning their source of joy into a source of income, with all that comes with it.

      • Lumidaub@feddit.org · 1 day ago

        Exactly. I have no intention of selling my art and I object strongly to it being used by some company for their own profit. That’s mine, wtf makes them think they can use it, regardless of its current monetisation status?

  • Libb · 1 day ago

    I heard a bunch of explanations but most of them seem emotional and aggressive, and while I respect that this is an emotional subject, I can’t really understand opinions that boil down to “theft” and are aggressive about it.

    Why the anger?

    How do you earn a living yourself? Or even better, what is your most precious hobby? Whatever it is that you love doing for the love of it (that’s the definition of a hobby), try imagining being told one day, out of the blue: ‘Guys, my fancy but completely soulless computer can do as well as many of you. And it can do it in seconds. Wanna compete?’

    Now, imagine it’s your job and not your hobby, the way you earn your living (and pay your rent/mortgage and those ever more expensive bills), and imagine being told: ‘That way you used to earn a living? It’s gone now. It instantly vanished in a magical cloud of 1s and 0s. This AI thing can do in mere seconds something that would take you weeks, and it can do it well enough that quite a few of your customers may not want to spend (a lot more) money to pay you for doing the exact same job, even if you do it much better.’ How happy would you feel about that?

    So, yeah, like you said, it’s kind of an ‘emotional’ topic…

    is there an argument against models that were legally trained?

    Being 100% sure that such a database exists, one that contains no stolen creations, and then that AIs were indeed restricted to it for their training, is already something worth debating and doubting (the second it is not open source), imho.

    There was a similar problem when photography first appeared nearly two centuries ago: many painters rightfully considered photography a threat to their business model, as one could have their portrait (edit: or a picture of a landscape) made in mere minutes (it was longer than that, early-days photography was far from being as quick as we know it now, but you get the idea).

    What happened to them and their practice?

    1. Some painters had to find rich sponsors who were OK to pay in order to get a portrait that would be more unique than a photograph (I know what I would prefer between having my photo taken by even a decent photographer or, say, a painted portrait made by Sargent); others found niche domains where they could still earn a living, while others simply went out of business.
    2. Others decided painting could be much more than just being realistic, like it (mostly) was before photography became a thing, and they quickly started offering us amazing new kinds of painting (impressionism, abstract painting, cubism, expressionism, …)

    And here we are in the 21st century. Painting is still doing fine in its own way (exhibited in art galleries and in the homes of rich people). There are also a lot more hobbyist painters who will paint all they can, including realistic scenes, no matter how much ‘better’ a photo could be. They don’t care. Next to those, there are many photographers taking countless photos (many of which are worthless too), some of them trying (and many failing) to earn a living selling them.

    is it anything beyond the claim that AI art is lifeless?

    Maybe it will get better, most probably it will, but so far I feel really sad for people who are unable to see, to feel and to understand how lifeless and how clueless AI art is.

    Edit: typos (yeah, this was handwritten without the help of any AI :p)

    • MTK@lemmy.worldOP · 1 hour ago

      I get what you are saying. But does it not sound like the horse farmers when the car came out? It sucks, and I don’t blame artists for fighting it and for hating it, but isn’t it inevitable that it will happen to most jobs at some point? I work in cyber security, and it would suck a lot once AI gets good enough to start taking me out of business, but I also accept that it is inevitable, and fighting against technological advances has rarely worked historically.

  • IchNichtenLichten@lemmy.world · 1 day ago

    If you worked hard, learned a craft, and spent countless hours honing it and I took your work without asking you and used it to enrich myself and my talentless tech bro buddies, how would you feel?

    • MTK@lemmy.worldOP · 1 hour ago

      It would suck, but I wouldn’t blame others for enjoying a service that they perceive as convenient. Of course I would blame you for theft/piracy, as I think artists should do with illegally trained models.

  • ℕ𝕖𝕞𝕠@slrpnk.net · 1 day ago

    You can’t understand why people don’t like being stolen from by corporations, and why others don’t want to buy stolen work?

    You can’t understand the difference between digital piracy, humans taking media from corporations for personal use, and the above, corporations taking from humans for commercial use?

  • PonyOfWar@pawb.social · 1 day ago

    For professional artists, AI art is taking away their livelihood. Many of them already lived in precarious conditions in a tough job market before and this is only getting worse now, with companies increasingly relying on cheaper AI art for things like concept art etc.

    For me, as a hobbyist and art consumer, the main issue is AI art invading “my” spaces. I want to look at human-made art and have no interest in AI-generated content whatsoever. But all the platforms are getting flooded with AI content, and all the filters I set to avoid it barely help. Many users on these platforms also roleplay as real artists and pretend their art isn’t AI, which annoys me quite a bit. I don’t mind if people want to look at AI art, but they should leave me alone with it and not force it down my throat.

  • Phen@lemmy.eco.br · 1 day ago

    When a comedian becomes good enough at doing a Stephen Hawking impression, you don’t suddenly expect them to start publishing science studies.

  • kugel7c@feddit.org · 1 day ago

    I have a one-hour video from a digital artist/programmer that will tell you, essentially, why its being lifeless matters.

    Essentially, everything before AI was either of mechanistic natural beauty, derived from biological, chemical and physical processes, like the leaf of a plant, the winding of rivers, the shape of mountains, etc., or it was made by human decisions; there was intentionality, thought and perseverance behind every sentence you read, every object you held or owned, every depiction you would look at.

    And this made the things made by humanity inherently understandable as the result of human decision-making and creativity. You might not agree with the causes and the outcomes of those decisions, but there was something there to retrace, and this retracing, this understanding, made it beautiful, unique or interesting.

    Same with natural objects and phenomena: you could retrace their existence to causes, causes that unfold a world in their own right, leading you to ask questions about their existence, their creation, their process.

    In this retracing, in these real links to people, to land and to nature, lies the real beauty. The life, so to say, is their being part of this network that you can take a peek into, through their art, their creation, their mere existence.

    Now we have a third category: a thing or text or image that exists solely because an imitation machine, an AI, is able to create it and it can fulfill some profit motive. There is no thought and no intentionality behind this writing, this art and so on; it is the result of statistical models built on what existed in the real world, and it robs most if not all of those building blocks just by existing. It fills their place, it takes the energy they needed, the intelligence and decision-making they can create, and uses it to replace them, gradually over time.

    And it doesn’t really give back; it doesn’t create value in the sense that we can retrace and understand what it makes. It’s a statistical result; there are no causes to peek into besides pretty boring math and a collection of data it was trained on, a collection so big and varied that looking at its entirety might as well just be looking at everything. It tells us nothing; it doesn’t lead us to ask what is behind it in the same way.

    https://youtu.be/-opBifFfsMY?si=0yD3BmZSF9ijfGIE

  • sunbrrnslapper@lemmy.world · 1 day ago

    So hear me out… I think AI could be financially very helpful to artists, while giving them a chance to do more meaningful work. Businesses buy a ton of stock photos, graphics and art. An artist could create a library of original digital pieces (they probably already have one) and use that as the source for new AI-generated digital content, which in turn would go back into the source library. This reduces the cost/time associated with soulless stock/business content, but positions the artist to maintain a revenue stream. With the extra time, the artist could work on their preferred pieces or be commissioned to do one-offs.

      • sunbrrnslapper@lemmy.world · 1 day ago

        Because it is not as good, doesn’t have a consistent style (needed for branding), and may put the business at risk of lawsuits. So buying stock images is preferred.

        • weeeeum@lemmy.world · 1 day ago

          It doesn’t matter if it’s half the quality if it’s 1% of the price. Heck, even 0.1% of the price.

          • sunbrrnslapper@lemmy.world · 1 day ago

            I do a fair amount of stock image purchasing, and the stance of the businesses I work with is that it isn’t worth the risk of a lawsuit and embarrassment to get a slightly cheaper image that isn’t as good. It might not be universally true, but that has been my experience at F500 companies.

            • weeeeum@lemmy.world · 1 day ago

              There are a lot of local businesses where I could immediately tell they had AI images on their website. Smaller shops that probably also don’t know the negative connotation of AI, or just don’t care.

  • unexposedhazard@discuss.tchncs.de · 1 day ago

    I think as long as all the training data and the results are public and free to use and modify, there is no moral problem beyond artists’ livelihoods, which is sad but just a part of life. Jobs have come and gone for as long as humans have existed; it’s something we have to accept long term.

    So far, artists themselves are still very good at catching even high-quality AI pictures, though. AI models produce something that only looks like human art on the surface, but it still misses lots of things. In many cases it won’t replace existing art, because often the human and the story behind the art are what make people appreciate it.