OpenAI now tries to hide that ChatGPT was trained on copyrighted books, including J.K. Rowling’s Harry Potter series: A new research paper laid out ways in which AI developers should try to avoid showing that LLMs have been trained on copyrighted material.

    • TwilightVulpine@lemmy.world · 1 year ago

      You joke, but AI advocates seem to forget that people have fundamentally different rights than tools and objects. A photocopier doesn’t get the same right to “memorize” and “learn” from a text that a human being has. As much as people may argue that AIs work differently, AIs are still not people.

      And if they ever become people, the situation will be much more complicated than whether they can imitate some writer. But we aren’t there yet; even their advocates just use them as tools.

        • TwilightVulpine@lemmy.world · 1 year ago (edited)

          But this falls exactly under what I just said. To say that using Machine Learning to imitate an artist without permission is fine, because humans are allowed to learn from each other, is making the mistake of assigning personhood to the system, as if it ought to have the same rights that human beings do. There is a distinction between the rights of humans and those of tools, so saying that an AI can’t be trained on someone’s works to replicate their style doesn’t need to apply to people.

          Even if you support that reasoning, it still doesn’t help the writers and artists whose jobs are threatened by AI models based on their work. That the output isn’t an exact reproduction doesn’t change that the model relied on their works to begin with, and it doesn’t change that it serves as a way to undercut them, providing a cheaper replacement for their work. Copyright law as it was written wasn’t envisioned for a world where Machine Learning exists. It doesn’t really solve the problem to say that copyright technically isn’t supposed to cover ideas and styles. The creators will be struggling just the same.

          Either the law will need to emphasize the value of human authorship first, or we will need to go through drastic socioeconomic changes to ensure that these creators will be able to keep creating despite losing their market to AI. Otherwise, simply saying that AI gets to do this while changing nothing else will cause enormous damage to all sorts of creative careers and to wider culture. Even AI will become more limited with fewer fresh new creators to learn elements from.

      • kmkz_ninja@lemmy.world · 1 year ago

        How do you see that as a difference? Tools are extensions of ourselves.

        Restricting the use of LLMs is only restricting people.

        • TwilightVulpine@lemmy.world · 1 year ago

          Once we get into the realm of automation and AI, calling tools just an “extension of ourselves” no longer makes sense.

          Especially not when the people being “extended” by Machine Learning models did not want to be “extended” to begin with.