The police investigation remains open. The photo of one of the minors included a fly; that is the logo of Clothoff, the application presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”

  • @them@lemmy.world
    41 · 10 months ago

    Yes, let’s name the tool in the article so everybody can participate in the abuse.

    • RaivoKulli
      31 · 10 months ago

      I doubt not naming it will do much of anything.

      • DarkThoughts
        11 · 10 months ago

        Considering that AI services typically cost money, especially those advertising adult themes, it kinda does support the hosts of such services.

        • RaivoKulli
          12 · 10 months ago

          Then again, naming and shaming puts pressure on them too. But in the end I doubt it matters. Those who want to use them will find them.

          • DarkThoughts
            4 · 10 months ago

            Of course, and that isn’t even the problem; the problem is people using the edited pictures for things like blackmail or whatever. From a technical standpoint it isn’t too dissimilar to old-fashioned photoshopping. Face swapping can probably even produce much higher-quality results, especially if you have a lot of source material to pull from (you want matching angles for an accurate-looking result). Those AI-drawn bodies often have severe anatomical issues that make them very obvious and look VERY different from the advertisement materials.

          • @30p87@feddit.de
            1 · 10 months ago

            True. Especially as just googling ‘undress AI free’ yields tons of results, which may be more or less legit.

    • @Rediphile@lemmy.ca
      6 · 10 months ago

      You can literally Google ‘AI nude generation tool’ and get multiple results already. And I do sort of agree with you, as I’m not sure naming this specific tool was necessary or beneficial here. But I don’t think not naming it is going to prevent anyone interested in such a tool from finding one. The software/tool itself is (currently) not illegal.