A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite some limitations in the data and the complexity of measuring deterrence, the chatbot showed promise in discouraging illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub’s UK site, with hopes that similar measures will be adopted across other platforms to create a safer internet environment.

  • root@precious.net · 9 months ago

    Spot on. Claims about the availability of CSAM were overblown by a well-funded special-interest group (Exodus Cry). The articles about it were pretty much ghostwritten by them.

    When you’re the biggest company in porn, you’ve got a target on your back. In my opinion they removed all user content to avoid even the appearance of supporting CSAM, not because they were guilty of anything.

    Pornhub has been very open about normalizing healthy sexuality for years, while also providing interesting data access to both scientists and the general public.

    “Exodus Cry is an American Christian non-profit advocacy organization seeking the abolition of the legal commercial sex industry, including pornography, strip clubs, and sex work, as well as illegal sex trafficking.[2] It has been described by the New York Daily News,[3] TheWrap,[4] and others as anti-LGBT, with ties to the anti-abortion movement.[5]”

    https://en.wikipedia.org/wiki/Exodus_Cry

    • azertyfun@sh.itjust.works · 9 months ago

      They’re the fuckers who almost turned OF into Pinterest as well? Not surprising in retrospect. The crazy thing is how every news outlet ran with the narrative, and how flaky payment processors are with adult content. De-platforming sex work shouldn’t be this easy.