A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite some limitations in data and complexity, the chatbot showed promise in deterring illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub’s UK site, with hopes for similar measures across other platforms to create a safer internet environment.
I was wondering what sort of phrases trigger that notification, but listing them here might be a bit counterproductive.
I’m not sure if it’s related, but as a life-long miniskirt lover I’ve noticed that many sites no longer return results for the term “schoolgirl” and instead you need to search for “student”.
ML models have been shown to be extraordinarily good at statistically predicting related terms, so the list of covered words is probably comprehensive.
I think the other article mentions it’s a manually curated list: while ML can surface the right words, it also picks up random stuff, so a human needs to check it isn’t making spurious connections. It’s pretty interesting how it all works.
I’d be very curious what these terms are, but I wouldn’t be surprised if “pizza guy” or “school uniform” would trigger a response.
Obviously don’t google this, but IIRC one of the terms used was “lemon party”.
Can you very loosely tell me what that is so I don’t have to google it?
Lemon party was a bunch of old naked dudes sat in a group, I think… Might’ve been involving themselves with each other? It’s been a fucking loooong ass time since I got shown that and meatspin at school lol
Really?
hahaha… it saddens me that only those >30yrs old may get this.
Hey now, I understood that reference and I’m… only… 27.
30 years draws ever nearer.
Old school