- cross-posted to:
- technology@lemmy.world
shared via https://feddit.de/post/2805371
Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.
Sure they do, but if they have to consume it, would you rather a real child suffer for that, or just an AI-generated one?
Neither. I would rather they have mental health supports that are accessible to them.
Of course we don’t want either, but it comes across as if you’re dismissing a possible direction toward a solution to the definitely worse outcome (real-life suffering) out of a purely emotional knee-jerk reaction.
Mental health support is already available, and real CSAM is still being generated. I’d suggest we look into both options: advancing the ways therapists can help, and at least having an open discussion about sensitive solutions that might feel counter-intuitive at first.