- cross-posted to:
- technology@lemmit.online
Study shows AI image-generators being trained on explicit photos of children::Hidden inside the foundation of popular artificial intelligence image-generators are thousands of images of child sexual abuse, according to a new report that urges companies to take action to address a harmful flaw in the technology they built.
All of our "protect the children" legislation is typically about inhibiting technology that might be used to cause harm, not about ensuring children have access to places of safety, adequate food and comfort, time with and access to their parents, and the freedom to live and play.
Y’know, all those things that help make kids resilient to bullies and the challenges of growing up. Once again, we leave our kids cold and hungry in poverty while blaming the next new thing for their misery.
So I call shenanigans. Again.
It’s still abhorrent, but if AI-generated images prevent an actual child from being abused…
It’s a nuanced topic for sure.
We need to better understand what causes pedophilic tendencies, so that the environmental, social and genetic factors can someday be removed.
Otherwise children will always be at risk from people who have perverse intentions, whether or not those people are responsible for those intentions.
I don’t think it’ll ever be gotten rid of. At its core, pedophilia is a fetish, not functionally different from being into feet. And like some fetishes, having it doesn’t mean a person will ever act on it.
I’m sure that many of them hate the fact that they are wired wrong. What really needs to happen is for them to have the ability to seek professional help without worrying about legal repercussions.