Where there’s object detection, there’s CSAM detection.
This is not true at all. A model has to be trained to detect specific things; it does not automatically inherit the ability to detect CSAM just because it can detect other objects. The method Apple previously used for CSAM image detection, perceptual hashing (sketched below), was killed over privacy problems with its implementation, and the article specifically notes that
Tsai argues Apple’s approach is even less private than its abandoned CSAM scanning plan “because it applies to non-iCloud photos and uploads information about all photos, not just ones with suspicious neural hashes.”
So even images that the local detection model doesn’t match to CSAM would still have information about them uploaded to Apple’s servers.
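For what it’s worth, here is a rough sketch of what perceptual hashing actually does: it reduces an image to a compact fingerprint and compares that fingerprint against a database of known hashes, which is a very different thing from general object detection. This is a toy average-hash in Python, not Apple’s NeuralHash; the file names and the match threshold are made up for illustration.

```python
# Toy average-hash sketch of perceptual hashing (illustrative only,
# not Apple's NeuralHash). File names and threshold are hypothetical.
from PIL import Image

def average_hash(path, size=8):
    # Shrink to an 8x8 grayscale thumbnail so the hash reflects
    # coarse structure rather than exact pixel values.
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    avg = sum(pixels) / len(pixels)
    # Each bit records whether a pixel is brighter than the average.
    return "".join("1" if p > avg else "0" for p in pixels)

def hamming_distance(h1, h2):
    # Count differing bits between two hashes.
    return sum(a != b for a, b in zip(h1, h2))

# Visually similar images produce hashes that differ in only a few bits,
# so a small distance against a known hash counts as a match.
if hamming_distance(average_hash("photo.jpg"), average_hash("known.jpg")) < 10:
    print("possible match")
```

The point is that this kind of matching only flags images close to hashes already in a database; it can’t “detect” anything it hasn’t been given a reference for, which is why object detection and CSAM detection are not the same capability.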
Apple killed its last version in August 2023 because it didn’t respect privacy.
It was also not that good.