Apple Takes Baby Steps Toward CSAM Protections
Back in 2021, Apple proposed a new initiative that would scan iOS devices for child sexual abuse material (CSAM), but quickly rolled it back after digital privacy rights groups protested. Apple, it seems, is always more about privacy than protection when it comes to kids.
More recently, however, the company announced that it's launching stronger protections for kids — though not nearly as strong as we'd like. Here's what it will look like:
- Communication Safety helps protect your child from viewing or sharing photos or videos that contain nudity.
- If Communication Safety detects that your child is receiving or attempting to send this type of photo or video, it blurs the content before your child can view it on their device, warns them, and gives them options to stay safe.
- It also provides guidance and age-appropriate resources to help them make a safe choice, including the option to contact someone they trust.
This new initiative only addresses the sending and receiving of this material on a child's device — it doesn't do anything about the existing CSAM that is all over the internet. But it's a step in the right direction. Parents will need to turn on Communication Safety in Settings when the feature rolls out globally.