When Apple first revealed its child protection features, they were met with a fairly critical response, resulting in a delay of the planned roll-out. The biggest privacy concern—Apple scanning iCloud photos for Child Sexual Abuse Material (CSAM)—is still on hold, but according to Bloomberg, the Messages update is slated for release with iOS 15.2. Apple says the feature won't be on by default, and that image analysis will happen on-device, so the company won't have access to potentially sensitive material.

According to Apple, once enabled, the feature will use on-device machine learning to detect whether photos sent or received in Messages contain explicit material. It will blur potentially explicit incoming images and warn the child, or warn them before they send something that might be explicit.

These new child safety options for Messages should be available in the upcoming iOS 15.2 update, which is expected to roll out sometime this month, according to Macworld.