Apple’s iOS 15.2 beta update adds child safety features to the Messages app

Apple has started rolling out the iOS 15.2 beta update, which brings one of the child safety features the company announced earlier this year, albeit with slight modifications. The latest update adds a communication safety feature to the Messages app that, as the name suggests, aims to keep kids safe online.

The new feature is not enabled by default and has to be turned on manually in the Messages settings. Once enabled, the app can reportedly detect nudity in images sent or received by children. If a nude image is sent or received, it will automatically be blurred and the child will be warned about the content, according to a report by MacRumors.

The app will reportedly also offer the child resources to contact someone they trust for help. If a child receives a nude image, the app will encourage them not to view it. It's worth noting that when Communication Safety was first announced, Apple said parents of children under the age of 13 would have the option to be notified if their child viewed a nude image in Messages.

However, Apple appears to have dropped this parental notification option, as it could put children at risk in situations involving parental violence or abuse. Instead, the feature now steers children toward guidance from a trusted adult.

The company says the Messages app analyzes image attachments on the device to check for nudity, and that the feature does not affect user privacy since messages remain end-to-end encrypted. Apple still has no access to Messages.
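For illustration, here is a minimal Swift sketch of how such an on-device screening flow could be structured. The names (`NudityClassifier`, `ScreeningResult`) and the confidence threshold are assumptions made up for the example, not Apple's actual, non-public implementation; the point is that classification runs locally on the decrypted attachment, so nothing extra leaves the device.

```swift
import Foundation

// Hypothetical sketch: NudityClassifier and ScreeningResult are illustrative
// names, not Apple's private implementation.

/// Result of an on-device check; the image never leaves the device.
enum ScreeningResult {
    case safe
    case sensitive(confidence: Double)
}

/// Placeholder for an on-device ML classifier.
struct NudityClassifier {
    /// Runs entirely on device, so the end-to-end encryption of the
    /// message payload is unaffected.
    func screen(imageData: Data) -> ScreeningResult {
        // A real implementation would run a local ML model here.
        return .safe
    }
}

/// Decides how an incoming attachment should be presented to a child account.
func presentation(for imageData: Data, classifier: NudityClassifier) -> String {
    switch classifier.screen(imageData: imageData) {
    case .safe:
        return "show"            // display normally
    case .sensitive(let confidence) where confidence > 0.9:
        return "blur-and-warn"   // blur, warn, and offer trusted-adult resources
    case .sensitive:
        return "blur"            // blur only at lower confidence
    }
}
```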

In addition, Apple announced another safety feature a few months back: CSAM (Child Sexual Abuse Material) detection. This is separate from the Communication Safety feature and is expected to roll out in the future.

With this feature, the Cupertino giant aims to detect child sexual abuse and trafficking imagery in iCloud Photos. However, the launch was delayed while Apple addresses concerns raised earlier by privacy advocates. The CSAM detection feature is meant to find images of child sexual abuse by matching a user's iCloud Photos against a list of known CSAM hashes, an approach that raised privacy concerns. If the feature detects enough matches, it alerts Apple's moderators, who can then disable the account and report the images to legal authorities.
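As a rough illustration of the threshold idea, the Swift sketch below counts matches of photo hashes against a set of known hashes and only escalates once a threshold is reached. The `PerceptualHash` type, the example threshold, and the matcher itself are hypothetical simplifications; Apple's actual system uses NeuralHash together with cryptographic techniques such as private set intersection, which this sketch does not attempt to reproduce.

```swift
import Foundation

// Hypothetical sketch of threshold-based matching against a list of known
// image hashes. All names and values here are illustrative assumptions.

typealias PerceptualHash = UInt64

struct CSAMMatcher {
    let knownHashes: Set<PerceptualHash>   // database of known-image hashes
    let reviewThreshold: Int               // matches required before human review

    /// Counts how many of a user's photo hashes appear in the known set and
    /// reports whether the threshold for escalating to moderators is reached.
    func shouldEscalate(photoHashes: [PerceptualHash]) -> Bool {
        let matchCount = photoHashes.filter { knownHashes.contains($0) }.count
        return matchCount >= reviewThreshold
    }
}

// Usage: a single match stays below the threshold, so nothing is flagged;
// only repeated matches would surface an account for human review.
let matcher = CSAMMatcher(knownHashes: [0x1234, 0x5678], reviewThreshold: 30)
print(matcher.shouldEscalate(photoHashes: [0x1234]))   // false
```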
