A new safety feature in iMessage encourages children to report explicit images to Apple.
Apple is expanding the safety features built into its messaging tools.
Apple is rolling out a new child safety feature that lets children report photos or videos containing nudity directly to the company. It builds on Communication Safety, the existing protection that uses on-device scanning to detect nudity in images and videos sent or received through Messages, AirDrop, or Contact Posters, and blurs that content in real time.
When nude content is detected, the system blurs the image or video and presents a pop-up with several options, such as contacting an adult, accessing help resources, or blocking the sender. The new option, currently being tested in Australia as part of iOS 18.2, adds the ability to report any image or video containing nudity to Apple.
The device will generate a report containing the offending media, the messages exchanged immediately before and after it was received, and the contact information of both accounts; users will also be able to fill out a form describing what happened. Apple will review the report and may take action such as restricting the account's ability to send iMessages or notifying the relevant authorities.
In related news, Google this week announced an expansion of on-device scanning in its Messages app for Android, adding an optional sensitive content warning that blurs images containing nudity and points users to resources for getting help. The warning will be enabled by default for users under 18.
Apple plans to roll out the feature worldwide but has not specified a release date, and the company did not immediately respond to a request for comment. In 2021, Apple announced a set of child safety features that included scanning users' iCloud Photos libraries for child sexual abuse material and alerting parents if their children sent or received explicit photos. After backlash from privacy advocates, however, Apple postponed the features and ultimately abandoned the iCloud scanning plan in December 2022.