Apple expresses concern over AI turning real photos into 'fantasy'.
Helping to convey accurate information, the company says, is essential, not creating fantasy.
The upcoming introduction of Apple Intelligence has prompted the iPhone maker to reflect on the authenticity of photographs. In a recent talk, Apple’s software chief, Craig Federighi, said the company wants its AI-powered image editing tools to preserve photographic truthfulness. Federighi stressed the importance of conveying accurate information rather than creating illusions: “Our products, our phones, are widely used.”
The launch of iOS 18.1 brings a new feature called “Clean Up,” which lets users quickly remove objects and people from images. According to Federighi and journalist Joanna Stern, the option is more conservative than the editing tools offered by competitors such as Google and Samsung, which can add AI-generated elements to photos. Even with Clean Up’s limited scope, Federighi said its inclusion prompted considerable internal debate.
“Do we want to make it easier to remove that water bottle or that microphone? Because that water bottle was present when you took the photo,” Federighi reflected during a demonstration of the feature. He noted that users strongly want to remove details that feel superfluous and whose removal does not fundamentally change the meaning of the image.
Federighi also voiced Apple’s concern about the effect artificial intelligence could have on the perception of photographs as reliable records of reality. Editing tools such as Google’s Reimagine let users add unreal elements to images from a text description alone, which could undermine trust in photography.
Unlike competing services, Apple Intelligence currently does not let users add AI-generated manipulations to images. Photos edited with Clean Up carry a “Modified with Clean Up” label in the Photos app and include metadata indicating that they have been altered. The approach echoes the Adobe-led Content Authenticity Initiative, whose “Content Credentials” system helps distinguish unaltered images from AI-generated forgeries. It remains unclear whether Apple’s metadata system will be compatible with those credentials.
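For developers curious how such an edit marker might surface in a file, the sketch below (Swift, using Apple’s ImageIO framework) reads a photo’s standard metadata dictionaries and prints fields where a flag like this could plausibly appear. Apple has not published the exact key it writes for Clean Up edits, so the file path and the specific fields inspected here are illustrative assumptions, not the documented mechanism.

```swift
import Foundation
import ImageIO

// A minimal sketch: dump metadata fields of a photo where an edit marker
// might plausibly be recorded. Apple has not documented the exact key used
// for Clean Up edits, so this only inspects the standard TIFF/Exif dictionaries.
func inspectEditMetadata(at url: URL) {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [String: Any] else {
        print("Could not read image properties")
        return
    }

    // The TIFF "Software" tag usually records the tool that last wrote the file.
    if let tiff = props[kCGImagePropertyTIFFDictionary as String] as? [String: Any],
       let software = tiff[kCGImagePropertyTIFFSoftware as String] as? String {
        print("Software tag: \(software)")
    }

    // Exif user comments are another place vendors sometimes note edits.
    if let exif = props[kCGImagePropertyExifDictionary as String] as? [String: Any],
       let comment = exif[kCGImagePropertyExifUserComment as String] as? String {
        print("Exif user comment: \(comment)")
    }
}

// Hypothetical file path, for illustration only.
inspectEditMetadata(at: URL(fileURLWithPath: "/path/to/edited-photo.heic"))
```

If Apple adopts Content Credentials, this kind of per-field inspection would be replaced by verifying the image’s cryptographically signed provenance manifest rather than reading loose metadata tags.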