False AI-generated images of hurricanes can have real consequences.
Deceptive AI content doesn't just aim to confuse you; it may also be trying to scam you.
Recent research indicates that AI-generated images and videos related to Hurricane Milton could have more serious implications than simply misleading people on social media. Following the passage of the Category 3 hurricane, which wreaked havoc in parts of Florida, platforms like X, TikTok, and Facebook have been flooded with this type of content. Some of these videos and images are easily recognizable as fake, such as one showing a girl hugging an alligator in a boat during the rain. Others are more sophisticated, like the AI-generated images of a flooded Disney World, which deceived many people and were even shared by a Russian propaganda outlet.
AI-generated material is creating not only misinformation but also opportunities for scams. The image of Disney World underwater is already well known across various digital spaces. TikTok, for example, hosts numerous videos of Hurricane Milton causing destruction, some of which are labeled as AI-generated. But even when a video carries such a label, it is easy to imagine another user sharing it out of context. According to Karen Panetta, a professor of electrical and computer engineering at Tufts University, AI-generated content can be used to incite panic and confusion among the public. She explains that "less than 30% of the adult population understands what AI can do," which facilitates the spread of misinformation.
To be clear, the actual damage caused by Hurricane Milton is devastating, but the flood of AI-generated content complicates the public's perception of reality and may foster conspiracy theories around natural disasters. One user on X, for example, questioned the authenticity of footage captured by astronaut Matthew Dominick, suggesting he was not in space; that footage has been verified as genuine. Meanwhile, some satellite images purporting to show Hurricane Milton were debunked as mislabeled or AI-generated imitations.
Experts also urge vigilance against scams that exploit AI-generated images. The Federal Trade Commission issued a warning the day before the hurricane made landfall, cautioning that natural disasters can be exploited for fraud and price gouging. Panetta emphasizes how scammers can use AI to make their tactics more persuasive, such as by creating illegitimate fundraising websites.
A clear example of this emotional manipulation is a viral image of a crying girl holding a puppy after Hurricane Helene, designed to evoke compassion and a willingness to donate. While some internet users may recognize it as fake, many others will not, making them easy targets for fraud.
AI-generated images, while seemingly harmless, often form part of a broader strategy to deceive people. Panetta notes that the deception aims to convince potential contributors of a campaign's legitimacy before asking for a donation. The vulnerability created by natural disasters like Hurricane Milton makes it all the more important to think critically and pause before reacting to any online content. The FTC likewise recommends caution: avoid paying by wire transfer, gift card, or cryptocurrency, and seek out resources that help prevent scams after a weather emergency.