What happens if you are the victim of a deep fake?

In recent months, the case of Taylor Swift, the well-known singer-songwriter who was the subject of fake AI-generated images that immediately went viral, has heightened fears about the dangers of AI-based tools.

“Consider the case of Taylor Swift,” notes Morgan Wright, Chief Security Advisor at SentinelOne. “She is a very wealthy woman, has an army of lawyers at her disposal, and has connections all over the world. Even she had difficulty defending herself from the sexually explicit deepfake images circulating on social media. Although she managed to have them blocked (it took twenty-four hours), the images continue to circulate on thousands of other sites, and the problem is even greater for those with fewer financial resources than Taylor Swift.”

Typically, an attacker obtains audio, video, and graphic reproductions of the chosen target in order to pursue a specific objective. Depending on the purpose, this may involve superimposing one person's face onto another's body (usually in a pornographic context, as this tends to inflict the greatest reputational damage). The resulting image, video, or audio is then circulated on social media under gossip-style headlines to attract attention and trigger a viral spread.

“There are other varieties of deepfakes,” continues Wright, “including those of a political, relational, or retributive nature, and those aimed at negative influence (manipulation by governments to sway opinions), as the capabilities to generate these harmful reproductions have become easier to use and cheaper.”

What are the first things to do if you fear you are a victim of a deepfake?

“The advice is to immediately notify the postal police and file a report,” concludes Wright. “Furthermore, depending on the platform involved, there are provisions that allow the affected person to submit a complaint to the service provider and report the issue. After a deepfake incident, sharing photos, videos, or audio files on social media is not recommended. As far as possible, we should remove what we know is in circulation, or take accounts offline until the risk can be assessed.”

In conclusion, we suggest reviewing all the privacy settings of your social channels in order to limit access to your online content. It won't be easy, but it can help limit future impact.
