Deepfake software has become increasingly sophisticated and easy to use, which means nearly anyone can now create explicit material using someone’s image.
Young Australian Noelle Martin found this out to her horror when she discovered explicit images of her face on the bodies of adult film actresses online. She had never had a boyfriend, never taken a nude picture, and was just 18 years old.
Although she felt all the emotions you would expect of someone targeted by image-based abuse, she also committed herself to advocating for new laws to punish perpetrators and to make it easier to have images and videos removed.
Five years on, Noelle is still a target of image-based abuse, as well as of trolling and hateful comments. None of this has silenced her, forced her offline or made her give up the advocacy that saw her named WA Young Australian of the Year in 2018.
Unfortunately, there is no technical fix for deepfakes, as detection software cannot keep pace with the developing technology. Legal changes are also only a partial solution: many perpetrators cannot be identified, and it can take years to have material taken down, if it is removed at all.
According to Mary Anne Franks, president of a cybersafety initiative, what must be addressed is the root cause: men’s desire to sexually objectify women without their consent.
Read the full article by Kristy Melville at: The insidious rise of deepfake porn videos — and one woman who won’t be silenced
Feature image source: Pixabay