(MENAFN - Live Mint) Channel 4 News presenter Cathy Newman has opened up about the distressing experience of discovering deepfake pornography featuring her, describing it as "haunting". The 50-year-old journalist became her own "case study" in a Channel 4 investigation, which found that at least 250 British celebrities have been targeted by this invasive phenomenon.
Deepfake pornography uses artificial intelligence (AI) to superimpose individuals' faces or bodies onto explicit images or videos without their consent. Newman's ordeal came to light during her appearance on ITV's Good Morning Britain (GMB) on Tuesday. She shared how colleagues had stumbled upon a deepfake pornographic video of her during their research for the investigation.
Speaking about her reaction, Newman said, "I thought it would be water off a duck's back, given the stories I cover daily. But I found myself returning to the images; it was haunting. The worst part is not knowing who created this video or why."
Newman's case forms part of a broader Channel 4 investigation, which revealed that the victims of deepfake pornography are overwhelmingly women. Of the nearly 4,000 famous individuals listed on the analysed websites, 250 were British, and only two of those were men.
Cally Jane Beech, a former Love Island contestant, also shared her experience on GMB. The 33-year-old recounted being alerted to an explicit, digitally altered image of herself. The photo, originally from an underwear campaign, had been manipulated to remove her clothing.
"I was shocked and didn't know how to feel. I realised the scale of the issue when I spoke out on social media and received a flood of messages from others saying this had happened to them too," Beech said.
Beech expressed concern for her young daughter after learning that paedophiles have reportedly exploited similar AI technology. However, when she contacted the police, she was told there was little they could do since the image was not real.
The government is now taking steps to address the growing threat posed by deepfake pornography. A new offence targeting the creation and sharing of such images is set to be introduced. This builds on legislation from 2023 aimed at tackling the non-consensual sharing of intimate images, including deepfakes.
Channel 4's investigation also highlighted the role of search engines in facilitating access to these sites, with over 70 per cent of visitors arriving via platforms like Google.
Newman concluded, "It is terrifying to think about the scale of this problem and how it is affecting so many women. We need stronger laws and enforcement to stop this abuse."
(With inputs from PA Media)