More than 250 British celebrities have been victims of deepfake porn, according to a new investigation.
Among them is news presenter Cathy Newman, who said she felt violated after watching digitally altered footage in which her face was superimposed on to pornography using artificial intelligence (AI).
Channel 4 aired its investigation on Thursday night and said it analysed the five most visited deepfake websites, finding that 255 of the almost 4,000 famous individuals listed were British, with all but two being women.
In her report, Newman watched the deepfake footage of herself and said: “It feels like a violation.
“It just feels really sinister that someone out there who’s put this together, I can’t see them, and they can see this kind of imaginary version of me, this fake version of me.
“You can’t unsee that. That’s something that I’ll keep returning to.
“And just the idea that thousands of women have been manipulated in this way, it feels like an absolutely gross intrusion and violation.
“It’s really disturbing that you can, at a click of a button, find this stuff, and people can make this grotesque parody of reality with absolute ease.”
Channel 4 News said it contacted more than 40 celebrities for the investigation, all of whom were unwilling to comment publicly.
The broadcaster also said it found that more than 70% of visitors arrived at deepfake websites via search engines such as Google.
Advances in AI have made it easier to create digitally altered and fake images.
Industry experts have warned of the danger posed by AI-generated deepfakes and their potential to spread misinformation, particularly in a year that will see major elections in many countries, including the UK and the US.
Earlier in the year, deepfake images of pop star Taylor Swift were posted to X, formerly Twitter, and the platform blocked searches linked to the singer after fans lobbied the Elon Musk-owned platform to take action.
The Online Safety Act makes it a criminal offence to share, or threaten to share, a manufactured or deepfake intimate image or video of another person without their consent, but it is not intended to criminalise the creation of such deepfake content.
In its investigation, Channel 4 News claimed the people most targeted by deepfake pornography are women who are not in the public eye.
Newman spoke to Sophie Parrish, who started a petition before the law was changed, after the person who created digitally altered pornography of her was arrested by police but did not face any further legal action.
She told the PA news agency in January that she was sent Facebook messages from an unknown user, which included a video of a man masturbating over her and using a shoe to pleasure himself.
“I felt very, I still do, dirty – that’s one of the only ways I can describe it – and I’m very ashamed of the fact that the images are out there,” she said.
Tory MP Caroline Nokes, who is chairwoman of the Women and Equalities Committee, told Channel 4 News: “It’s horrific… this is women being targeted.
“We need to be protecting people from this sort of deepfake imagery that can destroy lives.”
In a statement to the news channel, a Google spokesperson said: “We understand how distressing this content can be, and we’re committed to building on our existing protections to help people who are affected.
“Under our policies, people can have pages that feature this content and include their likeness removed from Search.
“And while this is a technical challenge for search engines, we’re actively developing additional safeguards on Google Search – including tools to help people protect themselves at scale, along with ranking improvements to address this content broadly.”
Ryan Daniels, from Meta, said in a statement to the broadcaster: “Meta strictly prohibits child nudity, content that sexualises children, and services offering AI-generated non-consensual nude images.”