Fake celebrity sex content scheme uncovered

by admin

NBC reports that pornographic ‘deepfakes’ have surfaced featuring the likenesses of major TikTok stars such as Addison Rae Easterling, Charli D’Amelio, and Bella Poarch.

Recently, a supposed image of Addison Rae, in which the famous TikToker appeared lying on a bed in a seductive pose, went viral, generating more than 21 million views in just a few hours.

In the photo in question, which was removed from Twitter, Addison Rae’s face had been superimposed onto another woman’s body.

In the hours that followed, at least nine accounts posting pornographic ‘deepfakes’ of celebrities were discovered and deleted, although Charli D’Amelio’s fake pornographic content was not removed.

Hollywood actresses such as Julia Roberts, Emma Watson, and Scarlett Johansson have previously been victims of this type of content.

“It’s like feeling violated. That’s how you feel when you see yourself naked against your will and it’s spread all over the internet,” said streamer QTCinderella.

What are deepfakes?

A deepfake is an image or video manipulated by artificial intelligence that blurs the line between what is real and what is fake, leading viewers to believe that what they are seeing is genuine.

In the case of sexual deepfakes of celebrities, explicit or pornographic content is shown in manipulated videos or images in which a celebrity’s face is superimposed on another person’s body.

FBI warns of danger of ‘Sextortion’ with deepfakes

According to the FBI, the extortionists demand that victims provide money or gift cards in exchange for not sharing the fake images or videos with family members or friends on social media, or that victims provide real sexually themed images or videos of themselves.

“The FBI continues to receive reports of victims, including minor children and non-consenting adults, whose photos or videos were altered into explicit content.”

“The photos or videos are then publicly distributed on social media or pornographic websites for the purpose of victim harassment or ‘Sextortion’ (extortion with sexual content) schemes.”

“Although seemingly innocuous when posted or shared, images and videos can provide malicious actors with a plentiful supply of content to exploit for criminal activity. Advances in content creation technology and personal images accessible online present new opportunities for malicious actors to find and target victims. This leaves them vulnerable to embarrassment, harassment, extortion, financial loss or ongoing re-victimisation in the long term.”
