Fake celebrity porn is all over Reddit thanks to a new app

Back in December, it was discovered that Reddit users were creating fake pornography using celebrity faces pasted on to adult film actresses’ bodies. 

The disturbing videos, created by Reddit user deepfakes, look strikingly real as a result of a sophisticated machine learning algorithm, which uses photographs to create human masks that are then overlaid on top of adult film footage. 

Now, AI-assisted porn is spreading all over Reddit, thanks to an easy-to-use app that can be downloaded directly to your desktop computer, according to Motherboard. 

Star Wars lead Daisy Ridley has been featured in a fake video on the Reddit thread. One of the site’s users, deepfakeapp, created a desktop application called FakeApp that lets users take adult film footage and swap any female celebrity’s face onto porn actresses’ bodies.

The app, called FakeApp, uses deepfakes’ algorithm, but doesn’t require any knowledge of coding. 

Starlets Emma Watson, Daisy Ridley, Katy Perry and Cara Delevingne have all shown up in doctored videos on the site. 

Reddit user deepfakeapp is behind the new application and told Motherboard that he hopes to build a library of public data that can easily be swapped into any video at any time. 

‘Eventually, I want to improve it to the point where prospective users can simply select a video on their computer, download a neural network correlated to a certain face from a publicly available library, and swap the video with a different face at the press of one button,’ the user told Motherboard. 

Pictured, FakeApp is a new desktop application that lets users face-swap celebrities and porn stars. The app has blown up and now the Reddit thread deepfakes boasts 15,000 subscribers

A plethora of high-profile actresses and pop stars have shown up on the Reddit thread, which now boasts 15,000 subscribers.

While FakeApp has made it easier for users to churn out tons of fake AI porn, it’s not without its kinks. 

The app takes eight to 12 hours of processing time to make one short video. 

Some of the fake videos have been pitched as being authentic on other websites, Motherboard noted. 

One FakeApp user created a video of Emma Watson taking a shower, which was picked up on Celebjihad, a site that posts hacked celebrity nudes, Motherboard noted. 

Pictured, Katy Perry’s face was swapped onto an adult film actress’ body for a short video

One redditor superimposed Jessica Alba’s face on a porn performer’s body using FakeApp. A slew of pop stars and actresses have popped up in the deepfakes thread since a user created an easy-to-use desktop application that lets users build their own AI-assisted porn videos

FakeApp and the practice of AI-assisted porn have shed a startling light on what could happen when machine learning falls into the wrong hands.

One particularly convincing video showed Wonder Woman star Gal Gadot performing in a short adult film. 

It was made by training a machine learning algorithm on stock photos, Google search images, and YouTube videos of the star – and experts warn the technique is ‘no longer rocket science.’ 

A troubling new video that appears to show Wonder Woman star Gal Gadot performing in a short porn film has shed startling light on what could happen when machine learning falls into the wrong hands

The unsettling video, spotted by Motherboard, might not fool anyone, but it is a stark reminder of growing concerns over how easily machine learning could be used to create fake porn starring a particular person without their consent, along with other malicious content.

And, it’s not the first.

Deepfakes has made similar videos of other stars, too, including Taylor Swift and Game of Thrones’ Maisie Williams, according to Motherboard, which says it has notified the management companies and publicists of those affected.

The Redditor relied on open-source machine learning tools to create the fake porn videos.

WHY IS FAKE CELEBRITY PORN MADE BY AN AI SO CONCERNING?

The video, created by Reddit user deepfakes, features a woman who takes on the rough likeness of Gadot, with the actor’s face overlaid on another person’s head. A clip from the video is shown

To create the likeness of Gal Gadot, for example, the algorithm was trained on real porn videos and images of the actor, allowing it to create an approximation of the actor’s face that can be applied to the moving figure in the video.

As all of this is freely available information, it could be done without that person’s consent. 

And, as Motherboard notes, people today are constantly uploading photos of themselves to various social media platforms, meaning someone could use such a technique to harass someone they know.

‘I just found a clever way to do face-swap,’ deepfakes told Motherboard.

‘With hundreds of face images, I can easily generate millions of distorted images to train the network.

‘After that if I feed the network someone else’s face, the network will think it’s just another distorted image and try to make it look like the training face.’

The amateur video has worrying implications, showing how freely available resources could be used to create fake films in just a matter of days or even hours.

The video was made by training a machine learning algorithm on stock photos, Google search images, and YouTube videos of the star (pictured above as Wonder Woman) – and experts warn the technique is ‘no longer rocket science’

‘Everyone needs to know just how easy it is to fake images and videos, to the point where we won’t be able to distinguish forgeries in a few months from now,’ AI researcher Alex Champandard told Motherboard.

‘Of course, this was possible for a long time but it would have taken a lot of resources and professionals in visual effects to pull this off.

‘Now it can be done by a single programmer with recent computer hardware.’ 

Read more at DailyMail.co.uk