
Beware of fake pornographic videos on the internet.

Pornographic videos in which new software has been used to replace an actress's face with a celebrity's are being deleted by a service that hosted much of the content.

The creation of such videos has become more common since the release of a free tool earlier this month that made the process relatively simple.

The developer says FakeApp has been downloaded more than 100,000 times. It works by using a machine-learning algorithm to create a computer-generated version of the subject’s face.

To do this, it requires several hundred photos of the celebrity in question for analysis, plus a video clip of the person whose features are to be replaced. The results are known as deepfakes.
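The underlying idea, as widely described, is a shared encoder trained on both faces with a separate decoder per identity: encode frames of person B, then decode with person A's decoder to get A's appearance on B's footage. The toy sketch below illustrates that structure with a linear autoencoder on random vectors standing in for face images; all names, sizes, and the training loop are illustrative assumptions, not FakeApp's actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for face images, flattened to vectors (not real photos).
dim, latent = 64, 8
faces_a = rng.normal(size=(100, dim))  # "several hundred photos" of person A
faces_b = rng.normal(size=(100, dim))  # frames of person B from the source clip

# Shared encoder, separate decoders: both identities pass through the same
# encoder, so B's frames decoded with A's decoder come out looking like A.
encoder = rng.normal(size=(dim, latent)) / np.sqrt(dim)
decoder_a = rng.normal(size=(latent, dim)) / np.sqrt(latent)
decoder_b = rng.normal(size=(latent, dim)) / np.sqrt(latent)

def train_step(x, decoder, lr=1e-3):
    """One gradient step on reconstruction error for the linear autoencoder."""
    global encoder
    z = x @ encoder
    err = z @ decoder - x
    # Gradients of squared reconstruction error w.r.t. decoder and encoder.
    decoder -= lr * (z.T @ err) / len(x)
    encoder -= lr * (x.T @ (err @ decoder.T)) / len(x)
    return float((err ** 2).mean())

# Train both decoders against the one shared encoder.
for _ in range(200):
    loss_a = train_step(faces_a, decoder_a)
    loss_b = train_step(faces_b, decoder_b)

# The "swap": encode B's frames, decode them with A's decoder.
swapped = (faces_b @ encoder) @ decoder_a
print(swapped.shape)  # one swapped frame per input frame
```

Real tools use deep convolutional networks and face-alignment preprocessing rather than a linear map, but the swap step is the same trick: encode with the shared encoder, decode with the other identity's decoder.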

Some people have used the technology to create non-pornographic content. These include a video in which the face of Germany’s Chancellor Angela Merkel has been replaced with that of US President Donald Trump, and several in which one movie star’s features have been swapped with those of another.

Many creators uploaded their clips to Gfycat. The service is commonly used to host short videos that are then posted to the social news site Reddit and elsewhere. Gfycat allows adult content, but began deleting some of the deepfakes earlier this week, a move first reported by the news site Motherboard.

“Our terms of service allow us to remove content that we find objectionable. We are actively removing this content,” Gfycat said in a brief statement.

What do you think?

Written by admin1

