Video Expert Believes Deepfake Porn Will Be Easy to Create Very Soon

Last updated August 26, 2019
Written by:
Bill Toulas
Infosec Writer

As Shamir Alibhai told the Daily Star today, we are very close to the day when making deepfake porn will be as easy as using Instagram filters. The San Francisco-based video technology and AI expert, founder and CEO of “Simon Says” and “Amber Video”, believes we will soon enter an age where distinguishing between real and fake videos will be impossible, and the sheer volume of fake content will make things completely chaotic. The key to the creation of fake videos at such an unprecedented scale is the development of the relevant technology, which has almost reached adequate levels.

We saw the first signs of this with the “DeepNude” tool that was released in June 2019 and took the internet by storm. The application was offered for $50, or for free with a watermark on the generated images. It used generative adversarial networks to remove clothing from images of women, and it was technically simple and not the product of a tech giant. This highlighted the fact that we’re no longer talking about exotic technology, but something tangible. The internet’s reaction was frightening enough for the creators of “DeepNude” that they retracted the tool on June 27 and refunded their premium subscribers, claiming that the world just wasn't ready for their app yet.

However, the message was clear to everyone. As Mr. Alibhai states, we are bound to start seeing fake footage of politicians made in attempts to create damaging scandals, as well as regular people being extorted by malicious actors wielding fake evidence. According to the expert, the consequences would be dire, with society turning cynical and trust in others being fundamentally shaken.

As Alibhai points out, the only way to fight this would be video authentication tools, themselves based on AI technology, but this will surely turn into a cat-and-mouse game very quickly. The more advanced the deepfake creation tools get, the harder authentication becomes, and the vicious circle will go on with unpredictable results. The expert believes that what we are most likely to see first is fake audio going mainstream, as it is easier to produce right now, and it’s more than enough in most cases of human deception. Next will come the manipulation of existing video and audio, such as CCTV and police bodycam recordings. Finally, we will get fully crafted deepfake videos that require minimal input from their creators. If you want to learn more about how deepfakes are created, and how they can be abused, check out this piece.
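The article doesn't describe how Alibhai's AI-based authentication tools work, but one complementary approach used in this space is cryptographic provenance: fingerprinting footage at capture time (e.g., on a bodycam) so that any later manipulation is detectable. This is a minimal illustrative sketch, not a description of any specific product; the device key and footage bytes are hypothetical placeholders.

```python
import hashlib
import hmac

# Hypothetical secret key held by the recording device (e.g., a bodycam).
DEVICE_KEY = b"example-device-secret"

def fingerprint(video_bytes: bytes) -> str:
    """Compute an HMAC over the raw footage at capture time."""
    return hmac.new(DEVICE_KEY, video_bytes, hashlib.sha256).hexdigest()

def is_authentic(video_bytes: bytes, recorded_fingerprint: str) -> bool:
    """Verify footage against the fingerprint recorded at capture time."""
    return hmac.compare_digest(fingerprint(video_bytes), recorded_fingerprint)

# Any post-capture edit, however small, changes the fingerprint.
original = b"\x00\x01raw-frame-data"
tag = fingerprint(original)
print(is_authentic(original, tag))              # True
print(is_authentic(original + b"edit", tag))    # False
```

Unlike AI detection, this scheme can't judge arbitrary footage found in the wild; it only verifies material that was fingerprinted at the source, which is why detection and provenance are usually discussed as complementary defenses.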

Are you worried about deepfake porn, or do you believe that we won’t be seeing it any time soon? Let us know what you think in the comments down below, or on our socials, on Facebook and Twitter.
