Michael Taifour
12 min read · Sep 17, 2021


Watch it on YouTube: https://youtu.be/wHH4CHkKAe8

Many have long feared this day would arrive, and now it has.

A shocking new artificial intelligence app is said to be able to swap women into porn videos with a click of a button.

There’s nothing new about this. There have been other apps like it. But this one is said to have features like no other — such as a user-friendly interface. What’s even more galling is that it portrays itself as safe and responsible.

While I was reading about it online, I was shocked, even horrified. “How could this be? How could it happen?” I asked myself. More importantly, how can it be legal?

To my understanding, this site crosses ethical lines that no other like it has ever crossed before. It is said to be incredibly easy to use. Once a user uploads a photo of a face, the site opens up a library of porn videos.

The vast majority feature women and, to a much lesser extent, gay men. The user can then select any video to generate a preview of the face-swap within seconds, and can even pay to download the full version of the video.

How could this even be legal? And why haven’t governments taken it down yet?


When I tried to investigate the issue, I learned that there are several face-swapping apps out there, and that there are several researchers and activists, one of whom I believe is here in Australia, who are constantly on the lookout for such sites in order to shut them down.

According to press reports I’ve read by those same researchers, these sites place users into selected scenes from mainstream movies and pop videos. But what sets this new, dedicated pornographic face-swapping app apart is that it has taken this shadowy industry to a new low, the lowest it could get.

I’m sure you’re keen to know the site’s address or how to download its app. But I’m not here to promote such filth. So, like others who wrote about it, I decided to keep it confidential, at least until it’s taken down.

The industry of deepfakes, otherwise known as AI-generated synthetic media, is not new. Many sites and apps have been used in the past, and are still being used to this very day, to create pornographic representations of women.

I cannot fathom, not even for one second, how psychologically devastating this must be for the women who are its victims. The original Reddit creator who popularized the technology face-swapped female celebrities’ faces into porn videos.

According to the latest estimates, which show how big this industry is, more than 90% of all online deepfake videos are non-consensual porn, and around the same percentage of those feature women. And as the technology advances, new tools are emerging that are said to allow users to “strip” the clothes off female bodies in images.

Many of these sites have been forced offline, but this latest site is said to have received over 6.7 million visits last August. That’s incredible: nearly 7 million visits to that porn in one month. And for some reason I’m unable to understand, the site hasn’t been taken offline, not yet at least.

The problem with this latest unnamed site is that it portrays itself as a safe and responsible virtual place for people to explore their sexual fantasies. The literature on the site even encourages users to upload their own faces. What’s worse is that they can upload other people’s faces too. That’s far more horrifying.

As of the date of writing this article, no one has been able to take down this site. Not only that, but another site has already popped up that seems to be attempting the same thing.

Here’s the story as told by the MIT Technology Review.

A new AI app that swaps women into porn videos with a click of a button is not only disturbing but also eye-catching, mainly due to its simplicity. Against a white backdrop, a giant blue button invites visitors to upload a picture of a face. Below the button, four AI-generated faces allow you to test the service. Above it, the tagline boldly proclaims the purpose: to turn anyone into a porn star by using deepfake technology to swap the person’s face into an adult video. All it requires is a picture and the push of a button.

Although Karen Hao, an AI editor at MIT Technology Review, investigated the story, she chose not to name the app, nor to use direct quotes or screenshots of its contents, so as to avoid driving traffic to the site.

From the beginning, deepfakes, or AI-generated synthetic media, have primarily been used to create pornographic representations of women, who often find this psychologically devastating. The original Reddit creator who popularized the technology face-swapped female celebrities’ faces into porn videos. Research company Sensity AI estimates that, to this day, between 90% and 95% of all online deepfake videos are non-consensual porn, and around 90% of those feature women.

As the technology advances, numerous easy-to-use, no-code tools have also emerged, allowing users to “strip” the clothes off female bodies in images. Many of these services have since been forced offline, but the code still exists in open-source repositories and has continued to resurface in new forms. The latest such site received over 6.7 million visits in August, according to researcher Genevieve Oh, who discovered it. It has yet to be taken offline.

There have been other single-photo face-swapping apps, like ZAO or ReFace, that place users into selected scenes from mainstream movies or pop videos. But as the first dedicated pornographic face-swapping app, this site takes things to a new level. It’s “tailor-made” to create pornographic images of people without their consent, says Adam Dodge, the founder of EndTAB.

According to Hao, this current site is incredibly easy to use. Once a user uploads a photo of a face, the site opens up a library of porn videos. The vast majority feature women. A user can then select any video to generate a preview of the face-swapped result within seconds, and can also pay to download the full version.


According to those familiar with the industry, the results are far from perfect. Many of the face swaps are obvious fakes, with the faces shimmering and distorting as they turn to different angles. But casual observers can easily be fooled by them. And the course that deepfakes have taken shows how difficult it has become to tell the difference between what’s fake and what’s real.

The quality of the deepfakes doesn’t even matter, since they have the same psychological effect on the victims, says Ms. Hao. Even low-quality face swaps can fool many people.

The problem with this latest unnamed site is that it portrays itself as a safe and responsible virtual place for people to explore their sexual fantasies. The site’s own literature encourages users to upload their faces. Worse still, they can upload other people’s faces too.

The consequences for women and teenage girls targeted by such activity are devastating, mainly at the psychological level. These videos can feel as violating as revenge porn, that is, real intimate videos filmed or released without consent. “This kind of abuse, where people misrepresent your identity, name, reputation, and alter it in such violating ways, shatters you to the core,” says Noelle Martin, an Australian activist who has been targeted by a deepfake porn campaign.

And the repercussions can stay with victims for life. The images and videos are difficult to remove from the internet, and new material can be created at any time. “It affects your interpersonal relations; it affects you with getting jobs. Every single job interview you ever go for might be brought up. Potential romantic relationships,” Martin says. “To this day, I’ve never been successful fully in getting any of the images taken down. Forever, that will be out there. No matter what I do.”

Sometimes it’s even more complicated than revenge porn. Because the content is not real, women can doubt whether they deserve to feel traumatized and whether they should report it.

Non-consensual deepfake porn can also have economic and career impacts. Rana Ayyub, an Indian journalist who became a victim of a deepfake porn campaign, received such intense online harassment in its aftermath that she had to minimize her online presence, and thus the public profile required to do her work. Helen Mort, a UK-based poet and broadcaster who previously shared her story with MIT Technology Review, said she felt pressure to do the same after discovering that photos of her had been stolen from private social media accounts to create fake nudes.

Her story is heart-shattering.

She woke up one day not believing what she was hearing. There were naked photos of her plastered on a porn site, an acquaintance had told her. But never in her life had she taken or shared intimate photos. Surely there must be some mistake, she thought. But when she finally mustered up the courage to look, she felt frightened and humiliated.

Mort was the victim of a fake pornography campaign. What shocked her most was that the images were based on photos, dated between 2017 and 2019, that had been taken from her private social media accounts, including a Facebook profile she’d deleted. “It makes you feel powerless, like you’re being put in your place,” she says. “Punished for being a woman with a public voice of any kind. That’s the best way I can describe it. It’s saying, ‘Look: we can always do this to you.’”

The perpetrator had uploaded these non-intimate images, including holiday and pregnancy photos, and even pictures of her as a teenager, and encouraged other users to edit her face into violent pornographic photos. While some were shoddily Photoshopped, others were chillingly realistic. When she began researching what had happened, she learned a new term: deepfakes, referring to media generated and manipulated by AI.

The Revenge Porn Helpline, funded by the UK government, recently received a case from a teacher who lost her job after deepfake pornographic images of her were circulated on social media and brought to her school’s attention, Sophie Mortimer, who manages the service, told MIT Technology Review.

According to Ms. Hao, who broke the story, the option to create deepfake gay porn, though limited, poses an additional threat to men in countries where homosexuality is criminalized. This is the case in 71 jurisdictions globally, 11 of which punish the offense by death.

On August 17, after MIT Technology Review made a third attempt to reach the creator, the site put up a notice on its homepage saying it’s no longer available to new users, though it remained available to existing ones. As of September 12, the notice was still there.

Deepfake porn is ruining women’s lives. But the big question remains: will the law finally ban it?

From what I’ve learned, lawmakers are just starting to pay attention. The police don’t interfere; they’re useless. Victims who called their precincts were told that nothing could be done for them.

Now, many of them are considering getting off the web entirely, even though it is crucial to their work. But they have no other alternative: either this or being shamed. What an alternative, what a choice to make.

And the problem is, no one knows who did this to them. It could be someone close: a relative, a friend, or a foe. So the victims start doubting anyone and everyone. They become paranoid. They even start doubting their husbands or ex-husbands. It is really sad when you begin to doubt your own reality.

In some cases, victims have even had to change their names, take on a new identity, or remove themselves from the internet completely. Imagine that. Imagine living in fear of being retraumatized, because at any moment your images could resurface and once again ruin your life.

This is Dante’s Inferno reincarnated.

AI has made it far too easy to make deepfake nudes of any woman. In the context of the pandemic, this trend is even more worrying: cases have nearly doubled since the start of lockdowns. Although statistics show that nearly 80% of people don’t even know what the term deepfake means, new research has found that more and more people are becoming acquainted with the technology.

In the US, 46 states have some form of ban on revenge porn, but only Virginia’s and California’s include faked and deepfaked media. In the UK, revenge porn is banned, but the law doesn’t encompass anything that’s been faked. Beyond that, no other country bans fake non-consensual porn at a national level.

Many governments worldwide hide behind the fact that deepfakes are still not a well-known technology. As we speak, judges in courts of law in some advanced countries are undergoing training to understand what a deepfake is. Politicians are also being challenged to understand the scope of the issue. Vice President Kamala Harris is said to be working on a federal ban on revenge porn.

But who can tell when the tide will turn?

As Sean Higgins once said: “Every day the clock resets. Your wins don’t matter. Your failures don’t matter. Don’t stress on what was, fight for what could be.”


Email me at michael@coacheacademy.com

VIEW MY COACHE PRESENTATION:

https://drive.google.com/file/d/1mLwDOgG_sgzKSpj-SQWpX1bmxLkjq8vV/view?usp=sharing

SUBSCRIBE TO MY CHANNEL FOR DAILY VIDEOS: https://www.youtube.com/channel/UCUFiLjwKvfkIhd9kkhEqVaQ

SEE MY BLOG: https://mtay3141.medium.com/

READ MY PUBLISHED BOOKS:

· The Hidden Temple (2021, Ingram Spark, Australia): https://www.barnesandnoble.com/w/the-hidden-temple-michael-taylor/1136975525?ean=9780646818412

· The Lions and the Wolves (2019, Amazon Kindle Publishing, USA): https://www.amazon.com/dp/B088J2G3YF?ref_=pe_3052080_397514860

MY OTHER LINKS

FACEBOOK: https://www.facebook.com/TheEldorado777

YOUTUBE: https://www.youtube.com/channel/UCUFiLjwKvfkIhd9kkhEqVaQ

LINKEDIN: https://www.linkedin.com/in/majed-taifour-33965b164/

TWITTER: https://twitter.com/MichaelTaifour

PINTEREST: https://www.pinterest.com.au/michaelt7521/_saved/

MEDIUM: https://mtay3141.medium.com/

ABOUT ME

As well as a life coach, I am a Financial Advisor, Investment Analyst, Vlogger, YouTuber, affiliate marketer, public and motivational speaker, business strategist, and founder of the Transcendental Re-engineering technique.

In 1994, I was one of the winners of the Columbia University in New York Pan-Asia Journalism Award, in cooperation with Citicorp. I’m listed in the Marquis Who’s Who in the World Millennium edition. Along with a Master’s degree in commerce from the University of Sydney, I’m certified by The Coaching Institute in Australia.

EMAIL ME: michael@coacheacademy.com

ABOUT COACHE ACADEMY YOUTUBE CHANNEL

This channel is all about the unknowable. It fathoms everything mysterious. It chases enigmatic and mystifying stories and tries to unravel their hidden and covert secrets. The aim is to learn everything new in the unknown territory, awaken you to the mystery of life, and encourage you not to stop questioning.

So, if you’re into the mysterious and the unknown, then this channel is right up your alley.

Subscribe to it, like it, watch it.

SUBSCRIBE @ https://www.youtube.com/channel/UCUFiLjwKvfkIhd9kkhEqVaQ

ABOUT COACHE

COACHE is a life coaching and business strategizing company that created the concept of TRANSCENDENTAL RE-ENGINEERING back in 2017. The company’s objective is to assist individuals and business teams in uncovering their greater potential for success.

Its main focus is on thought, knowing that the mind is everything and we are what we think. The TRANSCENDENTAL RE-ENGINEERING technique that the company teaches is simple and effortless. It helps the mind transcend to its unbounded essence. Through it, the mind unfolds its potential for unlimited awareness, where everything and anything is possible.


Michael Taifour

Irrepressible, opinionated, and always politically incorrect, satirist Michael covers the week’s news and features its main events in his own distinct way.