Deep Fake Reality

Technology is a fun topic to discuss. Optimists look forward to a singularity occurring in their lifetime. Skeptics read their Kaczynski and warn of its growing dangers. Technology allows us to manipulate nature, and in Heidegger's framework man himself is in danger of becoming nothing but standing-reserve, just as much as the trees of a forest. It is increasingly hard to brush off such accusations in our era of phone screen watchers. Technology now seems focused on information delivery, content creation, and entertainment, which is just the manipulation of people's attention. Our minds and their consumption decisions are the resources to be utilized.

It makes sense, then, that all new tech would be applied to entertainment delivery, and within those fields would be exploited by the world of sex work. Porn sites rank among the most visited sites worldwide. Several years ago I first saw an article on Zero Hedge about deep fakes. The idea was a manipulated video of a public figure saying whatever a programmer wanted it to say, indistinguishable from the real thing to a viewer. It seemed a little goofy and glitchy. Then I saw an article about a deep faked Gal Gadot pornographic video clip.

That stuck with me much more because my social circle discussed it. It was not long before a friend texted me an Emma Watson deep fake. Then internet communities, whether Reddit or the porn sites themselves, started to ban the posting of deep fake pornographic videos. These are platforms that play fast and loose with banning porn. They push all forms of degeneracy out there. This was an unusual exception for a perfectly legal type of film (ed: until 2020, when George Floyd's porn film became banned upon upload). The public rationale was that deep fakes did not involve consent, but the private rationale is that they would destroy the industry.

We are witnessing this right now in 2020, as the sites with growing user bases are deep fake sites. OnlyFans has been the story of 2020, because the porn industry itself is not filming. This opens the door to deep fakes, which simply recycle the mountain of previously filmed footage with new faces mapped on. Thirst is conserved, never destroyed. Deep fakes fill a gap because the tech is getting better, and at its heart it provides what users really want: the fantasy of a celebrity dream girl in flagrante delicto. The Titanic and Citizen Kane of that world in sales and downloads would be the Pam Anderson and Paris Hilton tapes. Deep fakes allow a Pam Anderson or Paris Hilton tape on demand without grainy footage or night vision. Forget those two; how about Lily, the AT&T ad girl?

Now, some are ridiculous. There are terrible mappings of celebrities onto faces that are shaped differently, revealing the blurred edges of the mapping. There are terrible model selections where a body looks nothing like the celeb's. Sometimes the magic happens: a creator picks just the right face shape and just the right body type, and then you're watching Katy Perry make the implications of her terrible lyrics real. The cost and computing power needed to generate these are hurdles, but those hurdles keep dropping. There are apps that can do this with gifs now, right from your phone. A gif, a pic of anyone, and voila, you're making magic. Combine this with nascent VR programs, and it becomes far more dangerous for sating the desires anyone has involving anyone else in their social circle.

Stepping out of the land of filth, this spreads with dangerous possibilities once it is in the hands of everyone. In this woke hysteria, what would stop wokesters from creating deep fakes of classmates or coworkers using bad words? Even if it is just individuals competing for the dwindling spots in the good-life industries, the tool is going to be out there. How long would a corporate commissar wait to fire someone? We have witnessed people lose their jobs for being related to this week's Emmanuel Goldstein. The job termination may even stick, because corporate policies now include grey wording about creating a climate or feeling of racism, sexism, etc. Even if the video is found to be fake, if enough employees have seen it, would it not create a climate of unease? Bob never said X, but people could believe he would. This is the problem of defining -isms down to nothing in order to keep lawsuits away.

The macro effects are even more dangerous. Consider the recent circus of President Trump going to Walter Reed for care: what possibilities would deep fakes have opened up had he stayed longer and needed more urgent care? How difficult would it be to gin up a deep fake of Trump sleeping in a bed at night with tubes running out of him? The First Lady had private phone calls leaked. How hard is it to imagine a nurse surreptitiously filming President Trump at night, looking terrible in bed? This could all be staged and filmed easily, with Trump's face then mapped onto the actor in the bed. People would want to believe it. Minneapolis saw an afternoon of looting due to a rumor of a black male shot by police, despite video showing the man had shot himself. Even if caught and proven false within a day, what type of damage could such a fake do? This makes the staged videos of Al Qaeda or ISIS even easier to create.

The technology is there, and the goal is to get our minds to make a decision or support a policy that the creators want. False flags, wars, moral panics, et cetera are all easier to start or inflame with the right, targeted propaganda. It should not surprise us if the technology we laugh at now becomes a weapon used against us in a heightened state of tension. The technology to detect a fake exists, but detection takes time. To paraphrase an old quote on reality, a lie can make it to half the phones in the world before the truth even boots up.

One Comment

  1. Gepid says:

    On the contrary, deep fake is one of the best things that could ever happen to dissidents. Screenshot of post? Fake, he used inspect element. Photo? Fake, totally photoshopped. Video? Nope, that’s deep fake.

    As deep fake becomes easier, better, and more common and as more and more high profile individuals are targeted by it, it will lead to a world where you can literally say or do anything on camera and still maintain plausible deniability. Sure, someone will always be able to put a lot of effort into analyzing a given video and determining its veracity. But for some random guy being targeted by antifa, will his boss really not believe that the video is probably deepfaked if he’s watching true-to-life Emma Watson DVDA porn every night?

    Further, when someone is inevitably fired over a deepfaked video, they'll have a shoo-in wrongful termination lawsuit.

    For bien-pensant HR and those seeking to film and expose dissidents, I can't think of anything worse than the proliferation of deep fake technology.

