Why AI Porn Is Bad. It’s worse than you think.

The introduction of generative Artificial Intelligence to every sector of the internet has brought about groundbreaking changes, a lot of them bad – a lot of them very bad.

This article will discuss AI’s influence in the realm of pornography, and the ethical, legal and societal concerns it poses.

Back in May, we covered the heartbreaking reality of AI porn, and how it has ruined lives and images for young women all over the world.

The use of Artificial Intelligence to create pornographic content has sparked an epidemic of unauthorized use of personal images.

The technology is essentially capable of creating realistically simulated sexual acts using nothing but individuals’ private photos or videos, without their consent. The victims of these cases face deep emotional distress and damage to their personal and professional lives.

What’s worse is that it is nearly impossible to hold anyone legally accountable for it. Not only is it near impossible to trace who prompted the AI to create the simulations it did, but with the current lack of regulations and precedents around Artificial Intelligence, no one really knows who to prosecute.

The person who prompted it? The person who took the photos? The people who created the AI?

Regardless, I am sure I don’t need to over-explain to you the severe ethical implications of people’s personal photos being used to create online videos of themselves in the most twisted, dystopian sexual harassment cases we could ever imagine. If you do want to read about it, however, I urge you to read more about it here.

AI Porn’s Exploitation, Legal Challenges and Content Moderation

Since May, there has essentially been no slowing down in the dramatic and traumatic spread of AI-generated pornography. It plagues the internet, and although it sometimes does get taken down after a certain time, it’s never truly gone – and the damage it has done will forever leave a scar.

A common counterargument in online discussion groups on just how damaging AI is for porn – mostly from sympathizers not looking to lose their latest and greatest dopamine source – is that “it can just be taken down”.

The backup for this argument is that back in 2020, Pornhub successfully removed millions of videos from its site that were not verified by their uploaders. This was in response to a vast amount of “revenge porn” being released by upset exes and sociopathic teenage boys.

The idea being that to get back at their ex, or to gain favor with their friend groups, they would release private videos and photos that had been sent to them by their previous SOs.

Obviously without their consent.

Although Pornhub’s cleanse did significantly reduce the amount of non-consensual videos on the site, nothing on the internet is truly gone. This means that even if Pornhub and similar sites manage to reduce the amount of AI-generated pornography, it will never be gone.

And that’s not even the real issue here.

The real issue is that it is near impossible to actually moderate AI-generated pornography, because it can be so difficult to know what is real and what is not. Deepfake technology has gone so far that you can even get the individuals in the video (who don’t exist) to “consent”.

Additionally, where is the line between real and fake? A lot of the content creators behind AI-generated porn argue that the people in the videos aren’t actually meant to impersonate anyone – they just “happen” to look similar.

Not even the AI algorithms that sites use are able to distinguish between real, consensually uploaded videos and artificially generated ones.

It is estimated that 4–15% of the internet is dedicated to pornography, and most sites are incredibly lax in their moderation.

In the end, the sites’ main goal is always to exploit porn-addicted users’ need for a quick dopamine fix to generate ad revenue. With Artificial Intelligence, the ability to create content never before possible is running rampant, and it only enhances people’s ability to find exactly what they want.

As a result, we’re seeing an increase in people looking for AI-related pornography, and less content moderation surrounding it.

The Re-wiring of Societal Norms

So far, we’ve only really discussed the issues related to deepfakes, but the problem with AI-generated porn goes much further and much deeper.

There is no hiding that pornography is already incredibly damaging to the human brain. Without going too “anti-porn” on you here, let me really smoothly link to this peer-reviewed study indicating that watching porn has a similar negative effect on your brain as doing cocaine.

For the record, I don’t actually care what you decide to do in your free time, nor am I against sex work. I do, however, take issue with the fact that AI is now taking a basic dopamine addiction issue much, much further by creating new standards for pornography that are beyond damaging.

The industry is now able to move away from portraying people “acting” in an unrealistic manner to showing literally impossible things happening – things that would be immoral, unethical, and dangerous to perform in real life.

That does not mean that watching it isn’t harmful.

Artificial Intelligence is shifting the narrative on what is considered “normal”, as people can now not only program AI to look exactly like anyone, regardless of age, but also cater to the most bizarre and damaging fantasies out there.

And trust me, there are some truly bizarre, and truly damaging ones.

Videos portraying horrific violence, abuse and death – delving into fetishes that remove the watchers so far from reality, without their brains realizing it, that real-life consequences are imminent. What’s worse is that with AI, the age of the “performers” is such a blur that you can no longer differentiate between what is appropriate and what is not.

“Cartoons” sexualizing children have long been an issue on the internet, where the portrayal is clearly a child but the description goes along the lines of “it’s actually a 9000-year-old dragon in human form.”

This is being taken much, much further with generative AI. Tons of AI-generated photos of children have been found, and not on the dark web – the actual web.

The crowd that is for this form of content claims that it isn’t immoral because technically, there is no “victim”.

… but there are.

The brain adapts to the content it sees. By consuming content that portrays girls who are clearly supposed to be underage, the brain is training itself to say “this is normal, this is the world today.” Even if you’re not literally thinking it, the neurons in your brain are rewiring themselves into finding a twisted form of acceptance.

This goes for violent portrayals as well. Immoral paraphilias such as necrophilia, incest, pedophilia and sexual violence are now turning into entertainment.

2016’s Westworld already touched on the subject: you see the guests abusing the AI hosts to no end, both violently and sexually. At some point, the brain can no longer differentiate between what is real and what is not – and this harmful rewiring causes real shifts in perception, affecting real-life relationships and contributing to a broader societal misunderstanding of consent and sexual norms.

It also has to be highlighted that the objectification of people has become much worse than before.

Pornography already objectifies humans, but AI is taking this to another level by turning the portrayal of real people into strings of code that the watchers don’t have to “feel bad for”.

How does this affect the minds of young individuals? Young people whose brains are still easily influenced, and who go on in life struggling to distinguish between real people with real emotions and the imagery they see on screen?

The internet can be a dangerous place, and Artificial Intelligence is not helping.
