Instagram Viral Sexual Video: A Graphic Viral Video Exposes the Inefficiency of Instagram’s Content Filtering System


On X and Instagram, a graphic video showing two people engaging in sexual intercourse has gone viral. The writers of this story stumbled upon the reel while browsing Instagram last week. It shows two young Indian men having drinks. At first glance, the video looks ordinary, as if shot casually, and does not appear to advertise any goods or services. But as it goes on, one of the men, whose face is blurred throughout, appears to lose consciousness or become only partially conscious, while the other, whose face is visible throughout, escorts him to another room, where they have sex.

The act, in which the “victim” is either unconscious or barely aware, is shown with their nude bodies and genitalia graphically displayed. The video has several cuts and appears to have been edited to resemble a standard pornographic video.

Here are a couple of screen grabs from the video’s opening segment:

Instagram Viral Sexual Video

Under Section 63 of the Bharatiya Nyaya Sanhita, even consensual sexual contact is deemed rape if consent is obtained while the person is intoxicated or otherwise incapable of giving it. It should be mentioned at the outset that it is impossible to determine whether the video in question was staged or depicts an actual rape. However, upon closer examination, we discovered that the individual whose face is visible in the video is an adult entertainer/creator with an account on the pay-for-access content-sharing website “OnlyFans.”

The fact that the video, which violates multiple Instagram community guidelines, stayed up for at least six days and received close to seven million views and 2.4 million shares raises serious concerns about platform responsibility. Even more concerning is that the same individual uploaded it three times without the automated moderation system raising an alarm. Although those uploads have since been taken down, copies of the video uploaded by other users are still live on Instagram.

“We know that there are times when people might want to share nude images that are artistic or creative in nature, but for a variety of reasons, we don’t allow nudity on Instagram,” read the Instagram community guidelines. This covers photos, videos, and some digitally created content depicting sexual intercourse, genitals, and close-ups of fully nude buttocks.
Instagram took six days to remove all three of the videos after we reported them using its in-app reporting feature.

How Did the Instagram Sexual Video Go Viral?

As of this writing, the video is still live and has been uploaded by the same account four times. A number of other users have also posted it on Instagram; one of those uploads has received 6.7 million views. A brand-new Instagram account gained nearly 6,800 followers with just this video on its timeline. We reported this specific video on July 25.

By the time this piece was published, it had been removed from the platform, but Instagram had not notified us of the removal. Later, the same account posted something else that directed people to another account where the video was still available. Parodies and sketches based on the disturbing video are currently making the rounds on social media. In addition, screen grabs from the video have gone viral and become memes.

Additionally, a number of accounts with the username suffix “Zucc” have been created to amplify the video. These accounts are being mentioned in comments on other posts to increase the video’s reach. The username is a play on words: in meme culture, when a page gets taken down or disabled, it is said to have been “zucced,” a reference to Mark Zuckerberg, meaning that Meta’s moderation system was the reason the page was removed. Such accounts are frequently used as backup pages for content that Meta is about to delete.

In March 2019, Meta unveiled a technique for automatically identifying nude photos and videos. A notice stated, “It can be devastating when someone’s private photos are shared without their consent.” When Meta receives reports of non-consensual intimate images (also known as revenge porn), it has always removed them to protect the victims. More recently, it has employed photo-matching technology to prevent the re-sharing of these images. Thanks to machine learning and artificial intelligence, Meta says it can now proactively identify nearly nude photos or videos that are published without authorization on Facebook and Instagram, meaning it is able to locate this content before it is reported.

It is concerning that this specific video was repeatedly uploaded without being flagged by the system for nudity. As per Meta’s blog, it also employs a technology known as SimSearchNet++, an image-matching model trained via self-supervised learning to match variations of an image with improved recall and high precision. In essence, this technique curbs the spread of misleading claims by adding warning labels to duplicates of them. The video’s repeated uploads indicate that if Meta is employing a comparable method here, its content moderation is ineffective.
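To illustrate the general idea behind such photo-matching, here is a minimal, hypothetical sketch of hash-based re-upload detection. Real systems like SimSearchNet++ use learned embeddings over actual image data; this toy example instead uses a simple difference hash (dHash) over a small grayscale pixel grid, and all names and values in it are illustrative assumptions, not Meta’s actual pipeline.

```python
def dhash(pixels):
    """Compute a difference hash: one bit per pair of horizontal neighbours.

    `pixels` is a 2D list of grayscale values (rows of equal length).
    Uniform brightness shifts and mild noise leave most bits unchanged,
    so near-duplicates produce hashes with a small Hamming distance.
    """
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left < right else 0)
    return bits


def hamming(a, b):
    """Number of differing bits between two hashes of equal length."""
    return sum(x != y for x, y in zip(a, b))


def is_reupload(candidate, known_hashes, threshold=1):
    """Flag `candidate` if its hash is within `threshold` bits of any known hash."""
    h = dhash(candidate)
    return any(hamming(h, known) <= threshold for known in known_hashes)


# A "banned" image and a slightly brightened re-upload of it (toy 3x3 grids).
banned = [[10, 20, 30], [30, 20, 10], [5, 40, 5]]
reupload = [[p + 3 for p in row] for row in banned]  # uniform brightness shift
unrelated = [[50, 10, 60], [5, 90, 5], [80, 1, 80]]

known = [dhash(banned)]
print(is_reupload(reupload, known))   # True: the brightness shift preserves every bit
print(is_reupload(unrelated, known))  # False: the hash differs in too many bits
```

A system built this way catches trivially modified re-uploads (cropping aside) without storing the original images, which is why repeated verbatim uploads slipping through is notable.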


Meta has also added nudity protection to Instagram direct messages. It blurs photos detected to contain nudity and advises users to reconsider before sending nude photos. The feature is intended to shield users from scammers and unsolicited nudity in their direct messages. The number of shares the aforementioned video received indicates that either nudity protection is not designed to work on videos in general or it does not work on this particular video. When we sent the clip to a demo account on Instagram to test the feature, it was not blurred, and we received no prompts about possible nudity. This suggests that the nudity protection system for photos is ineffective against videos.

Alt News has written to Meta about this and will update the story as soon as we receive a response. We also tried contacting the OnlyFans creator multiple times but received no reply.

Instagram Viral Sexual Video: Potential for Exposure to Minors

We discovered this video in Instagram’s Reels section. The platform has an algorithm that shows users videos uploaded by accounts they do not follow. More significantly, these algorithms are opaque and can surface material based on a variety of hard-to-determine factors. Instagram allows 13-year-olds to create accounts, and in recent years the platform has added a number of parental control tools. There is a chance the video appeared, or still appears, on a minor’s timeline.

It should be noted that a 2021 study commissioned by the National Commission for Protection of Child Rights found that many of the 3,491 school-age participants had profiles on popular social networking apps and websites, with Instagram (used by 45.5%) and Facebook (used by 36.8%) being the most popular.

To learn more about the impact of such videos on minors, Alt News spoke with a psychologist. “Exposure to such videos, portraying hardcore pornography containing abuse, rape, or sexual manipulation or aggression, has a strong influence on adolescents’ sexually permissive attitudes,” said Ananya Sinha, director and primary clinical psychologist at TherapHeal. Such videos frequently normalize violence and sexual harm. It is important to keep in mind that pornography is sometimes used as a source of information, not just watched for amusement or out of curiosity. Adolescents who see a video like this on their timeline may begin to believe that this is the norm for sexual intercourse.

However, someone who is unintentionally exposed to these kinds of videos could suffer a profoundly upsetting psychological effect. Witnessing such acts of sexual assault can be traumatic, and the resulting worry and anxiety can last for years. “Community guidelines on social media platforms exist for this very reason. The fact that the footage is available on the site demonstrates their utter incompetence,” she continued.

We also spoke with a Delhi-based attorney. “It’s illegal for Instagram to not remove a video for days, and they should have a faster turnaround time,” she said. “Compared to most other platforms, their community guidelines are more stringent. The video, which shows a non-consensual act, ought to have been removed right away.” Section 67 of the Information Technology Act, 2000 prohibits publishing or distributing pornographic or sexually explicit content electronically.

Meta Has Deprioritized Content Moderation

Meta has previously faced criticism for removing or deactivating, at will, entire accounts belonging to pole dancers and creators of erotic art. It also blocked a creator’s account because it contained pictures of them nursing their child. It is also worth remembering that over the past year, content moderation has lost priority at Meta, the parent company of Facebook and Instagram. In 2023, it let go of around 200 content moderators, and that same year it was alleged that over 100 roles related to trust, integrity, and responsibility were eliminated. These actions came at a time when around 50 elections, affecting half of the world’s population, were set to take place in 2024.

A number of media outlets have drawn attention to these failures while covering stories on topics including “shrimp Jesus” AI art, stolen AI-generated photos on Facebook, drug-related Instagram advertisements, credit card theft, hacked accounts, counterfeit money, guns, and videos of children engaging in sexual conduct. Alt News has also written extensively on Meta’s poor content moderation, especially in instances of hate speech and depictions of violence.

Earlier this year, a police complaint filed in West Bengal named Instagram as a co-accused in a violation under Section 12 of the Protection of Children from Sexual Offences (POCSO) Act and Section 67B of the Information Technology (IT) Act, 2000. A Meta representative then stated in response to media questions that the company took action on “content that violates our Community Guidelines or when it violates local law.”


FAQs

Did Instagram’s filtering system fail to remove the sexual video?

Yes. The video stayed up for at least six days and was re-uploaded multiple times before Instagram removed it.
