
Sex, Lies and Thumbnails: The Enduring Mystery of YouTube’s Secret Porn

There’s an entire underworld of weird porn stuff on YouTube, and it’s up to dedicated groups of online vigilantes to destroy it

In early February, Braden, a pseudonymous high school freshman, was tooling around on YouTube in search of some “weird” videos he’d heard people could find if they searched the site for the term “nudist.” According to the rumor, sorting the results by playlist would bring up some very explicit thumbnails, proving the long-discussed existence of YouTube’s so-called “secret porn.” 

Almost immediately, he found what he was looking for. Clicking on a channel that appeared to be car reviews and interior design ideas, one video in particular caught his eye. Its title advertised a review of interior design options, but its thumbnail suggested something else. In its tiny frame on the right side of his screen, it displayed a zoomed-in photo of a woman’s worried face. She didn’t look like she was in pain, he tells me, but she did look distracted, or even afraid, a stark and bizarre contrast to a video slideshow of living rooms and kitchens. Below were a number of thirsty comments, and a couple of links to porn sites, all of which made him “uneasy.”

Concerned, he posted an “advice needed” thread to the Reddit Bureau of Investigation (RBI), a subreddit for amateur detectives who “use the power of the internet to solve real-world problems.” Having cracked such conundrums as the case of a missing twin and a strange body visible on Google Maps, he figured it would be a good place to get some insight. 

In a post titled “Unsettling YouTube Channel,” he wrote that he’d found a few videos whose thumbnails “don’t line up with the content.” They were from the now-deleted channel Styles and Ideas, a bizarre menagerie of trendy car and kitchen redesign videos whose thumbnails were boob-y women in a variety of suggestive, sensual poses. None of the thumbnail images appeared in the videos themselves, but almost all of the videos on the channel had them, enticing clicks with skin and sketchy-looking situations. 

The weirdest, by far, was a video called “The Best New Cars Coming in 2021 and Beyond REVIEW.” Though it was essentially a slideshow of car photos, its thumbnail was an innocent-looking woman in a well-lit bedroom, her naked legs spread out to the corners of the frame. The image cut off just above her pubic bone, and her face wore a subtle expression of concern. 

Braden found other channels with videos like these, too. Design Ideas, which has also since been deleted, was full of them. One video, “Living Room Decor Tricks for a Standout Space,” featured a pantsless woman making out with a man as his wife watched TV in another room. “Japanese Street Food” had a woman wearing a sheer pink top rinsing a naked guy’s neck with a shower head. “Lawyer Office Interior Design Ideas” was just a blurry photo of a woman’s face, her eyes closed in an inscrutable mix of fear or pleasure. Like all the other channels, every video was set to the same eerie piano song. 

As Braden would soon discover, YouTube is brimming with strange videos, channels and playlists like these. In nearly every case, there’s a blatantly sexual or suggestive thumbnail that directs viewers to a poorly edited video of something mundane like cars, cell phones or medical equipment, which is then cut to stock music, usually from an artist or label called “Thanh Minh.” And despite the fact there’s rarely actual sex or nudity in the thumbnails or videos, their comments sections are absolutely spammed with links to porn and X-rated dating sites, the vast majority of which are from bots or fake accounts. 

But while Braden and many RBI-ers were just discovering these strange thumbnails, other people have been tracking them — and other forms of so-called “YouTube porn” — for years. In 2019, an anonymous writer on Medium created an entire compendium on YouTube’s pornographic underworld, writing that “nobody, including logged-out viewers, viewers in Restricted Mode or even users on YouTube Kids, are safe on their site.” There are also tons of Reddit posts and YouTube explainer videos about the phenomenon, as well as amateur sleuthing groups who investigate it.

In 2014, someone created one of many Change.org petitions to get YouTube to take action, but so far, it seems to have had little effect. Last March, a YouTube user who was bombarded with porn thumbnails after searching for “peliculas completas” (complete films) lamented how difficult it still was to report these videos and have them removed, writing, “Considering those videos are getting literally millions of views, certainly hundreds of annoyed people must have reported those thumbnails, still YouTube reaction is zero, null, nada.”

In fact, the only sex-related videos YouTube does seem to be good at removing are perfectly safe-for-work sex education videos (particularly those created by queer people and people of color) and influencer videos whose content is no more lewd or suggestive than mainstream music videos. In 2019, a number of queer sex educators were shadowbanned and demonetized for unknown reasons, and a handful of high-profile creators like Belle Delphine have been booted off the platform for posting so-called “sexual content.” 

And yet, there are plenty of operational and monetized full-frontal videos up on YouTube whose content violates its community guidelines in much more cut-and-dried ways. YouTube didn’t respond to multiple requests for comment, so it’s unclear how these videos are evading the same skin-sensing algorithms that ban educators and influencers.

The plot thickens with #DiamondGate, another sketchy and mysterious YouTube happening in which young and sometimes underage girls are reportedly shown performing “lewd” or “provocative” activities like building a Lego set wearing just a bra or playing a Minecraft sequence that suddenly cuts to softcore porn. The phenomenon, which began as a response to a series of non-nude softcore videos hidden in fake diamond ads on YouTube, attracted the attention of a group of “online vigilantes” who congregated on Reddit and trawled YouTube in search of videos to report.

#DiamondGate activity appears to have died down, along with the fervor around it on Reddit, but things like #ElsaGate are still going strong. #ElsaGate is a well-known phenomenon in which YouTube thumbnails and videos containing inappropriate and often disturbing content are targeted at kids. Often, though not always, they contain cartoons or other children’s characters like The Joker, Spider-Man or Elsa from Frozen, who are then arranged in adult-themed scenes like cheating spouses, public urination, extreme fetishes and acts of violence and gore. In 2017, one moderator of the #ElsaGate subreddit reportedly clicked through some of these thumbnails and found “videos of children giving handjobs to old men.”

According to the #ElsaGate FAQ on Reddit, YouTube has known about this for some time, too. In 2017, reporting from the New York Times and several other outlets brought the issue to the mainstream, causing YouTube to delist videos and channels, institute new restrictions and increase their moderation capacity. But, as was the case with the porn thumbnails, it’s had little direct effect. Per the #ElsaGate Reddit FAQ, disturbing videos are still being created and uploaded. 

CupcakeMufin413, an anonymous “late teen” who moderates the #ElsaGate subreddit, says that currently, none of its 95,000 subscribers know exactly why this is happening, and so far, no one’s been able to reach any of the creators of the channels in question to ask them what their endgame is. It’s possible that it’s just a fucked-up way to boost clicks and make money, they say, but many people also believe it’s a more insidious plot created by pedophiles to groom children (though, there’s been little convincing evidence for this thus far). 

Tons of other theories about YouTube’s porn thumbnail problem have been raised, too. Like many people, Braden suspects the videos might come from hacked accounts; meanwhile, a few people on RBI suggest they could be students making videos for some sort of design, architecture or marketing class who are just really bad at editing. It also wouldn’t be a stretch to think they’re elaborate advertisements for porn and X-rated dating sites; by getting bots to leave comments on inoffensively tame videos, their creators can catch the attention of YouTube’s algorithm, which factors in audience engagement when deciding which videos to recommend, to whom and when. And, of course, we can’t rule out some gloating neckbeard sitting in his basement who just thinks it’s funny. 

The most logical theory by far seems to be that all this is nothing more than some ingenious entrepreneur who’s figured out how to game YouTube’s paid ad sponsorships to their advantage. Exhibit A: the “Khang Pham Chanel” channel. A pretty standard porn thumbnail scam account, it has 4,550 subscribers, 10 videos, 3.7 million views and a bunch of masturbating women attracting attention to some “charming, old style phones.” Input their URL into Influencer Marketing Hub’s YouTube money calculator tool, and it estimates they’ve made $7,380 off ads and views since they started their channel in 2016. Given that there are multiple near-identical channels that use the same Thanh Minh stock music they do, it’s not a stretch to think this could just be some internet genius sitting pretty, counting their cash as people freak out over misplaced thumbnails of women masturbating.

It’s not just creators who stand to profit from porn-y thumbnails, though; YouTube (which is owned by Google) also makes money off of ads. As Google CEO Sundar Pichai reported last year, YouTube generated $15.1 billion in ad revenue in 2019, a staggering amount that accounted for roughly 10 percent of Google’s total revenue that year. With this in mind, it makes sense that they might not be sharpening their porn- and nudity-seeking algorithms as aggressively as they could. (As with YouTube, Google didn’t respond to multiple requests for comment.)

That said, YouTube does make it somewhat easy to report videos with porn-y thumbnails, and they remove them when they can (they even deleted some offending Design Ideas videos as I was studying the channel). But, as of now, there’s no real way to prevent malicious creators from uploading them in the first place. Because the thumbnails don’t often contain nudity, algorithms rarely pick them up, and even if they did, the YouTube video library is too massive to nip every scammy video in the bud. As one #DiamondGate follower put it on Reddit, “YouTube is not able to moderate a gigantic ton of videos. The only thing they really have going for them are the algorithm (which is shit and inefficient) and volunteer reporters (who will misuse the report button a lot).” 

Until YouTube figures out a way to uphold its own community guidelines and make billions off its creators at the same time, it’s up to people like Braden, CupcakeMufin413 and the ever-growing communities of YouTube porn vigilantes to investigate and report the strange things they find on the platform. Braden, however, has taken a break from his sleuthing because of class and the rigors of after-school tennis practice. Still, he guesses he’ll be back in the game when he has more time. 

“We’re getting close to something here,” he says. “If YouTube doesn’t get to the bottom of this, I will.”