How the YouTube Algorithm Silently Kills New Creators

And how it’s quietly reshaping what we think, say, and publish

SCIENCE AND TECHNOLOGY · MEDIA · SOCIETY · SOCIAL MEDIA

9/23/2025 · 3 min read

Welcome to YouTube: a place where anyone can have a voice, as long as you don’t say the wrong thing, at the wrong time, to the wrong demographic. It bills itself as the great equaliser, a digital arena where all ideas compete. In reality, it’s a velvet-curtained game show where the judges sit backstage whispering to advertisers.

I recently uploaded a video. It wasn't particularly scandalous. It wasn't clickbait or conspiracy. It explored the public reaction to the assassination of Charlie Kirk, specifically the strange dance between denial, deepfake accusations, and the cultural divide that turns every event into a choose-your-own-reality novel. I thought I'd struck a fair tone. Not preachy. Not provocative. The kind of video that asked questions rather than answered them.

And the stats looked promising. Strong click-through rate. High retention. Solid engagement. But one number didn’t move: impressions. The algorithm never tested it. No push, no sampling, no oxygen. Just silence.

That’s how YouTube punishes you. Not with flags or takedowns, but with absence. The quiet kind of exile where your content sits in a sealed room, completely intact, completely ignored.

Creators talk a lot about views. But views don’t matter if the platform never shows your video in the first place. That early window, where your content is fed to just enough people to see how it performs, that’s the moment where channels live or die. If you’re an established creator, you usually get the benefit of the doubt. If you’re new? One wrong word and you’re done before you’ve even begun.
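
To make that gating dynamic concrete, here's a minimal toy sketch in Python. It is emphatically not YouTube's actual system, whose ranking internals are not public; every function name, score, and number below is a made-up assumption, just a way to illustrate how an impression allocator that never tests a video also never generates the performance data that could redeem it.

```python
import random

# Toy model only: YouTube's real ranking pipeline is not public.
# Hypothetical allocator deciding how many "test" impressions a new
# upload gets before any performance data exists.

def allocate_test_impressions(channel_authority: float,
                              topic_risk_score: float,
                              base_budget: int = 1000) -> int:
    """Return how many impressions this toy system grants up front.

    channel_authority: 0..1, proxy for an established channel.
    topic_risk_score:  0..1, proxy for an 'advertiser-unfriendly' topic.
    """
    # Risky topics from unknown channels get gated to near zero,
    # no matter how good the video would turn out to be.
    gate = channel_authority * (1.0 - topic_risk_score)
    return int(base_budget * gate)

def simulate_video(true_ctr: float, impressions: int) -> dict:
    """Simulate clicks given only the impressions actually granted."""
    clicks = sum(random.random() < true_ctr for _ in range(impressions))
    return {
        "impressions": impressions,
        "clicks": clicks,
        "observed_ctr": round(clicks / impressions, 3) if impressions else None,
    }

# The same strong video (a true 12% CTR) under two gates:
new_channel = simulate_video(
    true_ctr=0.12,
    impressions=allocate_test_impressions(channel_authority=0.05,
                                          topic_risk_score=0.9))
big_channel = simulate_video(
    true_ctr=0.12,
    impressions=allocate_test_impressions(channel_authority=0.9,
                                          topic_risk_score=0.1))

print(new_channel)  # ~5 impressions: far too few to reveal a strong CTR
print(big_channel)  # ~810 impressions: enough for the video to prove itself
```

The point of the toy isn't accuracy; it's the feedback trap. With no impressions granted, no click-through rate is ever observed, so there is no signal that could earn the video more impressions, however good it is.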

YouTube doesn’t have to explain why. It just doesn’t test you. No error message, no strike. Just that unmistakable feeling that you made something worth watching, and it disappeared into the algorithm’s memory hole because someone, somewhere, decided the topic was unpalatable.

You start to learn the rules quickly, even though no one tells you what they are. Don’t mention real names. Don’t use current events in titles. Avoid anything with moral weight unless you already have a million subscribers and a standing invitation to sponsor tech gadgets.

And so, creators adapt. They pivot. They post about dopamine detoxes, AI productivity hacks, and vaguely inspiring stories that stop just short of meaning anything. They begin to self-police. Not just their language, but their thoughts, their framing, their instincts. Risk gets trimmed. Edges get sanded. Originality dies quietly under a layer of pastel overlays and royalty-free music.

The tragedy is, they call it strategy. But it’s just learned compliance.

Eventually, entire genres disappear. No one covers the hard stuff anymore unless they already have momentum. The platform stops reflecting the world and starts reshaping it. Reality gets pushed out in favour of whatever’s safest to monetise. Creators become predictable. Content becomes forgettable. And ideas, real ones, get treated like contaminants in an otherwise ad-friendly feed.

What’s left is a platform full of smiling thumbnails and recycled talking points, where controversy is reduced to haircuts and nuance is algorithmically discouraged. The message is clear: speak your truth, but only if it’s profitable. Raise your voice, but only if it can sell something mid-roll.

This isn’t censorship. It’s something more elegant. A behavioural training loop dressed as opportunity. You learn what not to say by watching your best work fail for no visible reason. You abandon certain topics, not because you want to, but because you need to survive. You internalise the algorithm’s preferences until they feel like your own. And then, without even noticing, you stop broadcasting yourself.

You broadcast what the platform wants you to be.

If you want to judge for yourself, head over to the video on YouTube and tell me whether it was good or bad. I could do with some feedback that isn't machine-learned :)