The dark world of the attention merchants
How social media turned distraction into a business model
I’ve been deep in research lately for a new project about the attention economy: how it works, who it rewards, and what it’s doing to all of us. And the more I dig, the clearer it becomes that this isn’t just a media problem or a tech problem. It’s a political problem. The way we consume content today is directly shaping how we think, who we trust, how we vote, and what we even believe is real. If that sounds dramatic, it’s because the stakes really are that high.
We used to think dystopia would come with soldiers in the street and censors on the airwaves. Instead, it came through YouTube channels and TikTok trends. Orwell warned us about surveillance. Huxley warned us about distraction. Somehow, we got both.
We’re not being forced to conform; we’re volunteering to scroll ourselves into submission.
We’re Not the Users. We’re the Product.
Here’s the truth: you’re not just “using” social media. You’re being used by it.
Every time you open your phone, you’re stepping into a battlefield where your attention is the prize. And that battlefield has no interest in informing you. What it wants is to keep you looking. The longer you watch, the more valuable you are. That’s the whole game.
And what wins in that game isn’t truth. It’s emotion. Outrage, fear, obsession. The fastest way to rise in this economy is to provoke, not to explain. It doesn’t matter if something is accurate. It matters if it performs.
Think about the last time you saw a completely calm, nuanced, well-sourced video go viral. Not impossible, but rare. What goes viral is often simplified, emotional, and designed to trigger a reaction. And most of the time, it's not even about politics. It could be an influencer faking a meltdown, or someone sharing made-up drama from their relationship. But the logic behind it is the same: get attention, no matter the cost.
This is the system we’ve all been plugged into.
I’ve Lived This From the Inside
This isn’t theory for me. I run a media business. I’ve spent years studying which titles get clicks, which thumbnails work, how to hold an audience for those first 30 seconds that the algorithm cares about most. I know what happens when you don’t “optimize”—you disappear.
And I’ve felt the shift in myself. You don’t wake up one day and say, “I’m going to chase the algorithm now.” It’s more subtle. A little more punch in the title. A little more reaction in the thumbnail. A story that gets moved to the podcast because you know it won’t do well on YouTube. Eventually, it becomes second nature.
You tell yourself it’s just marketing. That it doesn’t change the message. But it does. If you’re constantly filtering your work through what might perform, over time, your instincts bend toward that. You say things differently. You pick different stories. You skip the ones that matter but won’t “hit.”
That’s how the system trains you.
The Algorithm Doesn’t Just Decide What You See. It Decides Who You Become
People think this only affects content creators. It doesn’t. The algorithm shapes you, too. It tells you what to watch, who to trust, what to fear. And over time, it quietly rewires your beliefs, your behavior, even your sense of reality.
Ever tried having a conversation with someone who lives in a different media universe? You realize quickly—you’re not just disagreeing. You’re speaking entirely different languages. You have different premises, different facts, different emotional realities. And the scariest part? You’re both convinced you’re the sane one.
That’s not an accident. That’s the system working as designed. And when it intersects with politics—especially right-wing propaganda or conspiracy theory ecosystems—the effects are explosive. It's how people end up believing the election was stolen, that vaccines are a scam, or that climate change is made up. Not because they’re stupid. Because they’ve been immersed in a carefully curated information environment designed to feel true.
Add AI to the Mix, and It Gets Even Worse
Now take everything I just said, and accelerate it.
AI doesn’t sleep. It doesn’t get bored. It just gets better at figuring out what you’ll click on, what you’ll react to, what will keep you glued to your screen. Synthetic influencers. Fake headlines. AI-generated videos that tell you exactly what you already believe, in exactly the tone that keeps you hooked.
And most people won’t care whether it’s real. If it feels real, that’s enough.
We’re about to enter a world where entire content channels, political or otherwise, are run by non-human creators. You won’t know who’s behind it. There may be no one behind it. And it won’t matter. The system only cares that you stay engaged.
So What Do We Do?
I’m not here to tell you to quit social media and live in the woods. That’s not realistic, and it wouldn’t fix the problem anyway. But we can start seeing the machine for what it is.
If you understand how your attention is being harvested, and you start noticing which stories get boosted, which emotions are being triggered, and which buttons are being pushed, you can start to regain some control. Not total control. But enough to think critically, to slow down, to make different choices.
This project I’m working on is about naming the system. Once you name it, it’s harder to be manipulated by it. You stop asking, “Why is the world so broken?” and start asking, “Who benefits from keeping me distracted, divided, and enraged?”
It’s not a mystery. It’s a business model. And it’s working.
The first step is to see the system for what it is, and to understand it.
We’re reaching over 100 million people every month across YouTube, podcasts, Substack, and beyond. But algorithms can change. Platforms can fold. And when that happens, this newsletter is how we stay connected.
If you’re not yet a paid subscriber, please consider joining.
If you’re already a paid subscriber on one platform, consider supporting us on both Substack and our website.
You can subscribe on our website and right here on Substack.
And if you’re really on fire, consider gifting a subscription—we’ve got thousands on our waiting list ready to read, watch, and fight back.
MAGA cancel culture is real—and it’s coming for the press. But they can’t cancel what they don’t control.
Let’s keep building.
—David
PS: Can’t contribute right now? No problem. You can support us for free by subscribing on YouTube, listening to our audio podcast on Spotify or Apple Podcasts, or becoming a free subscriber to this very Substack. Every bit counts.
As a retired academic, I am glad that I won't have to deal with students cheating themselves out of critical thinking by having ChatGPT do their assignments. I won't have to deal with colleagues doing the same thing with the excuse of publication pressure.
But as a (hopefully moral/ethical) citizen of the world this absolutely terrifies me. It is not "merely" an issue of markets. It is much more fundamental--and much scarier--than that. We can no longer trust in the integrity of data from formerly trustworthy scientists. The "scientists/experts" themselves may not even exist. What is real? Nothing is real?
Coincidentally, I had just been exploring these issues with a retired Dean. This morning I was sent a notification of a YT video of potential interest, allegedly by a retired professor from somewhere or other in the Divided States of America. Something seemed a bit off. The voice was a bit weird. Emphasis sometimes landed on the wrong word. Lip-synching was sometimes a bit off. So I probed deeper into the platform and found a disclaimer: it was ALL FAKE! ALL AI!! I guess that legally they/it can get away with this just by planting a disclaimer that 98% of the consumers will never see.
But it got me thinking a bit more critically about how much precious time I may be wasting by giving any attention at all to content that an AI algorithm is using to control my perceptions of reality.
So ... my traditional sign-off: STAY SAFE ... but now with the additional request: STAY REAL. PLEASE.
❤️🇨🇦
Some of my favorite young liberal podcasters post their work behind headlines right out of 1960s grocery store tabloids, and the content almost never delivers. (How many times can Trump's collapse be announced when, in reality, it never happened?) They may or may not have say-so over the titles, but it's gotten to the point where I ask, "How stupid do they think we are?" I still send them (and you, David) money because fighting back is critical, though I wish they didn't bow to algorithms just to be heard. Those incendiary titles are a major turn-off.