The Rumor Mill and the Propaganda Machine
The witch hunt after the Brown mass shooting, and why shamelessness pays
Last week, a mob on X falsely accused a college student of mass murder.
The aftermath of the Brown/MIT shootings was a time of high anxiety. The appetite for answers was enormous, and the police hadn’t yet identified a suspect. And in that vacuum, X did what it often does: it produced one. The wrong one.
An undergraduate queer Palestinian activist with they/them pronouns—a villain straight out of MAGA central casting—was falsely floated as the culprit by a large anon account, and then others piled on. The claim was supported by little more than insinuations and the cargo-cult logic of internet witch hunts, where a target’s silence becomes “proof” and the absence of evidence becomes a “cover-up.” There was even “forensics” cosplay: claims of “97.8% gait match,” using a methodology that had collapsed spectacularly for The Blaze weeks before when it wrongly accused a former Capitol Police officer of being the January 6th pipe bomber.
The student, and the falsely accused officer The Blaze smeared, hopefully have excellent legal teams assessing their options. Because the media—they are the media now—who blasted their names around the internet should have known better. Tim Pool, now part of the Pentagon press corps, amplified suspicion of the student. Laura Loomer, also in the Pentagon press corps, declared that the scrubbing of their Brown University web presence was “a massive coverup to protect the identity of the shooter” while also insinuating that they had a YouTube channel dedicated to making explosives. Harmeet Dhillon, now the Assistant Attorney General for Civil Rights at the United States Department of Justice, joined Representative Anna Paulina Luna in calling the removal of the student’s information “suss” (normally spelled “sus,” as in suspicious).
These accusations came hot on the heels of another baseless rumor bandied about by large influencers—Elon Musk included—that the killer had intentionally targeted campus conservatives, sparking calls for violent reprisal against “the left” and the supposed anti-conservative assassin.
Days later, police identified the actual perpetrator: a Portuguese man with no connection to either the falsely accused student or to the campus conservative victim. Some of the student’s accusers quietly deleted their posts. Others, like Dhillon, acted as if nothing had happened. The innocent person at the center of the firestorm was left to pick up the pieces.
But it’s important to understand what happened here. This wasn’t just “misinformation.” It was online rumors powering the right’s political reality, up through the highest levels of government.
What a rumor mill is, and why it matters
The rumor mill is what happens when uncertainty meets social participation.
Here’s how it works: Something happens—a natural disaster, an election dispute, a mass shooting. Information is partial. People feel anxious. They want an explanation, a villain, a reason. Rumors are an age-old way of filling that void. They’ve always served a social coordination function: theorizing and spreading rumors binds people together, signals who’s in and who’s out, directs anger, justifies suspicion. Rumors can also be very entertaining. And they sometimes turn out to be true.
But X’s architecture makes rumors ambient. They’re not an occasional byproduct of uncertainty; they’re a pervasive form of content. The platform is constant bait, designed for heat over light and participation over restraint. There is always some rumor being pushed as a nascent “trend” to some niche group of users that the algorithm thinks will run with it. And when it comes to politics in particular, people don’t just consume rumors; they perform them. Contributing commentary or retweeting is a way to signal membership in an online faction—while simultaneously getting a dopamine hit.
Big political influencer accounts serve as circuit breakers or accelerants, deciding whether something explodes or fizzles: “Big if true!” as a quote retweet can shift tens of thousands of people’s attention for an afternoon. Rumors can shape how people make sense of reality—especially when institutional pushback or fact-checking on a claim is slow, boring, or already distrusted.
And today, reality has splintered along factional lines.
The handoff
The rumor mill alone has always had the potential to be dangerous—pitchfork-wielding mobs once carried actual pitchforks. But something very bad is happening very frequently now: online rumors are being co-opted by political elites, who see value in them. They’re messages that already resonate, viral moments that already have momentum. And political leaders, statesmen, who once served as a firebreak against baser instincts increasingly see an opportunity to look receptive to the base—so they add kindling instead.
If rumor mills are how communities make sense of the world amongst themselves, propaganda machines are how elites shape narratives for mass consumption. The line between the two is now blurred beyond recognition; everything moves along the same pipes. Modern propaganda is participatory: the audience supplies the amplification energy and the “evidence”—screenshots they’ve found, database entries they think they’ve discovered. Algorithms key off content and engagement, and act as distributors.
But political elites supply the credibility that helps a rumor become “real.” Influencers often supply the framing: what does it mean, who is to blame, how should “our community” feel about it.
For another example of this merger between rumor mill and propaganda machine, recall “Eating the Pets.” The false rumor that Haitian immigrants were eating cats in Springfield, Ohio sparked considerable debate again on X two weeks ago…because JD Vance, Vice President of the United States, chose to reaffirm it in a recent podcast interview with the New York Post.
The massively viral 2024 lie that Haitians were eating pets in Springfield, Ohio wasn’t cooked up by a political consultant. It started like so many rumors do: with a woman sincerely relaying something she heard through a chain of acquaintances—that a neighbor’s friend’s daughter’s cat had been eaten—into a Facebook group. Her post was screenshotted, transferred to X, and then went viral as people gaped at the story and made AI-generated images illustrating it. Trump supporters with negative views of immigration repeated it because it felt emotionally satisfying and politically aligned. It was also just that kind of extremely-online bizarre.
And then “the handoff” happened—the inflection point when a rumor moves from anon and small-account chatter to pickup by big influencers and people with institutional or elite standing. Vance, then-Senator from Ohio and vice presidential candidate, ran with it. He didn’t lower the temperature—he was the accelerant. He gave the claim legitimacy, and things went from there.
Elite media and political actors on the right are remarkably effective at taking whatever rumor the MAGA online faction has begun to run with and turning the claim into a deeper story, a symbol, or a permission structure. Truth and facts are not the main concern. (The left is not immune to political rumors obviously, but the right’s crowd-to-elite handoff is unusually disciplined and repeatable, to the consternation of the left, for reasons I’ll explain in another post.)
Why “keep score” isn’t enough
In the aftermath of the Brown witch hunt, Graeme Wood offered a prescription in The Atlantic for how to handle accounts that spread viral rumors: “Don’t log off. Keep score.”
Track who speculated. Track who amplified. Track who hedged with “concerning.” Track who deleted later. Track who refused to correct. Adjust your trust accordingly. Reward restraint. Ostracize repeat offenders.
It’s appealing because it gestures at a mechanism that predates the internet: social sanctions. If someone repeatedly spreads false accusations that harm people, eventually they get ostracized from the community. They're perceived as untrustworthy. They lose standing. They feel ashamed.
But that only works if the community shares a norm that spreading false rumors is bad. And if they agree that the rumor was, indeed, false.
We’re not there anymore.
On social media platforms, influential liars are rewarded while the new crowdsourced factcheckers—Community Notes contributors—are unremunerated. The people willing to produce the most shameless bullshit gain followers, subscriptions, political access, and status. “Keep score” assumes the players are all playing the same game.
Which brings us to the deeper problem: “the end justifies the means” has become the guiding principle of the political project that has most effectively fused the rumor mill and the propaganda machine.
The Internal Critics
Some influential voices on the right have noticed. They describe what’s happening as a problem of conspiracism, which is true as far as it goes—but conspiracism is the symptom. The underlying problem is that insinuation and shamelessness pay.
This week at Turning Point USA’s AmericaFest, Ben Shapiro warned that “the conservative movement is in serious danger”—not from its traditional enemies on the left, but “from charlatans who claim to speak in the name of principle but actually traffic in conspiracism and dishonesty.” He called out Candace Owens for spreading conspiracy theories about Charlie Kirk’s assassination. He condemned the “just asking questions” tactic as manipulation: “When grown men and women spend their days ‘just asking questions’ without seeking answers, they’re lying to you.”
Chris Rufo recently wrote that “the Right’s media apparatus is how the Right teaches its followers how to think, and it’s currently getting consumed by conspiracy, psychodrama, and tabloid conflicts.” He described the public mind on X as being “shaped by the X algorithm”—and noted the impact of perverse incentives that come from the current monetization framework: “While the Right has long had a conspiratorial streak, such narratives have usually been restricted to marginal newsletters or pirate radio broadcasts. By contrast, on X, they generate enough controversy and spectacle to become the day’s top headline.”
These are not liberals. These are prominent conservative voices now starting to acknowledge that something has gone badly wrong.
Rufo’s commentary about the algorithm, platform power, and social media’s bad incentives is accurate. This is a supply-side complaint that the left makes as well—but he similarly neglects to fully reckon with demand, and, specific to the right, with the handoff. Shapiro tackles the handoff head-on, but internal correction only works if the people at the top stop doing the thing. Meanwhile, the Vice President is still lying about pet-eaters in Springfield (and he knows it). As of yesterday, Elon Musk was amplifying stolen-election claims to hundreds of millions of followers. The incentives haven’t changed. The fish rots from the head.
So what is to be done?
Two things, neither satisfying.
First: internal and individual norm shifting
Shapiro is correct that the elites on the right need to push back against conspiracy theories from within. Nothing from the left, from academia, the media, fact-checkers, or anyone else on the outside will do much here. Those outside the movement have been systematically discredited as biased hacks for years. But changing a political culture that treats “owning the enemy” as more important than honest reckoning with facts is hard.
Not seeing every unhinged rumor as something to accelerate or validate is something anyone can choose to do; this is generally true across the board in factionalized online politics. Some on the left made leaps to judgment based on very minimal evidence surrounding Charlie Kirk’s assassination. And in the Brown case, there was another young man whose name and LinkedIn profile briefly circulated on Threads prior to the Palestinian student’s—someone who shared a name with yet another man who was briefly a person of interest. That person took down his LinkedIn page, leading a few midsize influencers to become convinced he was guilty. This all happened at far smaller scale, and seemingly without political elite buy-in. But the point remains: people have to decide they want reality more than dopamine and dominance.
Second: make accountability great again
It’s my controversial opinion that defamation lawsuits should be far, far easier for private individuals who get screwed like this to bring and win. I just don’t think the law has caught up to the infrastructure. Yes, defamation lawsuits are imperfect, slow, and expensive, and can be frivolously abused. But influencers who falsely accuse a private person of mass murder for clout and profit should face consequences. The legal system is one of the few mechanisms available to impose costs and deter this kind of behavior. The harm is real. Treating human beings as content is depraved. Defamation has always been outside the bounds of free speech, and our marketplace of ideas would benefit from less of it.
I wrote Invisible Rulers to explain how this ecosystem works—how crowds, algorithms, and elite amplification combine to turn lies into political reality. None of this is going to stop on its own. For the time being, the rumor mill will keep spinning. The propaganda machine will keep grabbing whatever it can use. And unless the culture changes—unless our old offline norms reassert themselves, and/or accountability starts to bite—the handoff will keep happening.
More things I’ve done since my last newsletter:
Speaking of shamelessness, smears, and profiteering: I posted a new entry on my “bad media object lessons” blog featuring recurring exemplar Matt Taibbi. This time Matt is cutting emails in half and lying about them while his assistant Andrew Lowenthal makes claims about a legal situation that are directly contradicted by his friend’s own court filings. Reading is hard; linear time can be vexing. Enjoy.
And finally, I hosted a Scaling Laws Lawfare podcast on AI chatbots and free speech with Jacob Mchangama and Jake Shapiro, who are fantastic. Want to hear about which bots are the most or least willing to produce politically sensitive content? How should we think about the needs of the requester vs. the right of the company to set limits? It was a very thought-provoking conversation about some of their new research!
Merry Christmas and Happy Holidays to all!

