Not so long ago, we had an understanding—however flawed—about what counted as truth. It wasn’t perfect. Journalism made mistakes, editors had biases, and cable news could be loud, theatrical, and deeply partisan. But for all its imperfections, the traditional press offered something that now feels oddly quaint: a shared reality. Even when we argued about what things meant, we were still looking at the same set of facts.
That consensus is gone. Not eroded, not diluted—gone.
In its place is an ecosystem built not to inform but to provoke. An ecosystem where the most valuable commodity is engagement, and where engagement thrives on outrage, fear, and confirmation bias. This is the age of the algorithm, and it’s quietly—and not so quietly—redefining what America thinks it knows.
Consider Truth Social, the Trump-backed platform birthed out of grievance, designed to provide a home for conservative voices who felt “censored” by Silicon Valley. On its surface, it’s simply one more social media platform among many. But look closer, and it becomes something much more consequential: a self-contained media universe where every news item, every video, every meme is filtered through a singular ideological lens. It’s hardly alone. TikTok, Facebook, X—each offers its own maze of algorithmic reinforcement. Depending on where you scroll, the world might look entirely different. Same country, same laws, same events—different realities.
These aren’t fringe platforms anymore. They are, for a growing number of Americans, the only news source. And that’s where the real danger lies. Because these platforms don’t operate like the press they’ve replaced. They have no journalistic codes, no editorial standards, no obligation to issue corrections or contextualize claims. They are not here to inform—they’re here to keep you scrolling.
And it’s working. We’ve all seen it: that friend who starts sharing conspiracies about vaccine microchips or rigged elections, the uncle who now insists every major news outlet is lying, the once-moderate neighbor who suddenly thinks the FBI is part of a globalist plot. This isn’t fringe behavior anymore—it’s algorithmically mainstream.
The consequences are no longer theoretical. We watched in real time as false claims of election fraud spread across social media, metastasized into belief, and then erupted into violence on January 6. That insurrection wasn’t planned in shadows—it was organized on feeds. Falsehoods, once shouted at barstools, now travel at the speed of light and arrive wrapped in the illusion of legitimacy. And when lies become louder than facts, the law gets involved.
In courtrooms across the country, lawsuits are now testing whether platforms should be held accountable for the content they amplify. Victims of harassment, disinformation, and even violence are seeking redress from the companies that operate these platforms. But standing in the way is Section 230 of the Communications Decency Act, a once-obscure provision that now sits at the center of a national reckoning.
Originally passed in 1996, Section 230 was designed to let the early internet grow without drowning in lawsuits. It shields platforms from legal responsibility for what their users post. In theory, it’s a free speech protection. In practice, it has become something murkier: a liability shield for billion-dollar companies that now shape how millions understand the world.
There is, oddly, bipartisan consensus that something has gone wrong. Democrats worry about disinformation, especially around elections and vaccines. Republicans claim censorship of conservative voices. Each side sees a different threat, but both recognize that platforms now wield extraordinary power over public discourse. And both seem increasingly unsure what to do about it.
What makes this moment so perilous is that the collapse of shared reality is happening at the same time as a collapse of institutional trust. Americans don’t trust the press. They don’t trust Congress. Many don’t trust the courts. In that vacuum, platforms become the new authorities. Not because they’re more trustworthy, but because they’re more available. More familiar. More addictive.
That addiction has legal and democratic consequences. Election disputes are no longer about ballots—they’re about belief. Did enough people believe the outcome was fair? Did enough trust the institutions that counted the votes? Increasingly, the answer depends not on the facts, but on the feed.
The law is not built for this. It struggles to keep pace with viral falsehoods and algorithmic manipulation. It was designed for defamation, not disinformation. For libel, not lies that go viral. Judges are left to navigate murky waters, trying to apply twentieth-century doctrines to twenty-first-century realities. What counts as speech? Who is a speaker? When does a platform become a publisher? The answers, for now, are uncertain—and that uncertainty is itself a kind of threat.
Some efforts are underway. There are calls to reform Section 230, to require greater transparency in algorithmic design, to fund local journalism, to teach digital literacy in schools. These are good ideas, important ideas. But they are swimming against a powerful current.
Because here’s the truth no one wants to admit: the algorithm is winning.
It’s winning not because it’s malicious, but because it’s efficient. It knows what you like. It knows what will keep you clicking. And it doesn’t care whether it’s true. That indifference—casual, ambient, structural—is perhaps the most dangerous force in American civic life today.
We are not doomed, but we are at a tipping point. Democracy cannot function without a shared reality. It cannot survive when truth becomes optional, and belief becomes tribal. If we don’t find a way to restore the foundations of factual discourse—through law, through media, through education—we will lose more than the news. We will lose the capacity to govern ourselves.
The platforms may not have intended to replace the press. But they have. And if we don’t start treating them with the same scrutiny, the same expectations, and—yes—the same legal responsibilities, then we’ll wake up one day and find that democracy didn’t die in darkness after all. It died in the light of our screens, liked, shared, and forgotten.