The Disturbing Power of Information Pollution

When we’re lulled into giving up on truth, we give up on critical thought — even dissent itself.
By: Michael P. Lynch

In January 2017, President Donald Trump’s official spokesperson asserted that the crowd for his inauguration was the largest any president had ever drawn. The claim was false, and hilariously so: there was direct photographic evidence to the contrary, not to mention the testimony of those who had been present on the ground. Nonetheless, the White House insisted, and some even alleged that the photos had been doctored. Facts — indeed, the values of truth and rationality — seemed not to matter.

I’d like to defend a contrary ideal: that democracies should be spaces of reasons. That is, democratic politics are at their best when disagreements are settled not by power or manipulation but by appeal to shared values, evidence, and facts. Indeed, in light of recent events — Brexit, the election of Donald Trump, and the rise of nationalism across the Western world — one might be excused for thinking this ideal is not just idealistic but irrational. Appeals to facts seem moot when those in power seem free to deny them, and when falsehoods come so fast from official sources that the media literally cannot keep up.

Michael Lynch is the author of the book “In Praise of Reason: Why Rationality Matters for Democracy.”

Perhaps no phenomenon more dramatically captures the present atmosphere than the rise of “fake news” — pure fiction masquerading as truth (like posts claiming that the Pope had endorsed Donald Trump) — which may have spread widely enough to influence political elections. Yet fake news reports are actually just one kind of information pollution — a particularly virulent kind, no doubt, but not the only kind. Other kinds include propaganda, overtly false political advertising placed on social media sites, Twitter bombast, and the use of White House press briefings to assert falsehoods, no matter what the evidence may show.

The most obvious problem with any kind of information pollution is that it is a kind of lie for political gain. But framing the issue solely in terms of lying actually underplays and mischaracterizes the grander deceptions being perpetrated inside the internet’s funhouse of mirrors.

Lying is not quite the same thing as deception. To lie is to deliberately say what you believe to be false with the intention of deceiving your audience. I can deceive you without lying (silence at a key moment, for example, can be deceptive). And I can lie to you without deceiving. That may be because you are skeptical and don’t believe me, but it may also be because what I say is inadvertently true. Either way, you are lied to but not deceived. That might suggest that deception occurs when someone is actually caused to believe what is false. “Deception,” as philosophers say, is a “success term.” But that’s only halfway there. Deception can happen even without false belief.


One example is that old con, the shell game. The con man presents three shells, one of which has a coin underneath. He moves the shells around and asks you to pick the shell with the coin. If done right, it looks easy, but it isn’t. Using sleight of hand, he distracts you so that you can’t track the right shell and don’t know where the coin is. But one can lack knowledge without having a false belief. One can be simply confused, and that is typically the case with such tricks. You don’t know what to think, and so you simply guess. You can be deceived not only by believing what is false — but by not believing what is true.

The use of social media to spread political misinformation online is partly just a giant shell game. Propagandists often don’t care whether everyone, or even most people, really believe the specific things they are selling (although it turns out that lots of people do). They don’t have to get you to actually believe the coin is under the wrong shell. They just have to get you confused enough that you don’t know what is true. That’s still deception. And it is this kind of deception that dreadful for-profit conspiracy sites are particularly adept at spreading on social media. No doubt, some percentage of people actually believe such postings, but a far greater number come away ever so slightly more doubtful of what is true. They don’t know what to believe.

It used to be that when someone said something outrageously false (“the Moon landing was faked”), most folks would ignore it, reasoning that if it were true, they would have heard about it. By that, they meant they would have heard about it from credible, independent sources. Filters (primarily editors) worked not only to weed out the bad but to make sure the truly extraordinary real news made it to the surface.

The internet has made that reasoning moot simply because so many of us are ensconced in our own information bubbles. Few people reject crazy claims on the grounds that they haven’t heard of them before, because chances are they already have heard of them, or something close to them, from the sites that tend to confirm their biases. That makes them more susceptible to taking fake news seriously, or at least to not dismissing it out of hand. It makes them easier to confuse.


The internet — so crucial in the spread of fake news — didn’t create polarization, but it has sped it up. That’s partly because the analytics that drive the internet don’t just get us more information; they get us more of the information we want. Our life online is personalized. Everything from the ads we read to the political news in our Facebook feed is tailored to satisfy our preferences. As a result, it is easier than ever to get and share information, but the information we get often reflects ourselves as much as it does anything else. It inflates our bubbles rather than bursting them. And that puts us in the bizarre paradoxical situation I talk about in my book “In Praise of Reason”: We are polarized, not just over values, and not even over just the facts — but over the very standards for knowledge we employ.

Yet perhaps the most disturbing power of information pollution is that its repeated use can dull our sensitivity to the value of truth itself. That’s particularly so given that you and I live in a digital world that makes it both easier and harder to figure out what is true. Googling is like being in a room with a million shouting voices. It is only natural that we’ll hear the voices most similar to our own, shouting what we already believe, and as a result Google can find you confirmation for almost anything, no matter how crazy. But of course we are aware that those with different views can do the same. And that very fact, if we aren’t careful, can lead us into thinking that objectivity is a “dead value.” We get so used to contradictory information and rival sources that we can talk ourselves into thinking truth no longer matters.

This is not a new idea. During the last century, it was dominant in some philosophical circles. To some, the idea that objective truth is unimportant sounds like a sophisticated bit of realpolitik. To others, it seems liberating, because it allows each of us to invent our own truth. But really, it is a self-serving rationalization disguised as philosophy. It confuses the difficulty of being certain with the impossibility of truth. Look, it is always difficult to know for certain what is true. Maybe you really live in the Matrix. Maybe you have a brain chip implant feeding you all the wrong information. But in practice, we do all agree on some facts: that bullets kill people, that you can’t flap your arms and fly. There is an external reality. Ignoring it can get you hurt.

Yet skepticism about objective truth remains tempting because it allows us to rationalize away our own bias. When we do, we are like the guy who knows he is living in the Matrix but decides he likes it that way. After all, being right all the time feels good; getting what you want feels good. So we prefer to live in bad faith and see our cozy, curated information bubble as the measure of all things. And there is no better example of this bad faith than the fact that the very term “fake news” has itself become a weapon to defend our own bubbles. “Fake news” now means any coverage you disagree with.


In George Orwell’s “1984,” the protagonist is tortured until he agrees that two plus two equals five. The point, his torturer makes clear, is to make him see that there is no objective truth other than what the party says is true. That’s the deep power of information pollution. It can lull us into giving up on truth altogether. And once we give up on truth, we give up on critical thought — even dissent itself.

It is for this very reason that we must not give up on the ideal that democracy is a space of reasons. Neither information pollution nor knowledge polarization should make us abandon it. The defense I offer of this ideal is compatible with the fact that we are flawed reasoners, constructed out of crooked timber, prone to bias and confusion. And it is compatible with Wittgenstein’s dictum that reasons always come to an end. Reasons — and in particular the reasons that result from the epistemic principles largely associated with science — matter because they embody democratic values. In giving reasons to each other, we exemplify a basic respect for our fellow citizens demanded by democracy — a respect for each other as fellow believers, thinkers, and participants in the decision-making process. That means that our political and epistemic values are, at the deepest level, intertwined. The hard part isn’t seeing this fact; the hard part is trying to make sense of how we should improve those values. The hard part is making sure that truth and freedom, as it were, take care of each other.

We may never completely realize the ideal of democracy as a space of reasons — that’s the point of calling it an ideal. We may never achieve social justice either. But that does not mean it is not an ideal worth struggling to reach, and it would be perverse to give up striving for it just when it is under threat. That is exactly when we should look to our institutions and ourselves for new ways to solve the problem. Likewise, I’d urge, with the ideal that democracy is a space of reasons. It is precisely now, in the age of information pollution, when the ideal matters most.


Michael Patrick Lynch is a writer and professor of philosophy at the University of Connecticut, where he directs the Humanities Institute. He is the author of several books, including “In Praise of Reason: Why Rationality Matters for Democracy” and “True to Life: Why Truth Matters.”
