
The Epistemologist's Moment


"Everything is Fine" by Jacques Rival

"Everything is Fine" by Jacques Rival


It’s our time. Or so philosophers like to think. Technological and cultural developments over the last 30 years have provided concrete data for philosophical theory-building. More than that, those developments demonstrate the practical relevance of philosophical work.

Self-driving cars raise the same kinds of issues as the long-mocked trolley problems. Developments in reproductive technology and life-prolonging techniques call out for input from medical ethicists. The crisis of mass incarceration requires intervention from ethicists as well, but also from epistemologists. We want to know how to respond to anthropogenic climate change, and ethicists, philosophers of science, and epistemologists all have things to say about that. Elon Musk – he of the 40 billion dollar net worth and 33 million Twitter followers – learned from epistemologists about the possibility that we're living in a vast computer simulation. Artificial intelligence demands contributions from logicians, ethicists, and philosophers of mind.

But nothing can match the opportunities provided for epistemologists by the 2016 U.S. Presidential Election. We had been preparing for this event for the previous 25 years without knowing it. Should you trust the conspiracy theories that went mainstream during the election? Epistemologists have been studying conspiracy theories since the mid-to-late 1990s. What should you do about fake news? Ever since C. A. J. Coady's 1992 book Testimony: A Philosophical Study, the literature on the epistemology of testimony has exploded. How much weight should you give to the opinions of people outside your echo chambers and to news sources that share alternative political views? I refer you to the massive post-1995 literature on the epistemology of disagreement. How should you respond to belief polarization and the fact of your various cognitive biases? Just check out the literature on the intellectual virtues and virtue epistemology, dating to Ernest Sosa's work in the early 1980s, but once again really coming into its own in the mid-1990s.

Into this fray comes Nathan Ballantyne's well-timed, rigorous yet accessible, engaging, and, ultimately, excellent Knowing Our Limits. In its final paragraph Ballantyne offers a description of the current age:

We are caught up in unjust quarrels, ill-founded lawsuits, hasty opinions, and badly organized enterprises. Public opinion is swayed by campaigns of misinformation and propaganda. Technological systems beyond anyone's control distort our most basic perceptions. What will we do with all of this? At the heart of our personal mishaps and collective tragedies, we find questions about knowledge and method, character and authority, opinion and ignorance.

Ballantyne's book is an attempt to demonstrate how epistemology can tell us how to practice inquiry in such a "time of crisis." This is part of a tradition of regulative epistemology – "the kind of epistemology that aims to provide guidance for inquiry". Once upon a time – as Ballantyne's second chapter tells it – regulative epistemology was a standard way of approaching epistemology. But after the heyday of the Vienna Circle, epistemologists seemed more interested in devoting themselves to highly theoretical matters such as outlining necessary and sufficient conditions for knowledge or devising solutions to Cartesian scepticism. Unlike some recent regulative epistemologists, Ballantyne does not disparage these more abstract projects. But he hopes to put discoveries in more theoretical epistemology to use in devising a method (referred to as "the method") you can use to decide what to believe and how to conduct inquiry.
Much as a biomedical ethicist might use theoretical work about utilitarianism in arguments about how to construct hospital ethics codes, so too might a regulative epistemologist use theoretical discoveries about the nature of knowledge (along with discoveries in cognitive science, psychology, and sociology) to figure out how to inquire about conspiracy theories. This is Ballantyne's "inclusive regulative epistemology."

Ballantyne focuses in particular on how you should approach areas of controversy. If you're like most people, you have beliefs that you recognize aren't shared by others. You'll even admit that many of those who disagree with you are just as knowledgeable and competent as you are. You can have this kind of disagreement, for example, about science, religion, politics, ethics, philosophy, nutrition, and child-rearing. Of course, in some cases, you don't think that those who disagree with you are as knowledgeable or competent as you are. Perhaps you think that those who deny the existence of anthropogenic climate change must either be unaware of the scientific facts, or be too influenced by personal biases to evaluate the evidence correctly. But, Ballantyne argues, in many cases you should realize that you don't have reasons to assess others this way, or you have independent reasons to distrust your assessments of others. For example, you might realize that your tendency to regard others as biased is just as likely to stem from your biased tendency to discount your own biases.

Many recent philosophers, in response to these sorts of concerns and motivated by well-known perils of social media, advocate an attitude of openness toward the possibility that you are wrong. There is some disagreement about what this attitude should be called: "open-mindedness," "epistemic modesty," and "intellectual humility" are common contenders. But the view that the attitude is a virtue is widely held. Ballantyne says that when you have this attitude – when you have "significant doubt" about whether you should believe something or how confident you should be – you are "doxastically open." His book is an extended argument for doxastic openness in a wide array of situations, especially those involving controversial issues.

Can you be unsure whether you should believe something, but still firmly believe it? Furthermore, can it be rational to be unsure whether you should believe something even while you firmly believe it? Perhaps. But once you are rightly unsure whether you should believe something, there will be powerful psychological and normative pressure to believe it less strongly. A recommendation to be open-minded, intellectually humble, or doxastically open is very close to a recommendation against having strong beliefs. That's why, though Ballantyne explicitly denies that he is arguing for the necessity of increased scepticism, the lesson of the middle five chapters of the book is clear: many of us should be less sure about many of our controversial beliefs.

For example, you might often find yourself faced with a volume of literature on some subject that you haven't read through. In many cases, you won't be able to figure out some specific way that the literature is biased (as you can when, for example, it's a tobacco-company-funded study showing that nicotine isn't addictive). When that happens – and, Ballantyne argues, it often does – you should reduce confidence in whatever controversial beliefs you have about that subject.

Intellectual humility and confidence-reduction are the lesson of other phenomena as well. Your controversial beliefs can be defeated by:

• awareness of your own biases
• the massive amount of evidence you don't have
• the fact that experts disagree with each other
• devastating counterarguments that could easily be presented by hypothetical disputants
• the fact that fields of inquiry in which you lack expertise or familiarity bear on your controversial beliefs
• the scope of disagreement

Ballantyne devotes five chapters to these potential defeaters for your beliefs. And while some of these difficulties are timeless – concerns about biases and expertise, for example, are age-old – most of them are exacerbated by mechanisms in the current age. Therefore, Ballantyne's solution to the perils of the current age at least tends toward reduction of confidence.

Again, in this, he is not alone. The expression "don't believe the internet" yields 7,360,000 hits in a Google search. This popular sentiment is reflected in more sophisticated treatments among columnists and philosophers, for it might seem that pulling back on confidence levels is just the antidote for the toxins of the current age. Want to escape an echo chamber? Be less susceptible to cognitive biases? Depolarize? Be less likely to be duped by fake news that happens to confirm what you already believe? In all these cases, it might seem, the answer is to be less sure that you're right and the other side is wrong.

It can be uncomfortable to admit ignorance about controversial issues. But, says Ballantyne, being doxastically open teaches you how to confront ignorance. It makes you comfortable with intellectual conflict. It gives you the ability to resist conformity. And it imbues you with a sense of wonder that can eliminate or at least outweigh any pains of confidence-reduction. Of course, it's not easy to achieve these goals. Ballantyne tells us the principles that he thinks will lead, in the end, toward confidence reduction. He doesn't tell us how to internalize those principles in a way that will allow you to be automatically guided by them. That's a matter, he argues, for cognitive science to figure out: teaching us the principles themselves is enough for one book!

But let us not downplay the downsides of confidence reduction. While giving up strong belief can make you more responsive to good counterevidence, it can also make you more responsive to misleading counterevidence and fake news. You can be inoculated against misleading evidence by strengthening your belief (this is one of the lessons of the psychological literature on "inoculation theory"). After all, you are rightly suspicious of news stories that run afoul of beliefs that are obviously true. If a news story according to which Barack Obama wasn't born in the United States pops up on your news feed, you will be more suspicious of this story – you will downgrade the evidence – than of a story according to which Ted Cruz was born in Canada. Your prior beliefs dispose you to think that the Obama story is less credible than the Cruz story. Without strong prior beliefs, you would be more prone to believing the Obama story. More generally, we want false beliefs – like climate denialism, racism, sexism, and anti-vaccination beliefs – to be appropriately sensitive to contrary evidence. But we don't want true beliefs to be overturned by misleading evidence. Strong beliefs provide checks on misleading evidence.
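The point about prior beliefs can be made precise in Bayesian terms. Here is a minimal sketch – my illustration, not Ballantyne's, with invented numbers – showing how a strong prior damps the force of a single piece of misleading counterevidence, while a weak prior lets the same evidence overturn the belief:

```python
# A minimal Bayesian sketch (an illustration, not Ballantyne's; the
# numbers are invented). A dubious story contradicting hypothesis H is
# treated as evidence E that is twice as likely to appear if H is false.

def posterior(prior, p_e_if_true, p_e_if_false):
    """Bayes' rule for a binary hypothesis: P(H | E)."""
    numerator = prior * p_e_if_true
    return numerator / (numerator + (1 - prior) * p_e_if_false)

P_E_IF_TRUE = 0.2   # chance the story appears even though H is true
P_E_IF_FALSE = 0.4  # chance the story appears because H is false

for prior in (0.99, 0.70, 0.50):
    post = posterior(prior, P_E_IF_TRUE, P_E_IF_FALSE)
    print(f"prior {prior:.2f} -> posterior {post:.2f}")

# prior 0.99 -> posterior 0.98  (strong belief barely moves)
# prior 0.70 -> posterior 0.54  (moderate belief takes a real hit)
# prior 0.50 -> posterior 0.33  (weak belief is overturned)
```

On these made-up numbers, the confident believer barely budges (0.99 to 0.98), while the doxastically open agent drops from 0.50 to 0.33 – which is good when the evidence is sound, and bad when it is manufactured.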


Photo: Naomi Oreskes

There are also political reasons to maintain strong belief in some cases. The powerful often benefit from the ignorance of the populace. As the science fiction author Walter M. Miller Jr. wrote in A Canticle for Leibowitz:

Ignorance is king. Many would not profit by his abdication. Many enrich themselves by means of his dark monarchy. They are his Court, and in his name they defraud and govern, enrich themselves and perpetuate their power.


That the powerful benefit from ignorance and seek to perpetuate it is one of the lessons of Naomi Oreskes and Erik Conway's Merchants of Doubt, in which they demonstrate the lengths to which various corporate and political interests have gone to maintain ignorance about, for example, climate change and second-hand smoke.

How can the powerful perpetuate ignorance? One way is to get you to believe falsehoods. But ignorance can also be perpetuated in other ways. Zeynep Tufekci notes in a 2018 column in Wired that:

The most effective forms of censorship today involve meddling with trust and attention, not muzzling speech itself. As a result, they don't look much like the old forms of censorship at all… They look like epidemics of disinformation, meant to undercut the credibility of valid information sources. They look like bot-fuelled campaigns of trolling and distraction, or piecemeal leaks of hacked materials, meant to swamp the attention of traditional media.

You've been duped when you believe fake news. But you play into the hands of the powerful just as much when you reduce confidence or give up belief because of the possibility of error. You are ignorant when you're wrong, but you are no more knowledgeable when you suspend judgement. Ballantyne details principles that entail that, often, you should reduce confidence or give up belief. But those principles can be exploited by the powerful as much as they provide good advice. For, as noted above, Ballantyne's principles require you to reduce confidence or give up your belief when confronted with:

• bodies of evidence you haven't yet canvassed, much of which plausibly constitutes evidence against what you believe, barring some special reason to discount that evidence
• experts who themselves disagree with each other, barring some special ability to tell which of those experts is reliable
• intellectual peers who disagree with you, barring some special reason to think that they are more biased than you are.

In all of these cases, the powerful can exploit these principles by manufacturing bodies of evidence that you lack the expertise to evaluate, playing up experts who plump for a wide range of controversial views, and convincing masses of otherwise intelligent, well-informed citizens that various heretofore universally dismissed views are true. All that is needed is a well-funded news outlet, popular social media networks, and a willing community of iconoclastic experts. There is a hope that, in each of these cases, you will have special reasons to discount the bodies of misleading counter-evidence, half of the relevant experts, and those intellectual peers who disagree with you. But, as Ballantyne argues persuasively in the book, you will often be left without the requisite special reasons.

So, the very principles that Ballantyne uses to demonstrate that you should reduce confidence in many of your controversial beliefs can also be – and are – exploited by the powerful in order to maintain your ignorance. But maybe this is just the way it goes. True principles can be exploited, after all, and perhaps that is what makes it so effective when, say, oil-industry-funded climate sceptics receive disproportionate face time from media outlets.

What should you believe once you realize that the powerful are exploiting true epistemic principles to foster ignorance? Do you reduce confidence or suspend judgement, knowing that this is what they hope you do? Do you join the epistemic resistance and maintain confidence, knowing that doing so violates true epistemic principles and is thus epistemically irrational? Traditional epistemologists, who are interested in, say, necessary and sufficient conditions for epistemic rationality, might insist that they have no business tackling these questions. What you are epistemically rational to believe might not be what, all things considered, you should believe. But regulative epistemologists, who are interested in how to conduct inquiry and form beliefs in the world as we find it, might not get to dodge these questions.


"Fake News" by The Human Heart

All of this has assumed that Ballantyne's principles are true. There is certainly intuitive appeal to the claim that, unless you have special reason to discount the factors Ballantyne sets forth – disagreeing peers and experts, unpossessed evidence, and others – you shouldn't be as confident in your beliefs. But let's not discount the intuitive appeal on the other side as well. All of us have intuitive impulses to say that we know various controversial claims to be true. I personally take myself to know that sexual orientation is morally neutral, that there is anthropogenic climate change, that a widespread program of vaccination reduces health risks, and even that there is no all-powerful, all-good being in the universe. I also know that lots of people – experts and non-experts alike – disagree with me and with each other about these claims. I know that they have presented arguments for their views, and that I haven't studied all of those arguments and am not necessarily qualified to evaluate them. But I nonetheless feel unconcerned. Am I thereby irrational? Do I need to wait until I find special reasons to undercut the counterevidence provided by these potential defeaters? Or can I rest content with what seems to me like knowledge that my beliefs are true? I see the intuitive appeal of Ballantyne's proposal. But I find it more intuitive that I can rest content with my knowledge that there is anthropogenic climate change.

Now, examples involving climate change, vaccinations, and the moral status of differing sexual orientations may not be fair to Ballantyne. His principles allow you to maintain strong belief when there is "compelling evidence" for the belief (as, for example, there is for the belief that the earth is not flat). Ballantyne explicitly singles out "human-caused climate change" as a case in which "the available evidence seems to demand a particular doxastic response": namely, belief. Presumably he'd say the same thing about the health benefits of vaccinations and the moral status of sexual orientation (though, I would think, he wouldn't say the same thing about my denial of the existence of God). Traditional epistemologists usually say that, when your evidence "demands" belief, your evidence is strong enough to confer knowledge: knowledge, after all, is traditionally thought to be at least justified (or "demanded") true belief.

One very important question for Ballantyne, then, is how many controversial beliefs constitute knowledge. This is not a question we can answer by invoking Ballantyne's principles, because his principles discuss what kinds of factors would require confidence-reduction on the assumption that you lack knowledge – on the assumption that your evidence does not demand belief. It's a question that requires traditional, theoretical, non-regulative epistemology to answer. We need to know what kinds of evidence are required for knowledge, and how much of that evidence is sufficient. Of course, this should be welcome news to Ballantyne, who never denies the regulative relevance of traditional epistemological investigation. But if I'm wondering whether it is permissible to retain my strong belief in the non-existence of God, say, despite the existence of expert and peer disagreement, as well as hypothetical and actual unpossessed evidence, Ballantyne's principles alone won't tell me.

It's not just the regulative epistemologist's moment. It's all of ours.

Jeremy Fantl received his PhD in Philosophy from Brown University in 2000 and is currently Professor of Philosophy at the University of Calgary. He works primarily in epistemology and is the author of The Limitations of the Open Mind (2018).


From The Philosopher, vol. 108, no. 2 ('Questioning Power').
