by Joseph Shieber
There’s an interesting reaction that I sometimes get from my colleagues in the natural sciences when I describe what I do. When I talk about epistemology – the study of knowledge – I often hear a version of the following response.
“Well, in the sciences we don’t really deal with knowledge at all. At best, we have a high degree of confidence in a claim, but we’d never say that we know it.”
In the faculty dining room, there’s seldom time to discuss philosophy seriously with faculty from other disciplines. Besides, if I tried it, I might find myself sitting alone in the very near future. So I thought I’d take this opportunity to respond to my (imaginary) colleague.
To do so, I want to start by considering an argument of Saul Kripke’s. Kripke achieved fame early as a philosophical prodigy. He enjoyed widespread acclaim within the philosophical community first for his work in modal logic and later for his work in metaphysics, philosophy of language, and the philosophy of Ludwig Wittgenstein.
Kripke’s reputation also stemmed from his virtuoso lectures. He was able to lecture without notes on complex topics, complete paragraphs tumbling out of his mouth seemingly effortlessly. He was equally known as someone reluctant to commit his ideas to paper, so for years many of the arguments attributed to him circulated in samizdat versions taken from notes on his lectures.
In the years in which many of Saul Kripke’s arguments circulated by word of mouth or in third-party lecture notes, one of the most famous was what we can call the “Paradox of Knowledge” argument.
Kripke himself has published a version of the argument in a recent paper, drawn from notes for a lecture he gave to the Moral Sciences Club at Cambridge University in 1972. Here’s my paraphrase of the argument; I’ll comment on some of the steps as I go along.
(1) If someone knows that p and they also know that p entails q, and on that basis they infer q, then they also know that q.
This is the deductive closure of knowledge under known entailment. For example, suppose I know that Hattie drew a triangle. Suppose further that I know that something’s being a triangle entails that it has three sides. If, on the basis of that knowledge, I infer that Hattie drew a three-sided figure, then I thereby know that Hattie drew a three-sided figure.
(2) For any proposition p, if p is true, then any evidence against p is misleading.
Evidence against p is evidence suggesting that p is false. Since we’re to assume p is true, then any evidence against p would have to be misleading.
(3) Now suppose that someone – call her Henrietta – knows that p, and also knows that the claim made in premise (2) is true.
Let’s assume that Henrietta draws the appropriate inference from her knowledge of p and of (2). Then we know:
(4) Henrietta knows that any evidence against p is misleading.
(5) If someone knows that evidence is misleading, then of course they have no obligation to pay attention to that evidence. This means:
(6) Henrietta has no obligation to pay attention to any evidence against p.
If someone has no obligation to pay attention to something, then it is okay for them to ignore it. So:
(7) It is okay for Henrietta to ignore evidence against p.
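For readers who like their arguments compressed, here is a schematic rendering of steps (1) through (7). The notation is my own gloss, not Kripke’s: “K” marks what Henrietta knows, “Against(e, p)” says that e is evidence against p, and “Misleading(e)” says that e is misleading.

```latex
\begin{align*}
&\text{(1)}\;\; \big(K(p) \wedge K(p \rightarrow q) \wedge \text{infers } q\big) \rightarrow K(q)
  && \text{closure under known entailment}\\
&\text{(2)}\;\; p \rightarrow \forall e\,\big(\text{Against}(e,p) \rightarrow \text{Misleading}(e)\big)
  && \text{true claims have only misleading counter-evidence}\\
&\text{(3)}\;\; K(p) \wedge K\text{(2)}
  && \text{assumption about Henrietta}\\
&\text{(4)}\;\; K\big(\forall e\,(\text{Against}(e,p) \rightarrow \text{Misleading}(e))\big)
  && \text{from (1)--(3)}\\
&\text{(5)--(7)}\;\; \text{evidence known to be misleading may be ignored}
  && \text{bridge to the permissive verdict}
\end{align*}
```

Laid out this way, the argument’s load-bearing joints are visible at a glance: the closure principle in (1) and the bridge from knowledge to permission in (5).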
Now, Kripke thinks that it is obvious that it is NOT okay for us to ignore evidence against our beliefs, even if they amount to knowledge. Here’s how Kripke puts the point (and catch the adroit humblebragging – given that, as I noted, it is well-known within the philosophical community that Kripke first rose to prominence on the basis of those “certain papers in modal logic”):
The commonsense view is, for example, that you do know that I have written certain papers on modal logic but that future evidence could lead you to change your mind about this. So, you should rationally leave yourself open to such changings-of-mind, even though it is the case that you know that I wrote these papers. The question is, why?
Kripke’s “Paradox of Knowledge” seems ideal for discussing the sort of hesitancy about knowledge that my colleagues express. If the argument itself is sound, then it would seem that having knowledge entitles you to ignore new evidence. And, as my colleagues from the natural sciences note, openness to new evidence is a cardinal virtue of scientific thinking.
What makes Kripke’s “Paradox” a paradox, of course, is that the argument seems sound, but its conclusion is incompatible with the philosophical consensus about knowledge.
Most philosophers nowadays think that knowledge is fallible. In other words, they think that you can know something without its being certain for you.
Here’s an example. I know right now that my kids are still in their rooms upstairs asleep – or just waking up – as I type these words on a laptop in my study downstairs. But of course I can’t see them right now. And the house is quiet; I’ve woken up early to work on my monthly 3QD contribution. So for all I know, the kids might not be there.
I don’t want to engage in the sort of catastrophic thought experiments (kidnappings, etc.) that are the contemporary philosopher’s stock-in-trade, so I’ll leave it to you to imagine why my kids – eight and four – might not be safely ensconced in their rooms. My point here is just that it’s not certain that they’re there. Nevertheless, it seems to me that I can know that they are, even without going upstairs now to check on them.
Since knowledge doesn’t have to be certain, it would still be possible to gain additional evidence for something that you know – at least for those things that you know with less than absolute certainty.
You see your coffee mug next to your favorite chair. But wait – is it really your mug? You turn it over and look for the little chip on the bottom that happened when you accidentally set it down too hard on the corner of your desk.
It seems plausible to me that the initial, visual evidence of seeing your coffee mug was enough for you to know that it was yours. Nevertheless, by observing more closely and consulting your memory about the history of the mug, you gain even greater evidence that the mug is yours. The important point for me right now is that this additional evidence that you gain is evidence over and above the evidence you need for knowledge.
If this is correct, however, then having knowledge should be compatible with being open to new evidence. Even if you already know something, you might still value having ever more certain knowledge!
So something indeed seems paradoxical about Kripke’s “Paradox of Knowledge”. The consensus view that knowledge can be fallible – that it need not be certain – lends itself to the idea that we should be open to new evidence – even about things that we know. So how do we reconcile this consensus view with Kripke’s argument?
I want to focus on the place in the argument where having knowledge starts to have practical implications. That’s at step (5).
It’s worth pointing out, though, that it’s not clear that knowledge is the problem.
We can reformulate that first step in terms of “having a high degree of certainty” – the phrase that my natural science colleagues favor:
(1’) If someone has a high degree of certainty that p and they also know that p entails q, and on that basis they infer q, then they also have a high degree of certainty that q.
(I’m leaving “they also know that p entails q” as-is, because entailment is a logical relation.)
Step (2) deals with truth, not knowledge, so just with steps (1’) and (2) we could conclude that Henrietta has a high degree of certainty that any evidence against p is misleading. And if Henrietta’s degree of certainty is high enough, it might well be prudent for her to ignore the evidence against p as misleading. (I’ll come back to this point in a moment.)
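As an illustrative aside (not part of Kripke’s argument): one way to model “a high degree of certainty” is as a probability close to 1, and on that model a single piece of counter-evidence barely moves a near-certain belief, while the very same evidence substantially moves a merely moderate one. The numbers below are purely hypothetical, chosen only to make the contrast vivid.

```python
# Illustrative Bayesian sketch (my gloss, not from the essay): how much does
# one piece of counter-evidence move a belief held with high certainty?

def posterior(prior: float, likelihood_if_true: float, likelihood_if_false: float) -> float:
    """Bayes' rule: P(p | e) from P(p) and the likelihoods P(e | p), P(e | not-p)."""
    numerator = likelihood_if_true * prior
    return numerator / (numerator + likelihood_if_false * (1 - prior))

# Henrietta observes evidence e that is four times as likely if p is false
# as it is if p is true.
high_certainty = posterior(prior=0.99, likelihood_if_true=0.1, likelihood_if_false=0.4)
moderate = posterior(prior=0.60, likelihood_if_true=0.1, likelihood_if_false=0.4)

print(round(high_certainty, 3))  # ~0.961: the near-certain belief barely moves
print(round(moderate, 3))        # ~0.273: the same evidence flips a moderate belief
```

On this toy model, the near-certain Henrietta loses almost nothing by setting the evidence aside, which is one way of cashing out the thought that it might be prudent for her to ignore it.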
So it looks like we can reach the same conclusion – that it’s permissible to ignore new evidence – without appealing to knowledge at all! And this is why at least some scientists have turned their ire on truth, rather than knowledge. (See, as a recent example, Julia Shaw’s essay “I’m a scientist, and I don’t believe in facts”.)
With this detour in place, we can now return to Kripke’s argument – and, in particular, step (5).
According to step (5) of the argument, if you know that some evidence is misleading, then you have no obligation to pay attention to it. As stated, however, this step is simply false.
Consider the following case. I teach a seminar for first year students on the topic of propaganda. Every year, I have students who attempt to argue that there is a scientific controversy about human-caused climate change. I know that they are wrong about this, and that any evidence that they present must therefore be misleading. Nevertheless, as an educator it is my responsibility to pay attention to the evidence they present, and to try to help them to appreciate where they’ve gone astray.
So one reason why you might wish to pay attention to misleading evidence, even if you know that the evidence is misleading, is that you want to engage with those presenting it and help them to understand that it is misleading.
Another reason why you might want to pay attention to such evidence is that answering the challenge to your knowledge might help you to understand it more deeply. Or refuting the misleading evidence might give you ever more certain knowledge – even more evidence than you need for mere knowledge!
Ah, but perhaps you’ll object that this isn’t the sort of “paying attention” that Kripke has in mind! When he thinks of paying attention to contrary evidence, he thinks of it along the lines of “leaving yourself open to … changings-of-mind”. And if you genuinely know something, then it wouldn’t do to have your mind changed – if you’ve already arrived at the truth, letting go of that belief would not be a change for the better!
One way that you might think about Kripke’s “Paradox” is that knowledge has two aspects. On the one hand, it involves an objective component: what you know has to be true, and the support that you have for your knowledge has to be objectively strong (biases or wishful thinking don’t count!). But on the other hand, your knowledge involves a subjective component too. At the very least, knowledge entails belief: if you know that it’s raining, then, on the standard conception of knowledge, you believe that it’s raining.
The subjective component might help to explain why we’re uncomfortable with the fact that knowledge seems to license you to ignore evidence you know to be misleading. That’s because often you merely think that you know, but are mistaken. All too easily, you can have the subjective components of knowledge – belief, say, and perhaps what you take to be good evidence – without having the objective components. In other words, you can think that you know, but what you think you know can turn out to be false! And this can be true even for deeply held beliefs.
So one reason why it is important to pay attention to what you take to be misleading evidence is that when you think you know something, you’re not always right!
This diagnosis strikes me as correct. But I’m not sure that it’s so obvious in every case that you should pay attention to what you believe to be misleading evidence. To see this, let’s think a little bit more about the case where you have a high degree of certainty that what you believe is true, and therefore that any evidence to the contrary is misleading.
Suppose you’re a research scientist, and in the course of developing your research program you have come to believe some hypothesis H with a strong degree of certainty. Suppose further that, to advance your research, it is most efficient for you to operate under the assumption that H is true; it would require many years of completely new work to abandon that assumption and question the truth of H. In that case, it strikes me as not irrational for you simply to pursue your research and not explore what you take to be misleading evidence that H is false.
Now, in writing this I emphasized the “you”. That’s because, although it is not irrational for individual scientists to pursue their research programs and not constantly to attempt to second-guess their own research, it is in the interest of the enterprise of science for other scientists to attempt to disprove hypotheses – even those that are held with a high degree of certainty by their colleagues.
Indeed, there is a high-risk/high-reward structure to the pursuit of disproving a hypothesis that enjoys a high degree of certainty within some scientific community. High-risk, because if many very smart people believe it, there’s a good chance it’s correct. High-reward, because if you happen to be the person who does disprove that widely-held hypothesis, you’re likely to enjoy a great deal of notoriety. (For more on this sort of analysis of the structure of science, see Philip Kitcher’s The Advancement of Science.)
So my reaction to Kripke’s “Paradox” – taken to involve the tension between having knowledge and being open, more specifically, to “changings-of-mind” rather than merely to further evidence – is twofold.
First, suppose that Kripke really is interested in genuine knowledge. In that case, I don’t think there is a paradox. That’s because the conclusion of the argument is one that we should actually embrace! If you really know something, then it would be a mistake to change your mind.
Second, Kripke’s “Paradox” seems paradoxical because we confuse genuine knowledge with merely taking ourselves to know something. And merely taking yourself to know something doesn’t license you to ignore evidence that conflicts with what you take yourself to know.
Even in this second case, however, we’ve seen reasons why it might be prudent for individual scientists to focus on pursuing research based on what they take themselves to know (or to be highly certain about). Part of the beauty of the structure of the scientific enterprise is that other scientists have an interest in serving as a check on their colleagues’ confident (or overconfident) claims.
Regardless, it seems to me that neither knowledge nor truth is incompatible with the scientific enterprise. That much, I know.