by Charlie Huenemann
Philosophers are prone to define knowledge as having reasoned one’s way to some true beliefs. The obvious kicker in any such definition is truth; for how am I supposed to determine whether a belief is true? If I already know what is true, why should I bother with some philosopher’s definition of knowledge? What’s the use of this stupid definition anyway? “Hey, I’m just doing my job,” replies the philosopher. “You wanted to know what knowledge is, and I told you. If you want to know how to get it, that’s another story — and for that you’ll have to pay extra!”
If we think of true beliefs as getting things right — really right, like if you asked God about it they would say, “Yep, that’s what I figure too” — then it is indeed difficult to see how we could ever know the truth, and not just because friendly chats with God are so exceedingly rare, but also because we don’t really know what we mean when we say “really right” instead of just saying “right”. The “really” is supposed to add some special oomph to the knowledge, an oomph we by definition can never experience or access: it is the knowledge of what is going on in the world when no one is knowing it, which is like trying to see what your face looks like when no one is looking at you. “Really”, in this context, just means: at a level that is impossible to attain. Trying to get something really right means never knowing for sure whether you in fact have it right.
Where does that leave us with regard to knowledge? Well, we could be pure-souled skeptics and insist that knowledge, real knowledge, is strictly impossible ever to attain. Or, being slightly more careful, we could at least insist that we can never know when we have it. Maybe God or other metaphysical chimeras are able to confidently pronounce whether this or that mortal attains knowledge, but these or those mortals can never know when they know. If this is the route we choose, we should simply strike the word “knowledge” from our vocabularies, as it is never going to come into any practical use.
On the other hand, if we think it is sometimes useful to say that we do indeed know some things and do not know others, then we might look toward what we really mean (er, sorry, just what we mean) when we make these claims. As near as I can figure, when I claim to know something, I am only announcing to myself or to others that henceforth I am going to act and speak as if that something is true. I am going to adopt that claim as a kind of policy guiding my future actions. So, for example, if I claim to know that Edward Kelley surreptitiously authored the Voynich manuscript, then I am adopting that claim as a policy to guide what I say and do. I will make the assertion in online discussions and in classrooms; I will scribble it in the margins of library books; I might get a t-shirt printed; I will hold forth on the matter in pubs, and so on.
Announcing such a policy also makes me available for criticism. If you have adopted a contradictory policy on the Voynich manuscript — thinking perhaps that Emperor Rudolf himself wrote it, or that aliens had a tentacle in the affair — then you and I are sure to have some words with one another. We have a clash of policies, and we will argue over which one it is most sensible to adopt. As we argue, you will point out to me that I do not have compelling evidence for my claim, which means my policy is going to make me look silly as I get stumped by some clearly legitimate problems and questions. Or you will show that my policy forces me to actively deny some other claims that experts take as their policies. Or, most disastrous of all, you might be able to point out that the policy I am adopting contradicts other policies I hold. (This is what made Socrates so annoying.)
On this view, what makes knowledge different from belief is that beliefs — or “mere beliefs” — do not put me into the same argumentative space. I don’t fully expect other people to adopt what I regard as merely what I believe. If I merely believe X, and I don’t claim to know it, then I don’t think you are irrational if you don’t believe X. It’s just a personal policy for me, so to speak, and you are free to adopt your own. We can discuss it, of course, but our discussion will be at a pretty low temperature. But when I claim to know X, that means in part that I think you are wrong if you do not also believe X, or even claim to know it as well. This attitude results from the policy I have adopted, which is to act and speak and insist as if X is true. That is what it means for me to think that X is true: it is for me to think that you are wrong if you deny X.
All of this also makes claims to knowledge quite a lot riskier than mere belief. In adopting the policy of acting and speaking as if X is true, I am opening myself up to all kinds of criticisms, arguments, incredulous stares, ridicule, name-calling, censure, and alienation. I am being so bold as to say that everyone who does not also believe X is wrong, and (as we say out west) them’s fightin’ words. I might eventually end up having to eat humble pie and confess that I was wrong to think that X is true, and that everything I did or said in support of X was misguided. That was a bad policy I had committed myself to, as it turns out. I shall hope for others’ forgiveness.
Now someone may object that this account makes knowledge too easy. For can’t anyone decide to adopt a policy about anything, and thus (on my view) be said to have knowledge? Well, note first that, yes, people do in fact adopt policies about the darndest things and claim to know them. It’s a fad of overconfidence that is way out of control, in my opinion. But we only kick up a fuss when other people’s knowledge policies collide with our own. It is at that point that we start shooting links to each other and appealing to experts, etc., etc., in the hope of convincing each other that ours is the policy a rational person should adopt. But, you ask, in such an inglorious dispute, which of us really has knowledge? Uh-oh. I thought we had already warned against that move. The answer to the question of who has the “real” knowledge is worked out in real time, using all available strategies to figure out, in conversation with others, the policy a rational person should adopt, and accepting the risk of failure.