Friday, November 28, 2008

The aim of credences

"Beliefs are to truth as degrees of belief are to _____"

How do we fill in the blank? Certainly not with 'degrees of truth'. If I have a .8 credence that 'Crime and Punishment' is on the bookshelf, it is not the case that I'm representing some kind of vague state of affairs.

A much more plausible thing to say here is that degrees of belief represent the chance or likelihood that the world is thus and so. The spirit of this proposal is that there is some objective standard of chance or likelihood that our credences are attempting to latch on to. Now while it is entirely plausible that our world is chancy, I do not think that this is an adequate characterisation of what our degrees of belief are aiming at. We have credences about a whole host of things, most of which are not chancy, or not chancy in the right way. There is no question of chance with respect to whether 'Crime and Punishment' is on the bookshelf. It either is or it isn't. If I were to flip an indeterministic fair coin, then there is a 50% chance that the coin will land heads, and in that respect my credence of .5 that the coin will land heads accurately represents the chance that it will. However, if I have already flipped the coin, but have not yet seen the result, then the matter is no longer chancy: the coin has either landed heads or it hasn't. And yet I still have a .5 credence that it has landed heads. If I have a .5 credence that it has landed heads, and in fact it has landed heads, then on the suggestion that credences represent chances I have misrepresented the world. I consider this an unhappy consequence. In fact, what we ought to say about the coin before it is flipped is that we have a .5 credence that it will land heads and a credence of 1 (all other things being equal) that there is a 50% chance that the coin will land heads.

There are two good-making features of doxastic attitudes. One is truth, and the other is justification. We want an epistemic theory that respects both of these aims. The problem is that it is very difficult to respect both at the same time. Suppose I have a high level of justification for the proposition that p. What doxastic attitude should I adopt with respect to p? Certainly I should not have a credence of 1 in p. While I have a high level of justification for p, it is not that high. On the other hand, if I set my credence to something less than 1, and in fact p turns out to be the case, then I have deprived myself of being fully right about p. I have missed out on something good, as it were. So we have a dilemma. Set your credence too high and you fail to meet the norm of justification. Set your credence too low (anything less than 1) and you fail to meet the norm of truth. Certainly you have something going for you if you don't set your credence to 1; p might have turned out to be false. If it had turned out false, you would have avoided doing something bad (we can suppose that believing falsely is as bad as believing truly is good). So never setting your credences to 1 or 0, and never believing anything (if that is a distinct notion), allows you to avoid ever believing something false. However, you also prevent yourself from ever believing anything true.

Let's make the following simplifying though implausible assumption: if you believe something true then you score 1 point, and if you believe something false you lose a point. You're trying to score as many points as you can. In this case, you would be better off believing that p if you have a high credence that p than not believing that p. You're quite confident that you will score more points this way. But isn't it wrong to believe that p if you're not absolutely confident that p? Haven't you failed to meet the norm of justification? Well yes, you have (I say), but that may be a reasonable price to pay. You can't adequately meet both norms at once, but if all you tried to do was meet the justification norm, then you would never score any points on the truth norm. Attempting to meet both norms at once requires a trade-off. (I don't have any plausible way of measuring the trade-off at this stage).
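The trade-off in this toy scoring scheme can be made explicit. Under the assumed point values (the numbers and the function name here are just for illustration), believing p at credence c has an expected score of c − (1 − c) = 2c − 1, which is positive exactly when c exceeds .5:

```python
# Toy scoring rule from the example: +1 point for a true belief,
# -1 point for a false one. Withholding belief scores 0.
def expected_score(credence):
    """Expected points from outright believing p, given your credence in p."""
    return credence * 1 + (1 - credence) * (-1)  # simplifies to 2*credence - 1

# With a high credence, believing beats withholding in expectation:
assert expected_score(0.8) > 0   # an expected gain of roughly 0.6 points
assert expected_score(0.5) == 0  # at credence .5 you expect to break even
```

On this accounting, any credence above .5 makes outright belief the better bet in expectation, which is why the justification norm gets traded away.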

"Beliefs are to truth as degrees of belief are to _____"

There is no way to fill in the blank, for there is no parallel job description for degrees of belief. Degrees of belief are not in the business of representing the world like beliefs are. Degrees of belief aim at representing your level of justification, while beliefs aim at truth. It is (relatively) easy to say what degree of belief you ought to have in p: it should match whatever level of justification you have for the truth of p. Of course it's very difficult to say what justification is, how we get it, and how much of it we have, but once we have answers to these questions it is straightforward to say what level of credence one should have. It is considerably harder to say what you ought to believe given your credences. That requires some kind of trade-off. That requires saying what the value of truth is and what the value of justification is. It seems to me that the value of justification is entirely derived from the value of truth. If that thought is right and if, as seems obvious, we should rarely have degrees of belief of 1, then proponents of replacing belief talk with talk of degrees of belief are offering us an epistemology bereft of aim and value.

(Thanks to Alan Hájek for arousing my thoughts on this issue).

3 comments:

Anonymous said...

Thanks for this terrific blog, Leon - some compelling ideas here.

Suppose your credence that a given radium atom decays in a year is 0.001. You then look up a table of chances in your infallible physics book, and learn that the chance of that decay is 0.001. Wouldn't you feel vindicated? And that needn't be because you fully believed that the chance was 0.001 (and so got the vindication of having a true belief). You may have arrived at that figure by averaging over various live hypotheses about the chance. Still, your credence of 0.001 was vindicated.

Leon Leontyev said...

I'm inclined to say that I would not feel vindicated in the situation you describe. It strikes me that if you don't also 'fully believe' that the chance of the decay is 0.001 then whatever the basis of your credence, it will match the chance of decay only in virtue of luck.

Here is an analogous situation that will hopefully demonstrate why.

Suppose you have a coin in your pocket which you tell me is either double-headed or a normal indeterministic coin. You then ask me what credence I have that you will flip tails. For the sake of argument, assume a principle of indifference between the hypothesis that it's the double-headed coin and the hypothesis that it's the normal coin (perhaps you tell me that you flipped a fair coin to decide which coin to put in your pocket). It seems clear that I ought to have a credence of .25 in the coin landing tails.

But now suppose that as a matter of fact you have the normal two-sided coin in your pocket, but what you did not tell me is that the coin is biased towards heads. It has a .75 chance of landing heads and therefore a .25 chance of landing tails. In this situation I don't feel at all vindicated that my credence in the coin landing tails matches the chance of the coin landing tails. The two have nothing to do with one another. I arrived at my credence by averaging over what I thought the chance of tails was if the coin was normal, and what I thought the likelihood was of the coin being normal.
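The averaging just described can be spelled out; the numbers below are the ones assumed in the example (indifference gives each coin hypothesis a probability of .5):

```python
# Credence in tails, computed by averaging over the two coin hypotheses.
p_normal = 0.5           # credence that the coin is the normal coin
p_tails_if_normal = 0.5  # chance of tails if it's a normal fair coin
p_tails_if_double = 0.0  # a double-headed coin can never land tails

credence_tails = p_normal * p_tails_if_normal + (1 - p_normal) * p_tails_if_double
print(credence_tails)  # 0.25

# The biased coin's actual chance of tails also happens to be .25, but no
# term above mentions the bias -- the match between them is pure luck.
```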

Similarly, if I arrive at my credence about the decay of the atom, but not via an accurate belief about the chances of decay, then I can't see how the match between credence and chance can be in virtue of anything other than luck, and therefore I ought not to feel vindicated.

Anonymous said...

In the case you imagine, I say that you are VINDICATED BY LUCK. There's nothing strange about that. One can have true beliefs by luck. And so it is with credences, I say.