Thought for the Evening: Doxastic Synousia
When people speak of support for beliefs, they are often thinking in particular of evidential support and grounding -- one belief implies another, or else directly affects its probability. Part of the reason for this is, no doubt, that this is the easiest kind of support to model. There is also, I think, good reason to take this kind of support as being able, under the right conditions, to dominate other kinds of support. But any significant examination of how beliefs are confirmed shows that there do seem to be kinds of support that don't fit the standard models; that is, they don't involve any sort of strict logical implication, and at the very least we have no clear answer as to how they would directly affect probabilities -- and yet the assessments of everyone, and I mean everyone, are constantly affected by them. There is, broadly speaking, a way in which beliefs support beliefs not directly but by fitting each other in other ways. Call this relationship confirmation by synousia -- the beliefs support each other just by going well with each other, without even considering strictly evidential relations. Here are some examples, all of which I would suggest are capable of being entirely rational and none of which can, I think, be reduced to one belief making another belief more probable in any definitely specifiable way. (I don't think these are the only examples.)
(1) If you start with a well supported universal claim, and then find that you have reason to make an exception, it becomes easier to allow the possibility of exceptions. That is, our having good reason beforehand to think the universal claim was true gives us a sense that any exceptions are probably merely apparent exceptions; but once we know that there is an exception in some particular kind of case, this does not seem so sure -- an exception in one case suggests there could be exceptions in other cases. Now, this is not a matter of entailment; nor does it seem to be implied when we combine this with other general assumptions that we commonly make. Nor can we specify any definite way in which this is evidence that affects probabilities -- given a well supported universal claim and one well supported exception, we simply don't usually know what the probability of another exception is. Rather, what seems to have shifted is what we take to be a plausible candidate for evidence against the claim -- note, not what we take as evidence against the claim, but what kind of thing we take as seeming like evidence against the claim.
(2) Analogy certainly plays a role in confirming the beliefs of rational people, but attempts to reduce this confirmation entirely to making-more-probable clearly fail -- everything has an analogy to everything else, in varying degrees, and the degrees of analogy don't have any clear relation to any direct evidential assessment (for instance, it can depend on how important we take the analogue to be). What is more, in many cases it is obviously only an analogy. Thus, for instance, the analogy of ideas with organisms leads many people to give greater importance to theories that treat ideas as undergoing a kind of natural selection. What seems to be happening is that a habit of thinking that has been found successful in one domain carries over, when evidence is not definitely against it, to other domains that have certain obvious similarities, even if there are also obvious differences. This carry-over by analogy often precedes any definite evidence; indeed, it is the carry-over that often leads us to search for whether there is any evidence.
(3) People often recognize that some theory about X is more credible when, if the theory is assumed to be true, we can get a clear sense of how to discover new things about X. Something can be more believable than something else because we believe it opens more possibilities for further inquiry into things we believe to be important. This doesn't seem to be a matter of one belief making another more probable; again, we don't seem to have enough information for the probabilities, and apparent importance is clearly playing a role. It's more like prioritizing than probabilizing.
(4) Our beliefs seem to be subject to a kind of social pressure that is not itself evidential. If we believe that someone we know leans one way, we can lean that way ourselves. While we could make an argument that this is (very indirect) evidence, all the arguments I have come across are only able to do so by assuming that all support of belief for belief is evidential. In practice, we often don't seem to be actually using it as evidence for why we should believe; it just makes something more salient for belief, by the ordinary course of human sympathy.
(5) We seem sometimes to find it easier to believe something if we can easily fit it into a narrative or description we already have in place, even if it's not necessary for the narrative or description, and even if it, on its own, would not be a reason to believe the narrative or description -- the mere fact that we can put it into our already supported story or description without messing the story or description up tells in its favor. It's consistent with our prior belief, but I suppose it's not just consistent, because it's consistent in a particular way: it's consistent with the narrative or description we already have reason to hold, without making that narrative or description too cluttered or complicated to continue using it for what we use it for. It's also not just consistency with individual beliefs; it's a matter of how easily it can be accommodated in an already existing structure of belief.
These kinds of cases are easily overlooked, in part, I think, because they aren't in most cases the kinds of things that "lead you to believe" something; rather, they make things more believable. They don't usually function as evidence, because they aren't so much evidence for what they support as things that make evidence clearer, or more plausible, or more accessible, or perhaps make us more open to something's being a thing that could be supported by the evidence. And in practice, probably what they contribute most (but not, I think, solely) to rational life is to help stabilize belief -- i.e., to make it more resistant to change when there is a lot of noise in the evidence. My suspicion is that all the things that contribute to doxastic synousia are already operative well before we've actually believed something -- that is, they are strengthening relations already at work among guesses, suspicions, presumptions, hazy opinions, speculations, in short all sorts of cognitive states -- and something's coming to be believed doesn't eliminate this influence. Even though direct evidential support is obviously in many cases a more significant influence on belief, this doesn't mean that the background 'fit' with other beliefs, suspicions, and the like plays no role at all. And in all these cases the reason that the synousia seems irreducible to evidential support is that it's a regulative support, not a constitutive support like evidential support -- that is, one belief's fit with another belief is not itself a fact about what is believed but a fact about how our minds work when we are inquiring rationally. This is why 'seeming' keeps coming up in describing how it works.
Various Links of Interest
* Jeremiah Lawson reviews Roger Scruton's Music as an Art
* Matthew T. Segall and Tam Hunt discuss Alfred North Whitehead.
* Darwin has a good post on the 'Problem of Susan'. I've never thought the Problem of Susan was any sort of problem at all; it seems usually to be pushed as a problem either by (1) people who like Susan for other reasons or by (2) people who think they might be a bit like Susan. It's not particularly surprising that such people would prefer something different, but that's not really anything relevant to the story, particularly since what we are told is actually consistent with what we know of Susan directly from Prince Caspian and indirectly from Voyage of the Dawn Treader. Darwin's post got me thinking, however, about perspective, since we aren't told anything about Susan by the narrator, only by the characters, and Darwin seems right that what the characters say about her is affected by who they are. This is true of their evaluations as well. Peter, the High King, seems harshest -- he answers "shortly and gravely", and doesn't say anything other than that she is no longer "a friend of Narnia". Eustace, however, seems mostly irritated at Susan treating Narnia as a child's game, while Jill and Polly seem from their different perspectives to see Susan's attitude as a sort of exasperating silliness associated with her age (and the sort of maturity that she thinks goes with it).
(I also find it interesting that when Peter says she's not a friend of Narnia, he is not saying it as a general claim, but explicitly giving it as the reason why she's not there with everyone else -- that was the question Tirian had raised. The reason that Susan is not there is that she's the only one who's not yet dead; the others died in a railway accident. They were not all in the same part of the railway accident -- Peter and Edmund were on the station platform, while Lucy, Eustace, Jill, Digory, and Polly were on the train coming into the station, and it is clearly implied that they were all meeting together as 'friends of Narnia', which they occasionally did. It's not a claim, as some loons try to make it, that Susan is 'shut out of heaven'; what would Peter know about that, since at this point in the story he, like everyone else, hardly knows what's going on? It's why Susan isn't among those who were at the railway station: she no longer meets with the rest to talk about Narnia.)
* Richard Marshall interviews Elisa Freschi on Indian philosophy.
* Simon Newcomb's science fiction novel, His Wisdom the Defender. Newcomb is best known for his impressive work in applied mathematics, for being a good friend of Benjamin Peirce, and for (possibly) deliberately torpedoing the career of Benjamin Peirce's son, C. S. Peirce, with whom he had many disagreements.
* Stephen Boulter, The aporetic method and the defence of immodest metaphysics.
* John G. Brungardt, World Enough and Form: Why Cosmology Needs Hylomorphism (PDF)
* William Newton on a recent exhibition of Tolkien's paintings at the Morgan Library in New York.
* P. D. Magnus, Risk and Efficacy in 'The Will to Believe' (PDF)
Currently Reading
Charles Williams, War in Heaven
Plotinus, The Enneads
Xiong Shili, New Treatise on the Uniqueness of Consciousness