One of my interests is the phenomenon of what I call rational cascades. We often model reasoning as if it consisted of tidy little arguments: organized premises leading to conclusions by well-defined rules of inference, each argument essentially standing alone. But this is obviously not how human reasoning actually works, even among very rational people. Many of our inferences are, of course, probabilistic; but even more important than this, I think, is the fact that refuting a premise doesn't just affect the argument immediately at hand: changing that premise is likely to affect a whole host of other arguments, inferences, and estimates, even if one does not yet know how. Other things might have depended partly on analogy with that premise, or been made probable by it, or be implied by it in combination with other things that have not changed; it might affect your assessment of evidence elsewhere, or your evaluation of certain kinds of arguments.
In practice, this does not occur all at once, but slowly, as our minds work out their views more consistently. A change causes a cascade, but in human beings the cascade takes time to propagate. This, I think, explains a common phenomenon: coming to realize that you already believe, or no longer believe, something. I once read something about Eddie Izzard describing how he was performing one night and suddenly had the intense realization that he no longer believed there was a God -- not that he suddenly came to the conclusion that there was no God, but that he suddenly realized he already didn't believe there was a God. That sounds a bit odd if you think of human reasoning only in terms of set arguments, but it makes sense in terms of cascades: the foundations had eroded, and it just took time to notice. One also finds plenty of cases in the opposite direction: people who suddenly realize that they've believed in God for a while without expressly putting it that way. In essence, some change happened that committed them, but working out the commitment and its effects was not necessarily straightforward. And we find this sort of thing across many different fields.
Of course, by the same token, given that we play off each other, rational cascades can happen in a more complicated way between individuals; thus when you look at the history of logic, you find that the area of logic, broadly construed, in which people have most often considered issues relevant to rational cascades is rhetoric. In a sense, in fact, you can think of rhetoric as the logic of rational cascades for human minds. Newman's An Essay in Aid of a Grammar of Assent, for instance, which puts itself forward as not so much a logic as a rhetoric, could very well be read as a partial theory of rational cascades. We tend not to think of rhetoric as having much to do with logic in itself, but that separation is, of course, a relatively recent development.
The difficulty with dealing with rational cascades, of course, is that they are (1) complicated; and (2) catastrophic. They are complicated because there are so many possible relations among ideas and judgments (analogical, probabilistic, implicative, associative, etc.). And they are catastrophic because they really are cascades: there are changes and changes and suddenly some critical threshold is reached and the whole thing moves. The human mind works as a sort of sandpile. Add a grain or take a grain and the sandpile is much the same as it was; but add or take away enough and a few more grains force the whole pile to adjust. But it's clear that some grasp on them is possible: kinds of behavior and typical causes can be identified, and to some degree have been. It's also clear, however, that doing so requires going beyond what we normally think of as logic to look at how character, environment, and social interaction affect reasoning.
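The sandpile image here corresponds closely to the Bak–Tang–Wiesenfeld sandpile model from the physics of self-organized criticality, and a small simulation makes the threshold behavior vivid. The sketch below is purely illustrative -- the grid size, the toppling threshold of four, and the helper names `topple` and `drop_grains` are my choices, not anything in the text: most single grains change almost nothing, but occasionally one grain triggers an avalanche that rearranges much of the pile.

```python
import random

def topple(grid, n):
    """Relax the pile: any site holding 4+ grains sheds one grain to each
    of its four neighbors (grains falling off the edge are lost).
    Returns the number of topplings -- the 'size' of the cascade."""
    size = 0
    unstable = True
    while unstable:
        unstable = False
        for i in range(n):
            for j in range(n):
                if grid[i][j] >= 4:
                    grid[i][j] -= 4
                    size += 1
                    unstable = True
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < n and 0 <= nj < n:
                            grid[ni][nj] += 1
    return size

def drop_grains(steps, n=10, seed=0):
    """Drop grains one at a time at random sites on an n-by-n pile,
    recording the size of the cascade each drop provokes."""
    rng = random.Random(seed)
    grid = [[0] * n for _ in range(n)]
    sizes = []
    for _ in range(steps):
        grid[rng.randrange(n)][rng.randrange(n)] += 1
        sizes.append(topple(grid, n))
    return sizes

sizes = drop_grains(2000)
# Many drops cause no toppling at all; a few trigger avalanches
# that cascade across large parts of the pile.
print("largest cascade:", max(sizes))
print("drops with no effect:", sum(1 for s in sizes if s == 0))
```

The point of the analogy survives the simplification: each individual grain (each refuted premise, each new piece of evidence) looks negligible on its own, and the system absorbs most of them quietly; but once the pile is near its critical state, one more grain can force a wholesale readjustment.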