1) Big Numbers: This trick exploits our difficulty in really comprehending how huge numbers behave, in order to claim that beyond a certain improbability, things become impossible. So in using this argument, you use the other tricks on this list to manufacture an incredibly huge number that is allegedly the odds against something happening, and then say "See, it's too improbable!" (The first sketch after this list shows why a tiny probability does not mean impossibility.)
2) Perspective errors: using a priori estimates of something to assess, after the fact, the likelihood of a specific event that actually occurred. Shuffle a deck of cards, and then ask: what was the likelihood of getting this specific order? A priori, any particular ordering has probability 1/52!, and yet some such ordering turns up on every single shuffle (see the first sketch after this list).
3) False independence errors: computing probabilities separately, and then combining them without considering how they interact. Given absolutely no information, what's the probability that my birthday is in July? 31/365. Then, separately, what's the probability that my birthday is in summer? 1/4. So, the fallacious reasoning goes, the probability of my being born during the summer in July is 1/4 × 31/365. But July lies entirely within summer, so the two events are completely dependent, and the true probability is just 31/365 (see the second sketch after this list).
4) Fake numbers: generally part of a big numbers argument. You want to inflate the numbers, so you include as many factors as you can. But some of them are hard (or even impossible) to figure out. So you just pull numbers out of the air and throw them into the equation (the third sketch after this list shows how quickly invented factors compound).
5) Misshapen search space: making some event look unlikely by a bait-and-switch; instead of computing its probability in the real setting in which the event would occur, you create a different setting where it's far more unlikely, and then compute the probability there (see the final sketch after this list).
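To make the first two tricks concrete, here is a minimal Python sketch of the deck-of-cards example (standard library only; nothing here is specific to any particular argument). It deals one shuffle and computes the a priori odds of the exact ordering that just occurred:

```python
import math
import random

# Deal one shuffle of a standard 52-card deck.
deck = list(range(52))
random.shuffle(deck)

# The a priori probability of the exact ordering we just dealt.
orderings = math.factorial(52)   # ~8.07e67 possible orderings
p = 1 / orderings                # ~1.24e-68

print(f"possible orderings:         {float(orderings):.2e}")
print(f"odds of this exact shuffle: {p:.2e}")
# An event with probability ~1.2e-68 just happened. Every possible
# ordering is equally improbable a priori, yet one of them occurs on
# every shuffle, so "too improbable to ever happen" is incoherent.
```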
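The false independence error from item 3 can be checked in a couple of lines; the 1/4 figure for summer is the same rough estimate used above:

```python
from fractions import Fraction

p_july   = Fraction(31, 365)  # P(birthday in July)
p_summer = Fraction(1, 4)     # rough P(birthday in summer)

# Fallacious: treat the two events as independent and multiply.
p_naive = p_july * p_summer   # 31/1460, about 0.021

# Correct: July lies entirely within summer, so "born in summer
# AND in July" is exactly the event "born in July".
p_true = p_july               # 31/365, about 0.085

print(f"naive product: {float(p_naive):.4f}")
print(f"actual:        {float(p_true):.4f}")
# The naive product understates the truth by a factor of four,
# because the two events are completely dependent.
```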
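And a sketch of the fake numbers trick from item 4. Every input below is, deliberately, made up; that is exactly the point:

```python
# Ten invented "one in a thousand" factors, pulled out of the air.
invented_factors = [1e-3] * 10

odds = 1.0
for f in invented_factors:
    odds *= f

print(f"manufactured improbability: {odds:.1e}")  # 1.0e-30
# The impressively tiny result inherits no rigor from the arithmetic:
# garbage factors in, garbage odds out.
```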
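Finally, a sketch of the misshapen search space switch from item 5. The target string and alphabet here are illustrative assumptions, not taken from any particular argument; the point is only the gap between the two settings:

```python
import random
import string

TARGET = "METHINKSITISAWEASEL"      # 19 uppercase letters (illustrative)
ALPHABET = string.ascii_uppercase   # 26 letters

# Switched-in setting: the whole string must appear in one uniform
# random draw. (1/26)**19 is about 1e-27.
p_one_shot = (1 / len(ALPHABET)) ** len(TARGET)
print(f"one-shot probability: {p_one_shot:.1e}")

# Cumulative setting: redraw each position until it matches, keeping
# matches as they occur. Expected cost is ~26 draws per position,
# a few hundred draws in total.
s = [random.choice(ALPHABET) for _ in TARGET]
draws = 0
for i, ch in enumerate(TARGET):
    while s[i] != ch:
        s[i] = random.choice(ALPHABET)
        draws += 1
print(f"cumulative process hit the target after {draws} draws")
# Same outcome, wildly different numbers: the "impossible" figure came
# from computing the probability in the wrong setting.
```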
I think we might be able to add a sixth, namely illicit anticipation of results, though this might be better considered a more general error that sometimes underlies one of these five and sometimes underlies other errors entirely. People often go into statistical and probabilistic arguments with assumptions that bias them toward one conclusion: they assume that a given type of argument will yield a given type of conclusion, without actually doing the mathematical work.
I would argue, incidentally, that the primary conclusion one should draw from all of this is that mathematicians, at least when interacting with the public or teaching people the basics, cannot let themselves rely on symbolisms and technical terms. Obviously mathematical research will be done in those terms, as will discussion of mathematics among mathematicians. But when dealing with non-mathematicians, there is too great a danger of their attributing too much to the symbolisms and terms, as if their use in itself conferred the rigor of mathematics. Of course, this is not true at all: mathematics done entirely without algebraic symbolisms and technical terminology would be just as rigorous as mathematics done with them. Doing mathematics entirely in natural language would be extraordinarily inconvenient, and would require an astounding amount of ingenuity to express truths and operations that could be put more easily and simply in technical terms and symbols. However, the rigor, or lack thereof, is in the reasoning, not in the clothing it wears.

But there is a great temptation (one almost everyone has to overcome in technical fields where we are not experts, so we are all in the same boat on this) to proceed as if the symbolism or jargon were the rigor. Mathematics, so often regarded as the paradigm of rigorous reasoning, is especially susceptible to this abuse. And thus it is important for mathematicians to take the trouble not to begin mystifying and amazing the masses with the symbolism when they need to be showing them, even if only by approximation, the wonders of the reasoning the symbolism so elegantly expresses.