However, there are good grounds for the assumption that people align to a perceived status quo even in the face of counter-evidence. In the 1950s, the social psychologist Solomon Asch conducted his famous conformity experiments. Subjects had to solve fairly obvious perceptual tasks, but many gave wrong answers in order to align with the group: they disregarded the evidence right in front of them in order not to stray from the status quo. Since then, the experiments have been repeated under various conditions, showing the detrimental effects of social pressure.
This is a common view, but it is not, I think, what the Asch conformity experiments showed. A few salient points:
(1) Very few people conformed all the time -- almost everybody resisted conforming sometimes. About a quarter of the people consistently refused to conform.
(2) Looking at responses rather than people, only a little more than a third of the responses were conforming wrong answers.
(3) Asch also looked at what people said about their own behavior. While some people did in fact conform simply in order to conform, one of the most common responses was, at least in a broad sense, self-critique: a significant portion of the participants assumed that they had misunderstood the instructions. Later examination and experimental variations have also made clear that answers the experimenters thought obvious were not always obvious to the participants; a number of participants were just honestly uncertain -- they could usually tell the right answer, but they were not confident about it, and so were in fact checking their results against what other people were getting.
(Likewise, there may also be some reason to think, although it seems to be a less-studied phenomenon, that when people defer to the confident or apparently authoritative, it is often because they are thinking not about who is right but about who is responsible. Confidence or authority is often seen as a way of assuming responsibility for the answer, and when people do not see themselves as having a specific reason to assume responsibility, they are willing to let someone who apparently wants to assume it do so instead. One sees something like this dynamic in group projects and discussions all the time.)
(4) And that relates to the most important point, which is that the Asch conformity experiments tell us nothing about the detrimental effects of social pressure; no detrimental effects were examined in the experiments, and 'being willing to recognize that you might have misunderstood' and 'being willing to check your answers when uncertain' do not necessarily have any detrimental effects, even though circumstances were deliberately rigged in this particular case to push toward a wrong answer. Thus the conformity experiments show us a more balanced picture than they are often said to show: people have considerable resistance to social pressure, but are willing to give other people some benefit of the doubt.
If any of this is right, it causes a problem for Lenz's larger argument; he wants to argue that relentless criticism in philosophy leads to conformity, but in reality one would expect that (1) a lot of people will not in fact conform; (2) a lot of answers will not in fact be affected; (3) often people will not be actively conforming but just seeing where an apparently active line of thought goes; and (4) whether there is anything unfortunate in it all will depend on what other habits, dispositions, and the like are cultivated. He also has to assume, I think, that relentless criticism all pushes in one direction, which most people would find a counterintuitive assumption to make, at least in most cases, about adversarial culture in philosophy.
Lenz does give, though, what I think is the real problem with adversarial culture -- it may be very good at identifying problems, but it is very poor at constructing solutions. I've noted before that people regularly fail to make a proper distinction between research problems and fatal problems, and of course full philosophical bloodsport is no good for anyone. But I also think that some of the things that Lenz diagnoses as due to adversarial culture I would instead diagnose as due to treating persuasion as a major goal of philosophical argument, which often seems responsible for the pressure to conform to begin with.