There is a thought experiment worth conducting before you attend your next meeting where someone is proposing a sensible change that will obviously improve things and is being met with the institutional equivalent of a brick wall wearing a tie.
Imagine a person who has spent thirty years building a self-concept around a particular set of skills and beliefs. They are, within that self-concept, competent, reasonable, and largely doing their best. Now introduce information that, if accepted, would require them to conclude that several of their foundational assumptions were wrong, that some of the outcomes they’re responsible for were preventable, and that the framework they’ve been operating within has been failing a category of people they were supposed to be serving.
Watch what happens. The information will not simply be evaluated on its merits. It will be evaluated on what accepting it would cost. And what it would cost is the story. The coherent, stable narrative of competence and good faith that the person has been maintaining, consciously and otherwise, for thirty years. That story is not given up easily. The nervous system does not reliably distinguish between a threat to physical survival and a threat to identity. Both activate much the same protective response. The information gets categorised as hostile rather than useful, the person presenting it gets categorised as a problem rather than a resource, and the system closes around itself like a fist.
Now scale that dynamic to an organisation of three hundred people, all of whom have their own version of the same story, and all of whom are embedded in a hierarchy that materially rewards the maintenance of existing frameworks and materially punishes the disruption of them. What you have is not malice. What you have is a system with a nervous system, and a nervous system under threat does what nervous systems under threat do: it protects itself first and evaluates the threat second, which is precisely backwards from what the situation requires.
This is the psychology of institutional resistance, and it is considerably more useful to understand it in these terms than in terms of stupidity or deliberate obstruction, though both of those exist and should not be entirely ruled out on empirical grounds.
The organisational psychology literature on this is substantial and rather depressing. Chris Argyris’s work on defensive routines, developed over several decades from the 1970s onward, describes how organisations systematically prevent the examination of the assumptions underlying their own behaviour. The defensive routine is not a bug in the system. It is, in a meaningful sense, the system doing exactly what it was built to do, which is to maintain stability and protect the people inside it from the discomfort of being wrong at scale.

Edgar Schein’s work on organisational culture adds the dimension of what he called “basic assumptions,” the beliefs so foundational that the organisation has stopped treating them as beliefs and started treating them as facts. You cannot challenge a basic assumption from inside the culture that holds it, because the culture will categorise the challenge as a misunderstanding rather than a critique. You are simply confused. The assumption is not in question.
What this produces, in practice, is an environment where the people most likely to notice what’s wrong are also the people least likely to be listened to when they say so. The new employee who hasn’t yet absorbed the basic assumptions. The neurodivergent team member whose pattern recognition is excellent but whose social calibration doesn’t include the instinct to soften a true observation until it’s unrecognisable. The person returning from extended leave who looks at the organisation with fresh eyes and cannot quite believe what they’re seeing. These people are not problems. They are, in the specific technical sense, information. The organisation’s response to them tells you almost everything you need to know about whether it is capable of learning.
The organisations that learn are the ones that have developed what Argyris called “double-loop learning,” the capacity not just to solve the problem in front of them but to examine whether the framework they’re using to identify problems is itself part of the problem. This is rarer than it should be, because it requires the people at the top of the hierarchy to be willing to have their basic assumptions examined, which is asking rather a lot of people whose position within the hierarchy is partly predicated on the stability of those assumptions.
None of this means change is impossible. It means change is harder than it looks, and that the people who attempt it need to understand that the resistance they encounter is structural rather than personal, which is cold comfort when you’re on the receiving end of it but is at least accurate, and accuracy is generally a better foundation for strategy than outrage.
The person who notices what’s wrong with the system is not the problem. They are, to put it bluntly, the most valuable person in the room. The tragedy is that most systems don’t find out until they’ve already spent considerable energy trying to get rid of them.
References
Argyris, C. (1990). Overcoming organizational defenses: Facilitating organizational learning. Prentice Hall.
Argyris, C., & Schön, D. A. (1978). Organizational learning: A theory of action perspective. Addison-Wesley.
Schein, E. H. (2010). Organizational culture and leadership (4th ed.). Jossey-Bass.
Senge, P. M. (1990). The fifth discipline: The art and practice of the learning organization. Doubleday.
Wikipedia contributors. (2024). Organizational learning. Wikipedia, The Free Encyclopedia. https://en.wikipedia.org/wiki/Organizational_learning

