Conversation & Risk Management

The beauty of conversation is the exchange of ideas. We may not always agree, but it is the best human method for listening and learning. But what happens when you think you have exchanged ideas, one party nods as if he has listened, but then turns around and does the opposite?

I was recently asked to assess a company because human errors were costing money. Management asked me to find so-called 'learning gaps.' I found them, but not only at the workers' level: they sat mainly in management. The managers were holding on to a top-down, hierarchical management system that was outdated and blocked the information feedback flow needed to control, manage and steer the company. So I advised that management needed training and had to learn a new way of thinking: systems thinking and cybernetics. Despite the proof I offered and the solutions I suggested, these were contemptuously rejected, as if Bertrand Russell were still alive. He wrote 'The Value of Philosophy' and stated that philosophy is the only science that asks 'all' the questions. Refusing to listen means that no learning is possible, and consequently no adaptation or control of the organisation. The information in the form of 'negative feedback', which does not mean 'bad' feedback but corrective information, was disregarded, because accepting it would have meant that management, and not the workers, was responsible for the errors made. In psychology this phenomenon is called a bias; here are three important ones:
Willful Ignorance: the state and practice of ignoring any sensory input that appears to contradict one's inner model of reality. It can entail completely disregarding established facts, evidence and/or reasonable opinions if they fail to meet one's expectations.
Confirmation Bias: the tendency of people to seek out only information that conforms to their pre-existing viewpoints, and subsequently to ignore information that goes against them.
But the most important, and perhaps best-known, bias is:
Cognitive Dissonance:
A distressing mental state that people feel when they find themselves doing things that do not fit with what they know, or holding opinions that do not fit with other opinions they hold. A key assumption is that people want their expectations to meet reality, creating a sense of equilibrium. Likewise, another assumption is that a person will avoid situations or information sources that give rise to feelings of uneasiness, or dissonance.
Sometimes people hold a core belief that is very strong. When they are presented with evidence that works against that belief, the new evidence cannot be accepted, because it would create an extremely uncomfortable feeling called cognitive dissonance. And because it is so important to protect the core belief, they will rationalize, ignore and even deny anything that does not fit with it.
The sane person constantly analyses the world of reality and then changes what’s inside his or her head to fit the facts. That’s an awful lot of trouble for most people. Besides, how many people want to constantly change their opinions to fit the facts?
It is a lot easier to change the facts to fit your opinions. Other people make up their minds and they find the facts to verify their opinions. Or even more commonly, they accept the opinion of the nearest expert and then don’t have to bother about the facts at all.
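The corrective role of 'negative feedback' mentioned earlier can be shown with a minimal sketch of a cybernetic control loop (a hypothetical Python illustration; the function name, gain and numbers are my own, not taken from any real management system):

```python
# Negative feedback: the deviation from the target is fed back with a
# corrective sign, steering the system toward its goal. "Negative" here
# means error-reducing, not "bad news".

def feedback_step(actual, target, gain=0.5):
    """Apply one corrective step, moving `actual` toward `target`."""
    error = target - actual           # the 'correcting information'
    return actual + gain * error      # correction opposes the deviation

state = 0.0       # where the organisation actually is
target = 10.0     # where it wants to be
for _ in range(20):
    state = feedback_step(state, target)

print(round(state, 3))  # prints 10.0
```

An organisation that disregards this feedback is, in effect, running the loop with the correction removed: the error never shrinks, and the deviation persists.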

What does this say about managers who refuse to learn, or even to look at facts that oppose their preconceived (read: biased) opinions? Well, in my opinion (no pun intended), managers should be tested for biases before they are appointed, because such biases inevitably mean that decisions are made on incomplete information. That risk can of course be detrimental to the company's bottom line, and it is a source of increasing disorder which, through the butterfly effect, inevitably spins out of control, rendering the organisation unmanageable and thus in trouble. Such behavior is also very dangerous because it jeopardizes employees, the environment, and share- and stakeholders.

So what to do? Not much, I am afraid. Accepting one's own bias is almost insurmountable for most people, hence our political governance systems, which can't seem to solve anything. Self-reflection distinguishes human beings from animals, so what is needed is to listen to other opinions, to talk and converse, to communicate, and to accept that all information is needed and valuable, just as Russell said. So, as I write this blog, my own reflection tells me that I made a mistake by not first inquiring whether the management I advised would be open to new information. Perhaps I will do that the next time I am asked for advice. In retrospect it is truly amazing how often I have encountered biased behavior, and I am quite pleased that I am now starting to understand it myself. You see, one is never too old to learn.
