Wednesday 12 February 2020

Linear or Non-Linear Management Style? Tank Storage Awards 2020



Research shows that our industry is often managed by linear thinking and actions alone, which makes management styles reactive rather than proactive or preventive. Linear causality means cause-and-effect thinking, but our industry is too complex to manage with an outdated cause-and-effect approach only. Linear management depends on written procedures, guidelines, rules and compliance methods to control risk and manage organisations. Young, highly educated managers are recruited, but they often lack the necessary on-the-job experience, making communication between them and the workers difficult. This often leads to decisions made on incomplete information (the main cause of incidents and accidents).
Non-linear management of complex systems such as our industry uses information as its energy to steer the organisation (system), rather than to control or regulate it. Linear management is risky; let me give you some examples. First: some technical equipment is not working properly. According to the procedure and planning, the next maintenance date is, let's say, in a few weeks. Management is prone to wait for that date because the equipment is not considered critical. Second: operators report a risky situation; at one of the truck loading bays there is no fall protection available, but trucks load anyway. The manager knows about this, but his clients are sending trucks that need to load there and he is unable to suspend loading due to commercial interests. Third, taken from my own experience: an operator makes a mistake, so management sends him to be trained. But the cause was not the operator's mistake. The cause was that he had been sent into the field without proper preparation, so the mistake was not directly caused by him but by the failure of management to do what it was supposed to do: be careful. These three examples have one thing in common: the information was there but was not used, while complexity was misunderstood and therefore overlooked.
What happens here is crucial to understand: information that is available should be used to correct in real time (immediately). Yet real-time corrections are often postponed or not executed at all. From a scientific viewpoint, this is what is happening: the terminal or refinery is spinning out of control. Linear management is simply not enough. Remember the BP oil spill off Louisiana? Same causality: information about possible flaws and broken equipment was ignored. And what if a manager does not understand what the operations are doing? Again, it is a lack of information.
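The difference between scheduled (linear) correction and real-time (cybernetic) correction can be sketched in a few lines of code. This is a toy model, not terminal software: the growth rate, the planned maintenance day and the feedback threshold are invented purely for illustration.

```python
# Toy model: a process deviation that grows by 1 unit per day.
# Linear management waits for the scheduled maintenance date;
# non-linear management corrects as soon as feedback crosses a threshold.

def run(days, correct_when):
    """Return the worst deviation reached under a correction policy."""
    deviation, worst = 0, 0
    for day in range(days):
        deviation += 1                    # the problem keeps growing
        worst = max(worst, deviation)
        if correct_when(day, deviation):  # is the feedback acted upon?
            deviation = 0                 # corrective action resets it
    return worst

# Linear: act only on the planned maintenance date (day 20).
linear_worst = run(30, lambda day, dev: day == 20)

# Non-linear: act whenever the deviation signal exceeds 3.
nonlinear_worst = run(30, lambda day, dev: dev > 3)

print(linear_worst)     # deviation is allowed to reach 21
print(nonlinear_worst)  # deviation never exceeds 4
```

The point of the sketch is simply that when the feedback signal itself triggers the correction, the system stays close to its intended state; when correction waits for the calendar, the deviation compounds in the meantime.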
So we have created a scientifically sound training program for managers and supervisors to learn to manage their operations using non-linear thinking and to understand the uncertainty principle. We teach them information theory, systems thinking and cybernetics, and explain their scientific bases during the course. They learn to look at relationships, at the interdependence of all workers and stakeholders, and at their information and interconnectedness. The program also addresses sustainability issues and psychological biases, and focuses on improving communication, cooperation and conversation. It certainly enhances sustainable, responsible and safer management, because all relevant information is used all the time.
TTT has been training people worldwide for the last 10 years. Our team members worked as operators, sailors, loading masters, captains or managers for many years and are always learning. The risk is that managers believe they already know, and so they stop learning. This behaviour is very risky indeed. Epictetus told us 2,000 years ago that 'a man cannot learn what he thinks he already knows.'

This text was written to enter the 2020 Tank Storage Awards in the category Tank Terminal Optimisation. 


Friday 7 February 2020

Conversation & Risk Management

The beauty of conversation is the exchange of ideas. We may not always agree, but it is the best human method to listen and learn. But what happens when you think you have exchanged ideas and one party nods as if he has listened, but then turns around and does the opposite? I was recently asked to assess a company because human errors were costing money. Management asked me to find so-called 'learning gaps.' I found them, not just at the workers' level but mainly in management. The managers were holding on to a top-down, hierarchical management system that was outdated and was blocking the information feedback flow needed to control, manage and steer the company. So I advised that management be trained and learn a new way of systems thinking and cybernetics.

Despite the proof I offered and the solutions I suggested, these were contemptuously rejected, as if Bertrand Russell had never lived. He wrote 'The Value of Philosophy' and stated that philosophy is the only science that asks 'all' the questions. Refusing to listen means that no learning is possible and, consequently, no adaptation of or control over the organisation. The information in the form of 'negative feedback' (which does not mean 'bad' feedback, but correcting information) was disregarded, because accepting it would have meant that management, and not the workers, was responsible for the errors made. In psychology this phenomenon is called a bias; here are three important ones:
Willful Ignorance: the state and practice of ignoring any sensory input that appears to contradict one's inner model of reality. The practice can entail completely disregarding established facts, evidence and/or reasonable opinions if they fail to meet one's expectations.
Confirmation Bias: the tendency for people to seek out only information that conforms to their pre-existing viewpoints, and subsequently to ignore information that goes against them.
But the most important, and perhaps best-known, bias is:
Cognitive Dissonance: a distressing mental state that people feel when they find themselves doing things that do not fit with what they know, or holding opinions that do not fit with other opinions they hold. A key assumption is that people want their expectations to match reality, creating a sense of equilibrium. Likewise, another assumption is that a person will avoid situations or information sources that give rise to feelings of uneasiness, or dissonance.
Sometimes people hold a core belief that is very strong. When they are presented with evidence that works against that belief, the new evidence cannot be accepted. It would create an extremely uncomfortable feeling called cognitive dissonance. And because it is so important to protect the core belief, they will rationalise, ignore and even deny anything that does not fit with it.
The sane person constantly analyses the world of reality and then changes what’s inside his or her head to fit the facts. That’s an awful lot of trouble for most people. Besides, how many people want to constantly change their opinions to fit the facts?
It is a lot easier to change the facts to fit your opinions. Other people make up their minds and they find the facts to verify their opinions. Or even more commonly, they accept the opinion of the nearest expert and then don’t have to bother about the facts at all.

What does this say about managers who refuse to learn, or even to look at facts that oppose their preconditioned (read: biased) opinions? Well, in my opinion (no pun intended), managers should be tested for biases before they are appointed, because such biases inevitably mean that decisions are made on incomplete information. This risk can of course be detrimental to the company's bottom line, and it is the reason for growing disorder which, through the butterfly effect, inevitably spins out of control, rendering the organisation unmanageable and thus in trouble. Such behaviour is also very dangerous because it jeopardises employees, the environment, shareholders and stakeholders.

So what to do? Not much, I am afraid. Accepting that one is biased is almost insurmountable for most people; hence our political governance systems, which cannot seem to solve anything. Self-reflection distinguishes human beings from animals, so what is needed is to listen to other opinions, to talk and converse, to communicate, and to accept that all information is needed and valuable, just as Russell said. As I write this blog, my own reflection tells me that I made a mistake by not first inquiring whether the management I advised would be open to new information. Perhaps I will do that when I am asked for advice again. In retrospect it is truly amazing how often I have encountered biased behaviour, and I am quite pleased that I am now starting to understand it myself. You see, one is never too old to learn.
