Of Blind Men and Elephants and Excessive Certainty

Blind men and the elephant

There’s an old poem about six blind men and an elephant, in which each in turn declares that an elephant is like a wall, a spear, a snake, a tree, a fan, or a rope. Each accurately described what he was able to discern from his own limited point of view, yet all were wrong about the subject as a whole. As the poet noted:

Moral:

So oft in theologic wars,
The disputants, I ween,
Rail on in utter ignorance
Of what each other mean,
And prate about an Elephant
Not one of them has seen!

Sometimes our attitudes color our perception of others. Management, for example, is often the butt of our disdain, frequently expressed in cartoon form.

However, as Sandro Mancuso related in “Not all managers are stupid”:

I still remember the day when our managers in a large organisation told us we should still go live after we reported a major problem a couple of months before the deadline… There was a problem in a couple of unfinished flows, which would cause hundreds of thousands of trades to be misreported to the regulators. After we explained the situation, managers told us to work harder and go ahead with the release anyway.

How could they tell us to go live in a situation like that? They should all be fired. Arrested. How could they ask us to drop the quality and go live with a known problem of that size?…

More than once we made it clear that focusing our time on getting the system ready for production would not give us any time to finish the automation for the problematic flows, and thousands of trades would be misreported. But they did not listen. Or so we thought.

After a few meetings with the business, we discovered a few things. They were not being irresponsible or stupid, as we developers thought. The deadline was set by the regulators and could not be moved. The cost of not reporting the trades was far higher than misreporting them. Not reporting the trades would not only be followed by heavy fines, but also by possible reputation damage. Companies would have extra time to correct any misreported trades before being fined.

For us, in the development team, it was the first time we realised that going live with a few known issues would be better than not going live at all.

Designing the architecture of a solution is, at its core, an exercise in decision-making. Whether the system in question is a software system or a human system, effective decision-making must be preceded by sense-making: identifying the contexts that must be synthesized into the architecture of the problem.

Bias, being too certain of our understanding to make the effort to validate it, is a good way to miss what’s in front of us. Failing to recognize our potential for bias makes that bias harder to overcome. It restricts our ability to appreciate the full range of contexts to be synthesized and puts us in the same position as the blind men with the elephant.

It’s extremely difficult to solve a problem you don’t understand.


8 thoughts on “Of Blind Men and Elephants and Excessive Certainty”

  1. Sometimes? I think our biases ALWAYS color how we perceive people, things, ideas . . . ok, everything. How we decide to deal with our biases, such as seeking out opinions outside our comfort zone, is critical. Architecture decisions typically can have long-lasting impact across the entire organization. I agree that finding a palette of strategies to understand your biases is important.


    • Yes, I was being charitable by saying sometimes. Always would be much closer to the truth.

      For what it’s worth, I believe the person who says “okay, my initial reaction is to love/hate this, but I’m going to force myself to push through and try to be more objective” is a lot more self-aware than the one who says “I have no biases”.


  2. Another great post Gene, thanks!

    Interesting point in that piece from Sandro Mancuso. I would argue that the main problem here was that no one told the development team why the decisions were made. If so, you can only wonder how much time was wasted on gripes and how many people went to work hugely demotivated because they were asked to work on something they were convinced would end up being the proverbial train wreck.

    Definitely also highlights why for me the rationale on any requirement is at least as important to capture as the requirement itself 🙂

    /U.


    • Thanks! I agree that poor communication is absolutely a factor in the example from Sandro Mancuso. A history of poor communication may be what’s behind the team’s assumption of the worst – biases aren’t necessarily without foundation.

      For what it’s worth, I started out with the idea of making this post equally about bias and communication, but doing so would have made it a lot longer. I’ll be tackling communication in a separate post later, at which time I’ll probably use the link to Sandro’s post again. In my opinion, two of the most likely causes of breakdown in an organization are poor communication and perverse incentives.


