Leadership Anti-Patterns – The Great Pretender

Roman Mosaic with Tragedy & Comedy Masks as gargoyles above a water basin

The previous leadership type I covered, the Growler, was hard to classify because it had aspects of both pattern and anti-pattern. The Great Pretender, however, is much easier to label. It’s clearly an anti-pattern.

Before entering the working world full-time, I worked in the retail grocery business (both of my parents also had considerable experience in the industry, retail and wholesale alike). I ran into this type more than once there. The type is distinguished by a lack of domain knowledge and/or experience, coupled with an apparent inability to trust anyone who has either. Consequently, the default method of decision-making appeared to be “whatever someone suggests, do something different”. It was as if someone had mixed impostor syndrome and the Dunning-Kruger effect together and skimmed off the most detrimental parts of each.

I remember one July Fourth holiday when a Great Pretender tripled the order for sliced bread and cut the order for hamburger and hot dog rolls in half. This was based on reading something that said Americans were eating healthier. Unfortunately, that message hadn’t been communicated to the Americans in our community (bear in mind, this was many years ago, and it was July Fourth). We wound up with customers unhappy that we had run out of what they wanted to buy, and with a lot of bread that had to be marked down drastically so that it sold before going out of date. The failure to trust subordinates with the right expertise carries costs.

Another Great Pretender questioned his stock crew when he found them taking a break while a truckload of merchandise remained in the back room. The response, “Do you know how long it takes to get all that put on the shelf?” sent him scurrying away. The crew, who were malingering, had a good laugh.

The combination of lack of knowledge and lack of trust opens the door to an interesting manipulation strategy. When you want something from a Great Pretender, you never ask for it directly. It’s always “Boss, should I do this incredibly inane thing that no one in their right mind would do, or should I do what I actually want to do?” The response from the Great Pretender is “Do that second thing”, every single time. I leave it to you, dear Reader, to use this knowledge only for good, not evil.

The idea that someone in a leadership position should be the best at everything they oversee is a pretty common one. More than once I’ve seen people claim that they have no respect for a leader who can’t do their job better than they can. This attitude, however, fundamentally misunderstands the nature of leadership (hint: effective leadership is more about coordinating the team than doing any one job on the team). It also underestimates the cognitive capacity that would be required to both lead a team involved in even a minimally complicated undertaking and remain the best at every job on it, and it ignores the fact that a leader is responsible for tasks unique to their position. When a team member has this attitude, it can be a problem.

When the leader buys into this attitude, we get the Great Pretender.

Leaders have their own roles and responsibilities to fulfill. Filling them involves dealing with what’s appropriate to the leader’s role and relying on others for what’s appropriate to theirs. This requires communication and collaboration. Micro-managing and insecurity are counter-productive. The best leaders, in my opinion, are those who can recognize talent in others and gather around themselves a team of people with complementary strengths. They’re not the experts, but they are expert at helping a collection of experts come together for a common purpose. That involves placing trust in those being led.

Having to know it all can be fatal.

The Ignorance of Management – Deep and Wide

Iceberg

While I was on LinkedIn a couple of weeks ago, an interesting graphic caught my eye. Titled “The Iceberg of Ignorance”, it referred to a 1989 study in which:

…consultant Sidney Yoshida concluded: “Only 4% of an organization’s front line problems are known by top management, 9% are known by middle management, 74% by supervisors and 100% by employees…”

The metaphor of the iceberg is simple to understand. The implications of these numbers, however, require some unpacking so as to understand the full nature of the problem. Once the problem is better defined, effective solutions can then be proposed.

A naive reading of this would be that the line-level employees know everything and that the higher you travel up the hierarchy, the more out of touch you get. That reading, however, fails on two counts. Firstly, it ignores the qualification of “front line”, which will not apply to all problems faced by the enterprise. Secondly, it fails to account for the fact that while 100% of front line problems will be known by front line employees, that’s not the same as saying that each front line employee will know 100% of front line problems.

It’s a question of cognitive capacity, both depth and width. As organizations grow, the idea that any one person could be aware of each and every detail of the operation becomes laughable, even assuming perfect communication (which is an extreme assumption). This difficulty is compounded by cultural conditioning to expect those in charge to know what’s going on:

Unfortunately, I suspect the vast majority of leaders and managers believe they should have all the answers — even though they couldn’t possibly know everything that’s going on at all levels and in all departments within their organization. And even though the world is changing so quickly that what we know right this second … may not be true and accurate anymore … in this second.

But because we’ve been entrained to have all the right answers, all the time, many of us put on a brave face and pretend we know — particularly when our boss asks us a question, or when a direct-report does. After all, we want to look good. We want to seem “on top of things.”

Pretending to have all the answers is stressful. It’s lonely. It’s draining.

And what if, when we are pretending to know, we give an answer that we later discover is wrong? Yikes! Now what?

In this situation, many people feel forced to “stick to their guns,” even in the face of conflicting evidence. So they wind up suffering from stress, anxiety and fear that they’ll be found out.
They may even hide the “correct answer” to save face, which certainly doesn’t do their conscience — or their company — any good.

Can you see how this need to have all the answers, all the time, can contribute to a culture of assumptions, half-truths and even outright lies?

In this sort of environment, do you think people are connecting deeply and sharing freely? Of course not. They’re competing with one another and hoarding information, because they believe the person with the right answer wins!

This culture of denial, delusion, and deception is how organizations arrive at the extreme situation I discussed in my last post, “Volkswagen and the Cost of Culture”. Casual dishonesty, towards others and yourself, leads to habitual dishonesty. Corruption breeds corruption. Often, it’s not even coldly calculated evil designed to profit, but ad hoc “going along to get along” to avoid consequences – impostor syndrome on an epic scale.

Command and control (the term of art from military science, not the pejorative description for micromanagement) is a subject with a long history. It’s frequently expanded to C3 (command, control, and communications), since communication is an integral component of the discipline. A pair of techniques, with pedigrees going back to the nineteenth and twentieth centuries respectively, have a track record of success.

In his post “Auftragstaktik and fingerspitzengefühl”, Tom Graves describes these techniques:

The terms originate from the German military, from around the early-19thC and mid-20thC respectively. They would translate approximately as:

The crucial point here is to understand that they work as a dynamic pair, to provide a self-updating bridge between strategy, tactics and operations, or, more generally, between plan and action.

In the post, Tom describes these techniques through the example of the air defense system used by Britain in WWII. In this system, information flowed into the command centers from observers and radar installations. The information was combined and contextualized, then sent back out to airfields and anti-aircraft batteries where it was used to repel air attacks. Tom noted that this is often depicted as a linear flow:

Yet describing the Dowding System in this way kinda misses the point – not least that there’s a lot of information coming back from each of the front-line units at the end of that supposed one-way flow. Instead, the key here is that auftragstaktik and fingerspitzengefühl provide a feedback-loop that – unlike classic top-down command-and-control – is able to respond to fast-paced change right down to local level.

The linear-flow description also misses the point that it depends on more than information alone – there are key human elements without which the auftragstaktik / fingerspitzengefühl loop risks fading away into nothingness. For example, auftragstaktik is deeply dependent on trust, which in turn depends on a sense of personal connection and personal, mutual commitment, whilst fingerspitzengefühl depends on a more emotive form of sensing, of feeling, of an often-literal sense of ‘being in touch’ with what’s going on out there in the real-world.

The Dowding system worked by combining effective communication and realistic command & control methods. Bi-directional communication improved situational awareness both up and down the hierarchy, providing detail to the upper layers and context to the lower ones. Realistic command and control can be summarized like this: since no one can possibly have full breadth and depth of knowledge about the situation, provide direction appropriate to your level (in accordance with the directions you received) and delegate to your subordinates the decisions appropriate to their level. In theory, as well as largely in practice, this resulted in decisions being made by those with the best knowledge to do so (again, guided by general direction from above).
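
To make the shape of that loop a bit more concrete for the software-minded, here is a minimal sketch of the idea. It’s entirely mine, not anything from Tom’s post or the actual Dowding system: intent flows down, decisions are made where the local knowledge is, and observations flow back up to refresh the overall picture. The class names, methods, and sample scenario are all hypothetical.

```python
# A rough illustration (my own, not from the post): intent flows down,
# observations flow up, and the next round of direction is based on the
# refreshed picture. All names and the sample scenario are made up.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Commander:
    """Gives direction appropriate to its level and absorbs reports from below."""
    intent: str = "intercept inbound raids; keep the airfields operational"
    picture: List[str] = field(default_factory=list)  # aggregate view built from reports

    def receive_report(self, report: str) -> None:
        # Bottom-up flow: each report adds detail to the overall picture.
        self.picture.append(report)

    def updated_intent(self) -> str:
        # Top-down flow: direction is adjusted from the aggregated picture,
        # not dictated move-by-move.
        if any("airfield hit" in report for report in self.picture):
            return "prioritize defence of the remaining airfields"
        return self.intent


@dataclass
class FieldUnit:
    """Decides locally, guided by intent rather than detailed orders."""
    name: str

    def act(self, intent: str, observation: str) -> str:
        # The decision is made where the knowledge is: local observation
        # combined with the commander's stated intent.
        return f"{self.name}: responding to '{observation}' in line with intent '{intent}'"

    def report(self, observation: str) -> str:
        return f"{self.name} reports: {observation}"


if __name__ == "__main__":
    hq = Commander()
    unit = FieldUnit("11 Group")

    seen = "airfield hit by a low-level raid"
    print(unit.act(hq.intent, seen))        # local decision under the current intent
    hq.receive_report(unit.report(seen))    # detail flows up
    print(hq.updated_intent())              # context flows back down, adjusted
```

The point isn’t the code, of course; it’s that the loop only works when each level trusts the other to handle its part.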

A system that works with reality, rather than against it? What a concept.

Ignorance Isn’t Bliss, Just Good Tactics

Donkey

There’s an old saying about what happens when you assume.

The fast lane to asininity seems to run through the land of hubris. Anshu Sharma’s TechCrunch article, “Why Big Companies Keep Failing: The Stack Fallacy”, illustrates this:

Stack fallacy has caused many companies to attempt to capture new markets and fail spectacularly. When you see a database company thinking apps are easy, or a VM company thinking big data is easy — they are suffering from stack fallacy.

Stack fallacy is the mistaken belief that it is trivial to build the layer above yours.

Why do people fall prey to this fallacy?

The stack fallacy is a result of human nature — we (over) value what we know. In real terms, imagine you work for a large database company and the CEO asks, “Can we compete with Intel or SAP?” Very few people will imagine they can build a computer chip just because they can build relational database software, but because of our familiarity with building blocks of the layer up, it is easy to believe you can build the ERP app. After all, we know tables and workflows.

The bottleneck for success often is not knowledge of the tools, but lack of understanding of the customer needs. Database engineers know almost nothing about what supply chain software customers want or need.

This kind of assumption can cost an ISV a significant amount of money and a lot of goodwill on the part of the customer(s) they attempt to disrupt. Assumptions about the needs of the customer (rather than the customer’s customer) can be even more expensive. The smaller your pool of customers, the more damage that’s likely to result. Absent a captive customer, incorrect assumptions in the world of bespoke software can be particularly costly (even if only in terms of goodwill). Even comprehensive requirements are of little benefit without the knowledge necessary to interpret them.

But, that being said, deep familiarity with a domain carries its own risks.

This would seem to pose a dichotomy: domain knowledge as both something vital and an impediment. In reality, there’s no contradiction. As the old saying goes, “a little knowledge is a dangerous thing”. When we couple that with another cliche, “familiarity breeds contempt”, we wind up with Sharma’s stack fallacy, or as xkcd expressed it:

'Purity' on xkcd.com

In order to create and evolve effective systems, we obviously have a need for domain knowledge. We also have a need to understand that what we possess is not domain knowledge per se, but domain knowledge filtered through (and likely adulterated by) our own experiences and biases. Without that understanding, we risk what Richard Martin described in “The myopia of expertise”:

In the world of hyperspecialism, there is always a danger that we get stuck in the furrows we have ploughed. Digging ever deeper, we fail to pause to scan the skies or peer over the ridge of the trench. We lose context, forgetting the overall geography of the field in which we stand. Our connection to the surrounding region therefore breaks down. We construct our own localised, closed system. Until entropy inevitably has its way. Our system then fails, our specialism suddenly rendered redundant. The expertise we valued so highly has served to narrow and shorten our vision. It has blinded us to potential and opportunity.

The Clean Room pattern on CivicPatterns.org puts it this way:

Most people hate dealing with bureaucracies. You have to jump through lots of seemingly pointless hoops, just for the sake of the system. But the more you’re exposed to it, the more sense it starts to make, and the harder it is to see things through a beginner’s eyes.

So, how do we get those beginner’s eyes? Or, at least, how do we get closer to having a beginner’s eyes?

The first step is to let go of the notion that we already understand the problem space. Lacking innate understanding, we must then do the hard work of determining the architecture of the problem, our context. As Paul Preiss noted, this doesn’t happen at a desk:

Architecture happens in the field, the operating room, the sales floor. Architecture is business technology innovation turned to strategy and then executed in reality. Architecture is reducing the time it takes to produce a barrel of oil, decreasing mortality rates in the hospital, increasing product margin.

Being willing to ask “dumb” questions is just as important. Perception without validation may be just an assumption. Seeing isn’t believing. Seeing, and validating what you’ve seen, is believing.

It’s equally important to understand that validating our assumptions goes beyond just asking for requirements. Stakeholders can be subject to biases and myopic viewpoints as well. While it’s true that Henry Ford’s customers would probably have asked for faster horses, it’s also true that, in a way, that’s exactly what he delivered.

We earn our money best when we learn what’s needed and synthesize those needs into an effective solution. That learning is dependent on communication unimpeded by our pride or prejudice: