Leadership Patterns and Anti-Patterns – The Growler

Grizzly Bear Attack Illustration

Prior to starting my career in IT (twenty years ago this month…seems like yesterday), I spent a little over eleven years in law enforcement as a Deputy Sheriff. Over those eleven years my assignments ranged from working a shift in the jail (interesting stories), to Assistant Director of the Training Academy, to Personnel Officer (even more interesting stories), and finally to supervisory and management positions (as many headaches as stories). To say that it was as much an education as a job is to put it mildly. I learned useful lessons about human nature and particularly about leadership.

One of the things that I learned is that leadership and management (they are related, but separate things) have patterns and anti-patterns associated with them. Just like in the realm of software development, it can be difficult to distinguish between what’s a pattern and what’s an anti-pattern (there’s an interesting discussion to be found on this topic in the classic “Big Ball of Mud”). Hammering a square peg into a round hole “works”, albeit sub-optimally. Pattern or anti-pattern?

One pattern/anti-pattern from my time with the Sheriff’s Office is what I call “The Growler”. A high-ranking member of the department was a master of this technique. When approached for something, particularly when the something in question was a signature on a purchase requisition, the default response was a profanity-laced growl (the person in question had retired from the Navy as a senior NCO) demanding to know why he should grant the request. This was extremely daunting, but I learned that the correct response was to growl back. When he growled, “%$@$ a !#&^ing $@!#*. More $%&^ing computer stuff, why the @#*& do you need this?”, I would answer, “You know when you ask me a question and I respond in five minutes instead of three hours?”. This would result in a shake of his head, a “Yeah, yeah”, and most importantly, a signature.

More than just an endearing quirk of his character, it was a triage technique. If the person who wanted something tucked tail and ran, it wasn’t important. If, however, the person stood their ground, then he would put forth the effort to make a decision.

Right up front, I should make it clear that I don’t recommend this technique. First and foremost, Human Resources finds “salty” language even less endearing today than they did twenty-five plus years ago, and they weren’t crazy about it then. There’s also a big problem in terms of false negatives.

Most of my coworkers back in my badge-and-gun days were not shy, retiring types. Consequently, I never saw the technique backfire on him. Later on, though, I did see it fail for an IT manager (and yes, while gruff, he was significantly less “salty” than the one at the Sheriff’s Office). This manager had a subordinate who would retreat no matter how valid the need. As a result, that subordinate’s unit, one that several of us were dependent on, was always under-staffed and under-equipped. When his people attended training, it was because someone else had growled back for him. It was far from the optimal situation.

While not quite as bad as the “shoot the messenger” anti-pattern I touched on recently, “The Growler” comes close. By operating on a principle of fear, you can introduce a gap in your communications and intelligence network that you rely on (whether you know it or not) to get the information you need in a timely manner.

Fear encourages avoidance and no news now can be very bad news later.


Learning Organizations – Shooting the Messenger All the Way to the Fuhrerbunker

Unless you’re living under a rock, it’s a near certainty that you’ve seen at least one Downfall parody video (although I hadn’t realized just how long these had been around until I started working on this post…time flies!). There’s a reason why they’ve managed to hang on as a meme as long as they have. The “shoot the messenger” style of management, in spite of all the weight of evidence against it, is still alive and well.

When Tom Cagley and I were recording the Form Follows Function segment for SPaMCast 407, one of Tom’s questions brought to mind the image of Hitler’s delusional ranting in the bunker made famous by these parodies. The subject of the segment was my post “Learning to Deal with the Inevitable”, which deals with the need for a culture of learning in order to deal effectively with the change that has become a constant in our world. Tom keyed in on one point (taken from a talk by Professor Edward Hess that I’d attended): ‘candor, facing the “brutal facts” is essential to a learning culture’. Although his leadership failures pale in significance next to the atrocities he was responsible for, the Hitler portrayed in these clips demonstrates that point vividly.

It should be unnecessary to point out that flying into a rage when given bad news does nothing to change the nature of those events. It is particularly destructive when the bearer of the news is attacked even when blameless for what they’re reporting. Far from helping anything, the temper tantrum ensures that negative information is only delivered when it can no longer be hidden, hampering the ability to react in a timely and effective manner. A vicious circle builds up where delay, spin, and outright deception replace candor.

Delusional, drug-addled dictators can be expected to operate in this manner (thank heavens). The rest of us should aim for better.

It can be inconvenient to have to deal with crises; it’s more inconvenient to find out about them when the situation is unsalvageable. Maturity, humility, and perspective can be difficult character traits to develop, but not as difficult as finding yourself under siege from a world of enemies with only the pathetic dregs of your minions for company.

Innovation, Agility, and the Big Ball of Mud in Meatspace

French infantry in a trench, Verdun 1916

Although the main focus of my blog is application and solution architecture, I sometimes write about process and management issues as well. Conway’s law dictates that the organizational environment strongly influences software systems. While talking with a colleague recently, I stated that I see organizations as systems – social systems operating on “hardware” that’s more complex and less predictable than that which hosts software systems (i.e. people, hence the use of “Meatspace” in the title). The entangling of social and software systems means that we should be aware of the architecture of the enterprise, at least insofar as it will affect the IT architecture of the enterprise.

Innovation and agility are hot topics. Large corporations, by virtue of their very size, are at a disadvantage in this respect. In a recent article for Harvard Business Review, “The Core Incompetencies of the Corporation”, Gary Hamel discussed this issue. Describing corporations as “inertial”, “incremental” and “insipid”, he notes that “As the winds of creative destruction continue to strengthen, these infirmities will become even more debilitating”.

The wonderful thing (at least in my mind) about Twitter is that it makes it very easy for two people in the UK and two people in the US to hold an impromptu discussion of enterprise architecture in general and leadership and management issues in particular (on a Saturday morning, no less). Dan Cresswell started the ball rolling with a quote from Hamel’s article: “most leaders still over-value alignment and conformance and under-value heterodoxy and heresy”. Tom Graves replied that he would suggest that “…heresy is a _necessary_ element of ‘working together’…”. My contribution was that I suspect that “together” is part of the problem; organizations don’t know how to integrate rebels and followers, so the heretics are relegated to a skunkworks or given the sack. Ruth Malan cautioned about limits, noting that “A perpetual devil’s advocate can hold team in perpetual churn; that judgment thing…by which I simply mean, sometimes dissent can be pugnacious, sometimes respectful and sometimes playful; so it depends”.

Ruth’s points re: “…that judgment thing…” and “…it depends” are, in my opinion, extremely important to understanding the issue. I noted that “that judgment thing” was a critical part of management and leadership. This is not in the sense that managers and leaders should be the only ones to exercise judgment, but that they should use their judgment to integrate, rather than eliminate, the heresies so that the organization does not stagnate. There is a need for a “predator”, someone to challenge assumptions, in the management realm as much as there is a need for one in the design and development realm. Likewise, an understanding of “it depends” is key. Neither software systems nor social systems are created and maintained by following a recipe.

While management practices are part of the problem, it’s naive to concentrate on them to the exclusion of all else. Tom Graves is fond of saying, “Things work better when they work together, on purpose”. This is a fundamental point. As he observed in “Dotting the joins (the JEA version)”:

Every enterprise is a system – an ‘ecosystem with purpose’ – constrained mainly by its core vision, values and other drivers. Within that system, everything ultimately connects with everything else, and depends on everything else: if it’s in the system, it’s part of the system, and, by definition, the system can’t operate without it.

The system must be structured to manage, not ignore, complexity. Without an intentional design, things fall through the cracks. Tom again, from the same post:

To do something – to do anything, really – we need to know enough to get it to work right down in the detail of real-world practice. When there’s a lot of detail to learn, or a lot of complexity, we specialise: we choose one part of the problem, one part of the context, and concentrate on that. We get better at doing that one thing; and then better; and better again. And everyone can be a specialist in something – hence, given enough specialists, it seems that between us we should be able to do anything. In that sense, specialisation seems to be the way to get things done – the right way, the only way.

Yet there’s a catch. What specialisation really does is that it clusters all of its attention in one small area, and all but ignores the rest as Somebody Else’s Problem. It makes a dot, somewhere within what was previously a joined-up whole. And then someone else makes their own dot, and someone else carves out a space to claim to make their dot. But there’s nothing to link those dots together, to link between the dots – that’s the problem here.

Hamel’s use of the word “incremental” points the way to diagnosing the problem – enterprises have grown organically, rather than springing to life fully formed. Like a software system that has grown by sticking on bits and pieces without refactoring, social systems can become an example of Foote and Yoder’s “Big Ball of Mud” as well. Uncoordinated changes made without considering the larger system lead to a sclerotic mess, regardless of whether the system in question is social or software. My very first post on this blog, “Like it or not, you have an architecture (in fact, you may have several)”, sums it up. The question is whether that architecture is intentional or not.

Shut up, salute & soldier on?

Yes boss

Leadership and management are currently hot topics, with the #NoManager movement among the hottest of the hot. My detailed opinion on flat organizations/holacracy is a post for another day, but one aspect that I fully agree with is the differentiation between leadership and management. They can and should coincide, but they don’t always. Most importantly, the number of leaders should exceed the number of managers. To re-state a point I made in “Lord of the Repository”, the best managers develop their team members so that the team is never without leadership, even when the manager is away.

Tony DaSilva’s recent post on the subject, “In Defense of Hierarchy”, spoke to some benefits that derive from hierarchies. One of the benefits he identified was “orderly execution of operations”, supported by the following quote:

Imagine if students argued with their teachers, workers challenged their bosses, and drivers ignored traffic cops anytime they asked them to do something they didn’t like. The world would descend into chaos in about five minutes. – Duncan J. Watts

For each of Watts’ examples, his point is, in order, wrong, possibly wrong, and correct. I have experience that speaks to all three.

My career in software development is my second career; my first was in law enforcement, serving as a Deputy in a county Sheriff’s Office. One of the positions I held during my tenure there was Assistant Director of the Training Academy. My role was split between administration, instruction, and supervision of students (limited strictly to their time in training; I didn’t hold a supervisory rank that would apply beyond that). My position was then, and has always been, that any trainer or teacher who cannot tolerate respectful, appropriate challenge is unworthy of their position. A student probing and testing the information being presented was something to be celebrated in my opinion, not discouraged.

An alert went out on the radio one afternoon that there was a fire in one of the housing units of the jail. After a quick run to the location, I found that the supervisors had succeeded in getting the fire knocked down, removing the person who had started it, and detailing staff to evacuate the other inmates to a smoke-free secure area. However, the remaining staff were milling about without protective gear or spare fire extinguishers should the embers flare back up. While waiting for someone with authority, I took it upon myself to direct individuals to get the equipment that was needed. Once someone arrived and assumed control, I headed back to normal duties.

While I was talking about the incident later with a co-worker, they mentioned that they were really shocked when I ordered the Major to go retrieve an air pack and he did so (n.b. the Major in question was the third ranking person in the department and five levels higher than me in the hierarchy). Needless to say, I was just as shocked. I hadn’t been paying attention to much beyond getting the situation safely under control, and the Major hadn’t objected, so I didn’t notice the real-life inversion of control, though my colleague certainly did.

That incident illustrates several things about leadership. First is the point I mentioned above: leadership and management/authority are separate things. I had no official authority, but exercised leadership until someone with authority was in a position to take over. Second is that my unofficial authority rested on the trust and acquiescence of those executing my orders. I would argue that, far from undermining their trust, my openness to challenge in non-emergency situations made them more likely to follow me in the emergency.

So, to return to Watts’ examples – teachers should be challenged (appropriately), cops should be obeyed (until the emergency is in hand), and both you and the boss should be able to flex based on whether the current situation requires a teacher or a cop.

Participative leadership is more likely to engender trust and buy-in. Smart leaders (be they managers, architects, team leads, etc.) aren’t looking for passive followers; they know it could cost them. As Tom Cagley observed in his post “It Takes A Team”:

While a product owner prioritizes and a scrum master facilitates, it takes a whole team to deliver. The whole team is responsible for getting the job done which means that at different times in different situations different members will need to provide leadership. Every team member brings their senses to the project-party, which makes all of them responsible looking for trouble and then helping to resolve it even if there isn’t a scrum master around.

Lord of the Repository

The man on horseback

In Robert “Uncle Bob” Martin’s “Where is the Foreman”, he advocated for a “foreman” with exclusive commit rights who would review each and every potential commit before it made its way into the repository, in the interest of ensuring quality. While I am in sympathy with some of his points, ultimately the idea breaks down for a number of reasons, most particularly in terms of introducing a bottleneck. A single person will only be able to keep up with so many team members, and if a sudden bout of the flu can bring your operation to a standstill, there’s a huge problem.

Unlike Jason Gorman, I believe that egalitarian development teams are not the answer. When everyone is responsible for something, it’s a cliché that nobody takes responsibility for it (the phenomenon even has its own name). However, being responsible for something does not mean dictating. Dictators eventually tend to fall prey to tunnel vision.

Jason Gorman pointed out in a follow-up post, “Why Code Inspections Need To Be Egalitarian”, “You can’t force people, con people, bribe people or blackmail them into caring.” You can, however, help people to understand the reasons behind decisions and participate in the making of those decisions. Understanding and participation are more conducive to ownership and adoption than coercion. Promoting ownership and adoption of values vital to the mission is the essence of leadership.

A recent Tweet from Thomas Cagley illustrates the need for reflective, purposeful leadership.

In my experience, the best leaders exercise their power lightly. It’s less a question of what they can decide and more a question of whether they should decide out of hand. When your philosophy is “I make the decisions”, you make yourself a hostage to presence. Anywhere you’re not, no decision will be made, regardless of how disastrous that lack of action may be. I learned from an old mentor that the mark of a true leader is that they can sleep when they go on vacation. They’re still responsible for what happens, but they’ve equipped their team to respond reasonably to issues rather than to mill about helplessly.

In his follow-up post, “Oh Foreman, Where art Thou?”, Uncle Bob moderated his position a bit, introducing the idea of assistants to help in the reviews and the extension of commit rights to those team members who had proved trustworthy. It’s a better position than the first post, but still a bit too controlling and self-certain. The goal should not be to grow a pack of followers who mimic the alpha wolf, but to grow the predators who snap at your heels. This keeps them, and just as importantly, you, on the path of learning and growth.
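As an aside on mechanics: modern repository tooling makes it fairly easy to spread review responsibility around rather than funnel every commit through one person. Here is a minimal sketch of a GitHub CODEOWNERS file illustrating the idea; the paths and team handles are hypothetical, and how you actually split ownership would depend on your codebase and your people:

```
# Hypothetical CODEOWNERS: review ownership distributed across teams,
# with a senior group as a fallback rather than a single gatekeeper.
# (All paths and team handles below are made up for illustration.)
/billing/   @example-org/billing-team
/web/       @example-org/frontend-team
/infra/     @example-org/platform-team
*           @example-org/senior-reviewers
```

The point isn’t the particular tool; it’s that the structure grows reviewers (and leaders) across the team instead of concentrating all judgment in one head.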

Plans, Planning, and Pivots

There is no magic to planning.

(originally posted on CitizenTekk)

According to Dwight D. Eisenhower, “…plans are useless but planning is indispensable”. How can the production of something “useless” be “indispensable”?

The answer can be found on a banner recently immortalized on Bulldozer00’s blog: “React Less…PLAN MORE!”. Unpacking this is simple – the essence of planning is to decide on responses to events that have yet to occur, without the stress of a time crunch. Gaining time to analyze a response and reducing the emotional aspects should lead to better decisions than ones made on the fly and under pressure. The problem we run into, however, is that reality fails to coincide with our plans for very long.

As Colin Powell (echoing Helmuth von Moltke) observed, “No battle plan survives contact with the enemy”. Detailed, long-term plans can quickly become swamped by complexity as the tree of options branches out. Making assumptions about expected outcomes can prune the number of branches, but each assumption adds the risk that an unexpected event will invalidate the plan. The key is to find a middle ground between operating completely ad hoc on the one hand and having to be Nostradamus on the other.

Planning at the proper scope is one tool to help avoid problems. As noted above, plans with deep detail and long durations are brittle due to complexity and/or the difficulty in making predictions. Like any other view, plans should be more detailed in the foreground and fuzzier in the distance. Much more than a general path to your desired destination will likely turn out to be wasted effort. Only as much planning as promotes success is needed. There’s no magic inherent in planning that justifies a belief that “more equals better”. Fitness for purpose should be the metric, rather than pure quantity of detail.

Another benefit to avoiding useless detail is that it makes it easier to abandon a plan when it no longer makes sense. Humans tend to value that which they’ve invested time in. In execution, commitment is a virtue right up until the point it ceases to be. Hanging on to a plan past that point can be expensive. Having the flexibility to pivot to a new plan can make the difference between success and failure.

Form Follows Function on SPaMCAST

The latest episode (#268) of Tom Cagley’s excellent series of podcasts features an interview with me on the subjects of architecture, process, and management, as well as why I blog. It was not only an honor to be asked, but also a very enjoyable half hour of conversation on subjects near and dear to me – well worth the time it takes to listen to (in my not so humble opinion).

No, Uncle Bob, No – the Obligatory healthcare.gov Post

Good for what ails you?

I tried to avoid this one. First of all, I don’t do politics on this site and this topic has way too much political baggage. Second, a great many people have already written about it, so I didn’t think I really had anything to add.

Then, Uncle Bob Martin chimed in.

I agree with some of what he has to say. I have no doubt that this particular debacle has harmed the image of software development in the eyes of the general public. Then he falls over the edge, comparing the launch of healthcare.gov with the Challenger disaster. After all, in both cases, political considerations overrode technical concerns. Despite that, Bob puts the blame on those far down the ladder:

Perhaps you disagree. Perhaps you think this was a failure of government, or of management. Of course I agree. Government failed and management failed. But government and management don’t know how to build software. We do. We were hired because of that knowledge. And we are expected to use that knowledge to communicate to the managers and administrators who don’t have it.

The thing is, the Centers for Medicare and Medicaid Services (CMS) is both a government agency and the system integrator on the healthcare.gov project. While there’s plenty of evidence of really poor code across the various parts, the integration of those parts is where the project fell down. Had the various contractors hired numerous Bob Martin clones and obtained the cleanest of clean code, the result would have still been the same.

Those with the technical knowledge and experience are, without a doubt, obligated to provide their best advice to the managers and administrators. When those managers and administrators ignore that advice, however, the fault lies with them, not with those whose advice went unheeded.

The end of the post, however, is the worst:

So, if I were in government right now, I’d be thinking about laws to regulate the Software Industry. I’d be thinking about what languages and processes we should force them to use, what auditing should be done, what schooling is necessary, etc. etc. I’d be thinking about passing laws to get this unruly and chaotic industry under some kind of control.

If I were the President right now, I might even be thinking about creating a new Czar or Cabinet position: The Secretary of Software Quality. Someone who could regulate this misbehaving industry upon which so much of our future depends.

Considering that all indications are that the laws and regulations around government purchasing and contracting contributed to this mess, I’m not sure how additional regulation is supposed to fix it. Likewise, it’s a little boneheaded to suggest that those responsible for this debacle (by attempting to manage what they should have known they were unqualified to manage) should now regulate the entire software development industry. For a fact, the very diversity of the industry should make it obvious that a one-size-fits-all mandate would make matters irretrievably worse.

Handing out aspirin to treat Ebola is just bad medicine.

Mixed Signals and Messed-Up Metrics

yes...no...um, maybe?

Dentistry has made me a liar.

One of my tasks each morning is to make sure my youngest son brushes his teeth. Someone, somewhere decided that two minutes of tooth brushing will ensure optimal oral hygiene, a target our dentist has duly passed along to our very bright, but very literal, six-year-old. Every morning when he has thoroughly brushed and I give him the go-ahead to rinse, he asks “Dad, was that two minutes?”, to which I reply “yes, yes it was”, regardless of how long it took. I’m a horrible person, yes, but trying to explain the nuances to him at this age would be a pain on par with trimming my nails with a chainsaw – the lie works out better for all involved.

My daily moral dilemma has a very common source – metrics are frequently signals of a condition, rather than the condition itself. When you cannot use a direct measure (e.g. number of widgets per hour), it’s usual to substitute a proxy that indicates the desired condition (at least that’s the plan). A simplistic choice, however, will lead to metrics that fail to track the condition they stand in for. Two minutes spent brushing well should yield very good results; inadequate brushing, no matter how long you spend doing it, will always be inadequate. This is a prime example of what Seth Godin termed “…measuring what’s easy to measure as opposed to what’s important”.

Examples of these types of measures in software development are well-known:

  • Lines of Code: Rather than productivity, this just measures typing. Given two methods equal in every other way, would the longer one be better?
  • Bug Counts: Not all bugs are created equal. One substantive bug can outweigh thousands of cosmetic ones.
  • Velocity/Turn Time: Features and (again) bugs are not created equal. Complexity, both business and technical, as well as clarity of the problem, tend to have more impact on time to complete than effort expended or size.

As John Sonmez noted in “We Can’t Measure Anything in Software Development”: “We can track the numbers, but we can’t draw any good conclusions from them.”

There are a number of reasons these measures are unreliable. First are the tenuous ties between the measures and what they are meant to represent, as noted above. Second is the phenomenon known as Goodhart’s law: “When a measure becomes a target, it ceases to be a good measure”. In essence, when people know that a certain number is wanted/expected, the system will be gamed to achieve that number. Most important, however, is that value is the desired result, not effort. In manufacturing, more widgets per hour means greater profits (assuming sufficient demand). In software development, excess production is more likely to yield excess risk.

None of this is to suggest that metrics, including those above, are useless. What is important, however, is not the number itself, but what the number signals (particularly when metrics are combined). Increasing lines of code over time, coupled with increasing bug counts and/or decreasing velocity, may signal increased complexity or technical debt in your codebase, allowing you to investigate before things reach critical mass. Capturing the numbers to use as an early warning mechanism will bear much more fruit than using them as a management target, where they tend to become just a lie we tell ourselves and others.
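To make the early-warning idea a little more concrete, here is a minimal sketch in Python of combining metric trends into prompts for investigation rather than targets to hit. The field names, thresholds, and numbers are purely illustrative assumptions, not a prescription:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SprintMetrics:
    lines_of_code: int   # total LOC in the codebase at sprint end (illustrative)
    open_bugs: int       # open defect count at sprint end (illustrative)
    velocity: float      # completed story points for the sprint (illustrative)

def trend(values: List[float]) -> float:
    """Average change per sprint over the observed window."""
    if len(values) < 2:
        return 0.0
    return (values[-1] - values[0]) / (len(values) - 1)

def warning_signals(history: List[SprintMetrics]) -> List[str]:
    """Flag combinations of trends worth investigating; these are
    conversation starters, not grades for the team."""
    loc_trend = trend([m.lines_of_code for m in history])
    bug_trend = trend([m.open_bugs for m in history])
    vel_trend = trend([m.velocity for m in history])

    signals = []
    if loc_trend > 0 and bug_trend > 0:
        signals.append("Code size and open bugs both rising: possible growing complexity/technical debt.")
    if loc_trend > 0 and vel_trend < 0:
        signals.append("Code size rising while velocity falls: changes may be getting harder to make.")
    return signals

# Hypothetical data for three sprints
history = [
    SprintMetrics(120_000, 40, 21.0),
    SprintMetrics(126_000, 48, 19.0),
    SprintMetrics(133_000, 55, 16.5),
]
for signal in warning_signals(history):
    print(signal)
```

Used this way, the output is a prompt to go ask “why?”, not a scorecard to be hit.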