Systems Thinking Complicates Things

4th UK Rock Paper Scissors Championships by James Bamber via Wikimedia

 

I’ve had the honor and pleasure of appearing as a regular on Tom Cagley‘s SPaMCast podcast for almost three years now. Before I write one of my “Form Follows Function on SPaMCast x” posts, I always listen to the episode to make sure the summary is accurate (the implication being that relying purely on my memory wouldn’t be). I got a bonus while writing up last week’s appearance, because Tom asked an excellent question that deserved its own post: does thinking about a problem (legacy systems, in the case of last week’s discussion) holistically/systemically complicate things?

Abso-freakin’-lutely.

It is much easier to ignore all the twists, turns, and possibilities inherent in systems thinking. A simpler approach, picking one lever to pull or one button to push, yields a solution far more quickly.

It just doesn’t work very well at coming up with solutions that actually work.

When there is a complexity mismatch between the problem and solution architectures, that mismatch becomes an additional problem to deal with. This applies both when the solution is more complex than the problem space warrants and when it is less so. Solutions that fail to account for the context they will encounter are vulnerable. This is the idea behind the quote attributed to Albert Einstein: “Everything should be made as simple as possible, but not simpler.”

Human nature can push us to fix problems quickly, and quick will generally equate to simple. It takes time to analyze the angles and consider the alternatives. How often have you seen people ask for “the best” way to do something, absent any context? How often have you seen people ask “why would someone ever do that?”

I’ll answer that by asking three questions:

  • since Rock beats Scissors, why would anyone ever choose Scissors?
  • since Paper beats Rock, why would anyone ever choose Rock?
  • since Scissors beats Paper, why would anyone ever choose Paper?

Reality isn’t binary. It’s not about what’s “best”; it’s about what’s fit for purpose in a given context, and there are lots and lots of contexts out there.
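To make the intransitivity concrete, here’s a minimal sketch (Python, purely illustrative): every move beats one alternative and loses to another, so no move is “best” absent the context of what the opponent plays.

```python
# Intransitivity: each move beats one move and loses to another.
BEATS = {"Rock": "Scissors", "Paper": "Rock", "Scissors": "Paper"}

def winner(a, b):
    """Return the winning move, or 'Draw' if the moves match."""
    if a == b:
        return "Draw"
    return a if BEATS[a] == b else b

# No move dominates: each wins exactly one matchup and loses another.
for move in BEATS:
    counter = next(m for m in BEATS if BEATS[m] == move)
    print(f"{move} beats {BEATS[move]}, but loses to {counter}")
```

“Best” only has meaning once you know what you’re playing against.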

This isn’t to say that all quick, simple interventions are wrong. If you find yourself in a house fire, more action and less comprehensive deliberation may well be in order. The key is matching the cost (largely in terms of time) of defining the problem space with the cost (in terms of both effort and the risk the intervention adds to the problem) of crafting the solution.

Rock, Paper, Scissors, Lizard, Spock rules diagram

It’s almost guaranteed that the system contexts we deal with (both technical and social) will evolve toward more and more complexity. Surprises will emerge as a matter of course. We don’t need to create more of them by failing to take a more holistic view when we have the time to do so.


Innovation in Inner Space

KGL dragoons at the Battle of Garcia Hernandez

 

Long-time readers know that I have a rather varied set of interests and that I’ve got a “thing” for history, particularly military history. Knowing that, it shouldn’t come as a surprise that I was recently reading an article titled “Cyber is the fourth dimension of war” (ground, sea and air being the first three dimensions). It’s not a bad article, but it is mistaken. Cyberwar is the fifth dimension of war. The first dimension, today and for all of time past, is the human mind. Contests are won or lost, not on some field of battle, virtual or physical, but in the minds of the combatants. For example, if you believe you’ve lost, then you have.

The painting shown above illustrates this nicely. During the Napoleonic period, infantry charged by cavalry would form a square, presenting a hedge of bayonets on all sides. Horses, being intelligent creatures, will not impale themselves on pointy things, so the formation protected the infantry, who were free to fire at the encircling cavalry. Charging disciplined, unbroken infantry was a losing proposition for cavalry under almost all circumstances. Note the use of “almost”.

At the Battle of García Hernández, July 1812, something unusual happened. One French formation was late in firing, and a wounded horse ran blindly into the square, breaking it up. The attacking British (Hanoverian, to be precise) cavalry rode into the gap and forced the surrender of the French infantry who had formed it. That much was simply a matter of physics. However, two further squares broke up when charged, owing to the effect the fate of the first had on their morale. Believing they were beaten, they failed to maintain cohesion, and their anticipated defeat became a reality.

So, what’s the point?

Greger Wikstrand and I have been trading posts on the topic of innovation since late 2015. Greger’s latest, “Spring clean your mind”, deals with the concepts of infowar and propaganda (aka “fake news”). This is another example of what Greger’s written about in the past, a concept he dubbed black hat innovation: “Whenever there is innovation or invention there is also misuse”.

Whether you call it black hat innovation or abuse cases (my term), it’s a concept we need to be aware of. It’s a concept that affects us, not just as technologists, but as ordinary human beings. We need to be aware of the potential for active abuse. We also need to be aware of the potential for problems caused by the very things that make our lives more convenient or more pleasant.

This isn’t to say that Facebook is some evil empire, but we do need to bear some responsibility for not allowing ourselves to become trapped in an echo chamber.

It’s something we need to take responsibility for. We can’t hope for a technological deus ex machina to bail us out. As Tim Bass recently noted on his Cyberspace Event Processing Blog:

The big “AI” processing “pie in the sky” plan for cyber defense we all read about is not going to work “as advertised” because we cannot program machines to solve problems that we cannot solve ourselves. There is no substitute for the advancement and development of the human mind to solve complex problems. Delegating the task of “thinking” to machines is doomed to fail, and fail “big time”. It seems like humanity has, in a manner of speaking, “given up” on humans developing the intelligence to manage and defend cyberspace, so they have decided to turn it all over to machines.

Wrong approach!

The right approach, in my opinion, is to be intentional and active in learning. Consuming information should not be a matter of sitting back and shoveling it in, but one of filtering, testing, and appraising. How much time do you spend reading viewpoints you absolutely disagree with? How much time do you spend actively seeking out information rather than waiting for it to come to you?

In 1645, looking back on a long and successful career as a samurai, in which a single loss often meant death, Miyamoto Musashi concluded that although rigorous sword practice was essential, it wasn’t enough. At the end of the first chapter of A Book of Five Rings, he also admonishes aspiring warriors to “Cultivate a wide variety of interests in the arts” and “Be knowledgeable in a wide variety of occupations.”

Similarly, Boyd, who was a keen student of Musashi, described his method as looking across a wide variety of fields (he called them “domains”), searching for underlying principles, “invariants.” He would then experiment with syntheses involving these principles until he evolved a solution to the problem he was working on. Because they involved bits and pieces from a variety of domains, he called these syntheses “snowmobiles” (skis, handlebars from a bicycle, etc.).

 

Perception is critical. We are made or unmade, less by our circumstances and more by our perception of them. Companies that have suffered disruptions have done so not because they were unable to respond, but because they either believed themselves invulnerable or believed themselves incapable. Likewise, as individuals, we have control over what information we expose ourselves to and how we manage that exposure.

Sense-making is a critical skill that requires active involvement. The passive get passed by.

[Painting of the battle of Garcia Hernandez by Adolf Northen, housed in the Landesmuseum Hannover. Photo by Michael Ritter via Wikimedia Commons]

Stopping Accidental Technical Debt

Buster Keaton looking at a poorly constructed house

In one of my earlier posts about technical debt, I differentiated between intentional debt (that taken on deliberately and purposefully) and accidental debt (that which just accrues over time without rhyme, reason, or record). Dealing with technical debt (in the sense of evaluating, tracking, and resolving it) is obviously a consideration for someone in an application architect role. While someone in that role absolutely should be aware of the intentional debt, is there a way to be more attuned to the accidental debt as well?

Last summer, I published a post titled “Distance…is the one true enemy…”. The post started with a group of tweets from Gregory Brown talking about the corrosive effects of distance on software development (distance between compile and run, between failure and correction, between development and feedback, etc.). I then extended the concept to management, talking about how distance between sense-maker and decision-maker could negatively affect the quality of the decisions being made.

There’s also a distance that neither Greg nor I covered at the time: design distance, the distance between the design and the outcome. Reducing design distance makes it easier to keep a handle on the accidental debt as well as the intentional.

Distance between the architectural decisions and the implementation can introduce technical debt. This distance can come from remote decision-makers, architecture pigeons who swoop in, deposit their “wisdom”, and then fly away home. It can come from failing to communicate the design considerations effectively across the entire team. It can also come from failing to monitor the system as it evolves. The design and the implementation need to be in alignment. Even more so, the design and the implementation need to align with the particular problems to be solved/jobs to be done. Otherwise, the result may look like the poorly constructed house Buster Keaton is surveying in the image above.

Distance between development of the system and keeping the system running can introduce technical debt as well. The platform a system runs on is a vital part of the system, as critical as the code it supports. As with the code, the design, implementation, and context all need to be kept in alignment.

Alignment of design, implementation, and context can only be maintained by on-going architectural assessment. Stefan Dreverman’s “Using Philosophy in IT architecture” identified four questions to be asked as part of an assessment:

  1. “What is my purpose?”
  2. “What am I composed of?”
  3. “What’s in my environment?”
  4. “What do I communicate?”

These questions are applicable not only at the beginning of a system, but throughout its life-cycle. Failing to re-evaluate the architecture as a whole as the system evolves can lead to inconsistencies as design distance grows. We can get so busy dealing with the present that we create a future of pain.

At first glance, this approach might seem expensive, but rewriting legacy systems is expensive as well (assuming the rewrite would be successful, which is a tenuous assumption). Building applications with a one-and-done mindset is effectively building a legacy system.
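One way to keep that ongoing re-evaluation from being crowded out by day-to-day work is to encode parts of it as recurring, automated checks (in the spirit of what are often called architectural fitness functions). The sketch below is purely illustrative: the module names and the layering rule are hypothetical stand-ins for whatever answers your own assessment produces.

```python
# Illustrative sketch: an automated check approximating the question
# "What am I composed of?". The layering rule below is hypothetical.
ALLOWED_DEPENDENCIES = {
    "web": {"services"},
    "services": {"domain"},
    "domain": set(),  # the core should depend on nothing above it
}

def check_composition(actual_deps):
    """Flag dependencies that violate the intended layering."""
    violations = []
    for module, deps in actual_deps.items():
        allowed = ALLOWED_DEPENDENCIES.get(module, set())
        for dep in deps - allowed:
            violations.append(f"{module} -> {dep} is not an allowed dependency")
    return violations

# Run on every build so drift (design distance) surfaces early.
print(check_composition({"web": {"services", "domain"}}))
# ['web -> domain is not an allowed dependency']
```

A check like this doesn’t replace the assessment; it just keeps one slice of it from silently going stale between reviews.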

Square Pegs, Round Holes, and Silver Bullets

Werewolf

People like easy answers.

Why spend time analyzing and evaluating when you can just take some thing or some technique that someone else has already put to use and be done with it?

Why indeed?

I mean, “me too” is a valid strategy, right?

And we don’t want people to get off message, right?

And we can always find a low cost, minimal disruption way of dealing with issues, right?

I mean, after all, we’ve got data and algorithms, and stuff…

The thing is, actions need to make sense in context. Striking a match is probably a good idea in the dark, less so in daylight. In the presence of gasoline fumes, it’s a bad idea regardless of ambient light.

A recent post on Medium, “Design Sprints Are Snake Oil”, is a good example. Erika Hall’s title was a bit click-baitish, but as she responded to one commenter:

The point is that the original snake oil was legitimate and effective. It ended up with a bad reputation from copycats who over-promised results under the same name while missing the essential ingredients.

Sprints are legitimate and effective. And now there is a lot of follow-up hype treating them as a panacea and a replacement for other types of work.

Good things (techniques, technologies, strategies, etc.) are “good”, not because they are innately right, but because they fit the context of the situation at hand. Those that don’t fit, cease being “good” for that very reason. Form absent function is just a facade. Whether it’s business strategy, management technique, innovation efforts, or process, there is no recipe. The hard work to match the action with the context has to be done.

Imitation might be the sincerest form of flattery, but it’s a really poor substitute for strategy.

Managing Fast and Slow

Tortoise and Hare Illustration

People have a complicated relationship with the concept of cause and effect. In spite of the old saying about the insanity of doing the same old thing while looking for a different result, we hope against hope that this time it will work. Sometimes we inject unnecessary complexity into what should be very simple tasks; other times we over-simplify, looking for shortcuts to success. Greger Wikstrand recently spoke to one aspect of this in his post “Cargo cult innovation, play buzzword bingo to spot it” (part of our ongoing conversation on innovation):

I am not saying that there is no basis of truth in what they say. The problem is that innovation is much more complex than they would have you believe. If you fall for the siren song of cargo cult innovationism, you will have all the effort and all the trouble of real innovation work but you will have none of the benefits.

I ran across an interesting example of this kind of simplistic thought not long ago in a Forbes article by Bill Fischer, titled “The Death of Strategy”:

Strategy is dead!

Or, is it tactics?

In a world of never-ending change, it’s either one or the other; we can no longer count on having both. As innovation accelerates its assault on what we formerly referred to as “our planning process,” and as S-curves accordingly collapse, each one on top another, time is compressed. In the rubble of what is left of our strategy structure, we find that what we’ve lost is the orderly and measured progression of time. Tim Brown, of IDEO, recently put it this way at the Global Peter Drucker Forum 2016, in Vienna: “So many things that used to have a beginning, a middle and an end, no longer have a middle or an end.” Which is gone: strategy or tactics? And, does it matter?

Without a proper middle, or end, for any initiative, the distinction between strategy and tactics blurs: tactics become strategy, especially if they are performed in a coherent and consistent fashion. Strategy, in turn, now takes place in the moment, in the form of an agglomeration of a series (or not) of tactics.

The pace of change certainly feels faster than ever before (I’m curious, though, as to when the world has not been one of “never-ending change”). However, that nugget of truth is wrapped in layers of fallacy and a huge misunderstanding of the definitions of “tactics” and “strategy”. “Tactical and Strategic Interdependence”, a commentary from the Clausewitzian viewpoint, contrasts the terms in this manner:

Both strategy and tactics depend on combat, but, and this is their essential difference, they differ in their specific connection to it. Tactics are considered “the formation and conduct of these single combats in themselves” while strategy is “the combination of them with one another, with a view to the ultimate object of the War.”[8] Through the notion of combat we begin to see the differentiation forming between tactics and strategy. Tactics deals with the discrete employment of a single combat, while strategy handles their multiplicity and interdependence. Still we need a rigorous conception. Clausewitz strictly defines “tactics [as] the theory of the use of military forces in combat,” while “Strategy is the theory of the use of combats for the object of the War.”[9] These definitions highlight the difference between the means and ends of tactics and strategy. Tactics considers the permutations of military forces, strategy the combinations of combats, actual and possible.

In other words, tactics are the day-to-day methods you use to do things. Strategy is how you achieve your long-term goals through the things you do. Tactics without strategy is a pile of bricks without an idea of what you’re going to build. Strategy without tactics is an idea of what to build without a clue as to how you’d build it.

Fischer is correct that strategy executed is the “…agglomeration of a series (or not) of tactics”, but his contention that it “…now takes place in the moment…” is suspect, predicated as it is on the idea that things suddenly lack “…a proper middle or end…”. I would argue that any notion of a middle or end determined in advance, rather than retroactively, is an artificial one. Furthermore, the idea that there are no more endings due to the pace of change is more than a little ludicrous. If anything, the faster the pace, the more likely endings become, as those who can’t keep up drop out. Best of all is the line “…tactics become strategy, especially if they are performed in a coherent and consistent fashion”. Tactics performed in “…a coherent and consistent fashion” is pretty much the definition of executing a strategy (negating the premise of the article).

Flailing around without direction will not result in innovation, no matter how fast you flail. While change is inevitable, innovation is not. Innovating, making “significant positive change”, is not a matter of doing a lot of things fast and hoping for the best. Breakthroughs may occasionally be “happy accidents”, but even then they are generally ones where intentional effort has been expended toward making them likely.

In today’s business environment, organizations must be moving forward just to maintain the status quo, much less innovate. This requires knowing where you are, where you’re headed, and what obstacles you’re likely to face. This assessment of your operating context is known as situational awareness. It’s not simple, because your context isn’t simple. It’s not a recipe, because your context is ever-changing. It’s not a product you can buy nor a project you can finish and be done with. It’s an ongoing, deliberate process of making sense of your context and reacting accordingly.

Situational awareness exists on multiple levels, tactical through strategic. While the pace of change is high, the relative pace between the tactical and strategic is still one of faster and slower. Adjustments to strategic goals may come more frequently, but daily changes in long-term goals would be a red flag. Not having any long-term goals would be another. Very specific, very static long-range plans are probably wasted effort, but having some idea of what you’ll be doing twelve months down the road is a healthy sign.

Situational Awareness – Where does it begin? Where does it end?

Infinity symbol

Situational awareness, according to Wikipedia, is defined as “…the perception of environmental elements and events with respect to time or space, the comprehension of their meaning, and the projection of their status after some variable has changed, such as time, or some other variable, such as a predetermined event”. In other words, it’s having a handle on what currently is and what is about to happen. It’s a concept that is invaluable to a wide range of interests, particularly management/leadership, architectural design, and innovation. It’s a concept that crosses levels, from tactical to strategic. Just as socio-technical systems architectures exist in a fractal space (application to solution to enterprise), so too does the concept of situational awareness. As such, it’s a common theme for this site, particularly over the last year or so.

The OODA (Observe-Orient-Decide-Act) Loop, developed by Air Force Colonel John Boyd, is a framework for decision-making that explicitly incorporates situational awareness:

OODA Loop Diagram

Coupling sense-making with decision-making is critical to achieving a balance of both speed and effectiveness. In my opinion, acting without taking the state of the environment into account is a recipe for disaster. Equally important (likewise, in my opinion) is understanding the dynamic nature of situational awareness. As Boyd’s diagram above shows, it’s not a linear process. Additionally, the very nature of a loop should convey that there’s neither beginning nor end. This is a key concept.
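To my mind, the best way to see that non-linearity is to treat OODA as a continuous control loop rather than a four-step procedure. The skeleton below is my own sketch (not Boyd’s formulation, and the sense/policy/act functions are hypothetical placeholders): orientation accumulates across iterations, every decision draws on the whole model rather than just the latest observation, and acting changes the environment that will be observed next.

```python
import time

# A bare-bones sketch of a continuous OODA-style control loop.
def ooda_loop(sense, policy, act, interval_seconds=1.0):
    orientation = {}                      # accumulated model of the situation
    while True:                           # a loop: no beginning, no end
        observation = sense()             # Observe the (already past) state
        orientation.update(observation)   # Orient: fold new data into the model
        action = policy(orientation)      # Decide, based on the whole model
        act(action)                       # Act, changing what comes next
        time.sleep(interval_seconds)
```

Note that the loop never “finishes”; stopping it is equivalent to making the last decision you will ever make, which is exactly the point of the comment discussed below.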

One of the sites that I follow is Slightly East of New, which is run by an associate of Boyd’s and dedicated to his theories. A recent post on that site, “The magic of the OODA loop”, related a paragraph from a sci-fi novel, The Apocalypse Codex, that referred to OODA:

Observe, orient, decide, act: words to live or die by. Right now, Persephone is disoriented — on the run, cut off. It’s time to go on the offensive, work out where she is and what’s going on, then get the hell out of this trap.

It was an interesting post, but nothing noteworthy, until I got to this comment:

I find the phrase, “…on the run, cut off.” very interesting, within the context of “disoriented”. To me, “on the run” mean a decision has been made and acted on, whereas “disorientation” usually means that one can’t make a decision.
Likewise, “cut off” is the position you find yourself in, after all the decisions have been made and, after thinking about it, it is the posture you observe yourself to be in.
In other words, on the run and cut off is not really a disorientation, but a reality.
So, while you may not survive, you have made a decision to run or you are about to make a decision and join the otherside.
I suppose it just depends on where those words show up in the narrative, as to if you made the decision or your competitor made the decision for you.

I may be over-sensitive to the phrasing, but “…decision has been made and acted on…” and “…after all the decisions have been made…” strike me as being too static and too linear. Every action or inaction follows from a decision or indecision. The point “…after all the decisions have been made…” is terminal (for the person who has made all the decisions they will make). In my opinion, it is key to bear in mind that the clock is always running and that the reality being processed is already past. Too much attention to the state of what is (or rather, was) takes away from the more important task of getting to a better “to be” state. Additionally, decisions and contexts should be thought of as not just linear, but fractal (i.e., having multiple levels from tactical through strategic) as well.

Loops that have an end are no longer loops. Likewise, we have to strike a balance between focusing on just what’s relevant (too much context/backstory can cause information overload) and trimming away necessary context.

Actively thinking about sense-making and decision-making can seem overly academic. The activities are so foundational to nearly everything that they can feel instinctual rather than learned. I suspect that’s a case of “familiarity breeds contempt”. Depending on the application, contempt for developing the best possible situational awareness could be fatal.

[OODA Loop diagram by Patrick Edwin Moran via Wikimedia Commons]

Learning Organizations: When Wrens Take Down Wolfpacks

A Women's Royal Naval Service plotter at work in the Operations Room at Derby House in Liverpool, the headquarters of the Commander-in-Chief Western Approaches, September 1944.

What does the World War II naval campaign known as the Battle of the Atlantic have to do with learning and innovation?

Quite a lot, as it turns out. Early in the war, Britain found itself in a precarious position. While being an island nation provided defensive advantages, it also came with logistical challenges. Food, armaments, and other vital supplies, as well as reinforcements, had to come by sea. The shipping lanes were heavily threatened, primarily by the German U-boat (submarine) fleet. With Britain needing more than a million tons of imports per week, maintaining the flow of goods was a matter of survival.

Businesses may not have to worry about literal torpedoes severing their lifelines, but they are at risk from a number of factors. Whether it’s changing technology or tastes, competitive pressures, or even criminal activity, organizations cannot afford to sit idle. In his post “Heraclitus was wrong about innovation”, Greger Wikstrand talked about the mismatch between the speed of change (high) and the rate of innovation (not fast enough). This is a recurrent theme in our ongoing discussion of innovation (we’ve been trading posts on the subject for over a year now).

The British response to the threat involved many facets, but an article I saw yesterday about one response in particular struck a chord. “The Wargaming “Wrens” of the Western Approaches Tactical Unit” told the story of a group of officers and ratings of the Women’s Royal Naval Service (nicknamed “Wrens”) who, under the command of a naval officer, Captain Gilbert Roberts, revolutionized British anti-submarine warfare (ASW). Their mandate was to “…explore and evaluate new tactics and then to pass them on to escort captains in a dedicated ASW course”.

Using simulation (wargaming) to develop and improve tactics was an unorthodox proposition, particularly in the eyes of Admiral Percy Noble, who was responsible for Britain’s shipping lifeline. However, Admiral Noble was capable of appreciating the value of unorthodox methods:

A sceptical Sir Percy Noble arrived with his staff the next day and watched as the team worked through a series of attacks on convoy HG.76. As Roberts described the logic behind their assumptions about the tactics being used by the U-Boats and demonstrated the counter move, one that Wren Officer Laidlaw had mischievously named Raspberry, Sir Percy changed his view of the unit. From now on the WATU would be regular visitors to the Operations Room and all escort officers were expected to attend the course.

Each of the courses looked at ASW and surface attacks on a convoy and the students were encouraged to take part in the wargames that evaluated potential new tactics. Raspbery was soon followed by Strawberry, Goosebery and Pineapple and as the RN went over to the offensive, the tactical priority shifted to hunting and killing U Boats. Roberts continued as Director of WATU but was also appointed as Assistant Chief of Staff Intelligence at Western Approaches Command.

This type of learning culture, such as I described in “Learning to Deal with the Inevitable”, was key to winning the naval war. Clinging to tradition would have led to a fatal inertia.

One aspect of the WATU approach that I find particularly interesting is the use of simulation to limit risk during learning. Experiments involving real ships cost real lives when they don’t pan out. Simulation (assuming sufficient validity of the theoretical underpinnings of the model used) is a technique that can be used to explore more options without sending costs through the roof.
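The same principle translates directly to modern practice. As a toy illustration (my own sketch; the loss probabilities are invented for the example, not historical), a few lines of Monte Carlo simulation can compare a current tactic against a proposed one over thousands of simulated convoys at essentially zero cost:

```python
import random

# Toy Monte Carlo comparison of two hypothetical convoy-escort tactics.
# The per-ship loss probabilities are invented purely for illustration.
def convoy_losses(p_loss_per_ship, ships=40):
    """Simulate one convoy crossing; return the number of ships lost."""
    return sum(random.random() < p_loss_per_ship for _ in range(ships))

def average_losses(p_loss_per_ship, trials=10_000):
    return sum(convoy_losses(p_loss_per_ship) for _ in range(trials)) / trials

print("current tactic :", average_losses(0.05))   # ~2.0 ships per convoy
print("proposed tactic:", average_losses(0.03))   # ~1.2 ships per convoy
```

The numbers are made up; the point is that a model, if its underpinnings are sound, lets you probe a tactic’s weaknesses before betting real ships (or real systems) on it.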

Strategic Tunnel Vision

Mouth of a Tunnel

 

Change and innovation are topics that have been prominent on this blog over the last year. In fact, Greger Wikstrand and I have traded a total of twenty-six posts (twenty-seven counting this one) on the subject.

Greger’s last post, “Successful digitization requires focus on the entire customer experience – not just a neat app” (it’s in Swedish, but it translates well to English), discussed the critical nature of customer experience to digital innovation. According to Greger, without taking customer experience into account:

You can make the world’s best app without gaining more, more satisfied, or more profitable customers. It’s like trying to make a boring game more exciting by spraying gold paint on the playing pieces.

Change and innovation are not the same thing. Change is inevitable; innovation is not (with a h/t to Tom Cagley for that quote). As Greger pointed out in his latest article, to get improved customer experience, you need depth. Sprinkling digital fairy dust over something is not likely to result in innovation. New and different can be really great, but new and different solely for the sake of new and different doesn’t win the prize. Context is critical.

If you’ve read more than a couple of my posts, you’ve probably realized that among my rather varied interests, history is a major one. I lean heavily on military history in particular when discussing innovation. This post won’t break with that tradition.

The blog Defense in Depth, operated by the Defence Studies Department, King’s College London, has published two posts this week dealing with the Suez Crisis of 1956, primarily in terms of the Anglo-French forces. One deals with the land operations and the other with naval operations. They struck a chord because they both illustrated how an overreaction to change can have drastic consequences from the strategic level down to the tactical.

Buying into a fad can be extremely expensive.

The advent of the nuclear age at the end of World War II dramatically transformed military and political thought. The atomic bomb was the ultimate game-changer in that respect. In time-honored tradition, the response was overreaction. “Atomic” was the “digital” of the late 40s into the 60s. They even developed a recoilless gun that could launch a 50-pound nuclear warhead 1.25 to 2.5 miles. “Move fast and break things” was serious business back in the day.

This extreme focus on what had changed, however, led to a rather common problem, tunnel vision. Nuclear capability became such an overarching consideration that other capabilities were neglected. Due to this neglect of more conventional capabilities, the UK’s forces were seriously hampered in their ability to perform their mission effectively. Misguided thinking at the strategic level affected operations all the way down to the lowest tactical formations.

It’s easy to imagine present-day IT scenarios that fall prey to the same issues. A cloud or digital initiative given top priority without regard to maintaining necessary capabilities could easily wind up failing in a costly manner and impairing the existing capability. It’s important to understand that time, money, and attention are finite resources. Adding capability requires increasing the resources available for it, either through adding new resources or freeing up existing ones by reducing the commitment to less important capabilities. If there is no real appreciation of what capabilities exist and what the relative value of each is, making this decision becomes a shot in the dark.

Situational awareness across all levels is required. To be effective, that awareness must integrate changes to the context while not losing sight of what already was. Otherwise, to use a metaphor from my high school football days, you risk acting like a “blind dog in a meat-packing plant”.

Form Follows Function on SPaMCast 411

SPaMCAST logo

This week’s episode of Tom Cagley’s Software Process and Measurement (SPaMCast) podcast, number 411, features Tom’s essay on Servant Leadership (which I highly recommend), John Quigley on managing requirements as a part of product management, a Form Follows Function installment based on my post “Organizations as Systems – ‘Uneasy Lies the Head that Wears the Crown'”, and Kim Pries on software craftsmanship.

Tom and I discuss the danger of trying to use simplistic explanations for the interactions that make up complex human systems. No one has the power to force things in a particular direction; rather, the direction comes about as a result of the actions and interactions of everyone involved. It might be comforting to believe that there’s one single lever for change, but it’s wrong.

You can find all my SPaMCast episodes under the SPaMCast Appearances category on this blog. Enjoy!

Leadership Patterns and Anti-Patterns – The Growler

Grizzly Bear Attack Illustration

Prior to starting my career in IT (twenty years ago this month…seems like yesterday), I spent a little over eleven years in law enforcement as a Deputy Sheriff. Over those eleven years my assignments ranged from working a shift in the jail (interesting stories), to Assistant Director of the Training Academy, then Personnel Officer (even more interesting stories), and finally, supervisory and management positions (as many headaches as stories). To say that it was as much an education as a job is to put it mildly. I learned useful lessons about human nature and particularly about leadership.

One of the things that I learned is that leadership and management (they are related, but separate, things) have patterns and anti-patterns associated with them. Just as in the realm of software development, it can be difficult to distinguish between what’s a pattern and what’s an anti-pattern (there’s an interesting discussion of this topic in the classic “Big Ball of Mud”). Hammering a square peg into a round hole “works”, albeit sub-optimally. Pattern or anti-pattern?

One pattern/anti-pattern from my time with the Sheriff’s Office is what I call “The Growler”. A high-ranking member of the department was a master of this technique. When approached for something, particularly when the something in question was a signature on a purchase requisition, the default response was a profanity-laced growl (the person in question had retired from the Navy as a senior NCO) demanding to know why he should grant the request. This was extremely daunting, but I learned that the correct response was to growl back. When he growled, “%$@$ a !#&^ing $@!#*. More $%&^ing computer stuff, why the @#*& do you need this?”, I would answer, “You know when you ask me a question and I respond in five minutes instead of three hours”. This would result in a shake of his head, a “Yeah, yeah”, and most importantly, a signature.

More than just an endearing quirk of his character, it was a triage technique. If the person who wanted something tucked tail and ran, it wasn’t important. If, however, the person stood their ground, then he would put forth the effort to make a decision.

Right up front, I should make it clear that I don’t recommend this technique. First and foremost, Human Resources finds “salty” language even less endearing today than they did twenty-five plus years ago, and they weren’t crazy about it then. There’s also a big problem in terms of false negatives.

Most of my coworkers back in my badge and gun days were not shy, retiring types. Consequently, I never saw the technique backfire for him. Later on, though, I did see it fail for an IT manager (and yes, while gruff, he was significantly less “salty” than the one at the Sheriff’s Office). This manager had a subordinate who would retreat no matter how valid the need. Consequently, that subordinate’s unit, one that several of us were dependent on, was always under-staffed and under-equipped. When his people attended training, it was because someone else had growled back for him. It was far from an optimal situation.

While not quite as bad as the “shoot the messenger” anti-pattern I touched on recently, “The Growler” comes close. By operating on a principle of fear, you can introduce a gap in your communications and intelligence network that you rely on (whether you know it or not) to get the information you need in a timely manner.

Fear encourages avoidance and no news now can be very bad news later.