Systems of Social Systems and the Software Systems They Create

I’ve mentioned before that looking at organizations as systems is an idea I’ve been focusing on for quite a while now. From a top-down perspective, this makes sense – an organization is a system that works better when its component parts (both machine and human) intentionally work together.

It also works from the bottom up. For example, from a purely technical perspective, we have a system:

Generic System

However, without considering those who use the system, we have a limited picture of the context the system operates within. The better we understand that context, the better we can shape the system to fit it; otherwise we risk a square-peg-in-a-round-hole situation:

Generic System with Users

Of course, the users who own the system are also only a part of the context. We have to consider the customers as well:

Generic System with Users and customers

Likewise, we need to consider that the customers of some systems can be internal to the organization while others are external. Some of the “customers” may not even be human. For that matter, sometimes the customer’s interface might be a human (user) rather than software. Things get complicated when we begin adding in the social systems:

Generic EITA with Users and customers

The situation is even more complicated than what’s seen above. We need to account for the team developing and operating the automated system:

Generic System with Users, customers, and IT team

And if that team is not a unified whole, then the picture gets a whole lot more interesting:

Generic System with Users, customers, and IT teams

Zoomed out to the enterprise level, that’s a lot of social systems. Multiplied by the number of automated systems involved, the count easily becomes staggering. What’s even more sobering is reflecting on whether those interactions have been intentionally structured or have just grown organically over time. The interrelationship of social and software systems is under-appreciated; a series of tweets from Gregory Brown last week made the same case.
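To get a rough sense of the scale involved, consider pairwise relationships alone: among n systems (social and software), there are up to n(n-1)/2 potential interactions. A minimal sketch in Python, with purely hypothetical counts:

```python
def pairwise_interactions(n: int) -> int:
    """Maximum number of distinct pairs among n systems: n(n-1)/2."""
    return n * (n - 1) // 2

software_systems = 50  # hypothetical application portfolio
social_systems = 30    # hypothetical teams, departments, customer groups

total = software_systems + social_systems
print(pairwise_interactions(total))  # 3160 potential interactions to account for
```

Even before considering that many of these interactions are themselves dynamic, the bookkeeping alone argues against assuming that someone, somewhere, is tracking them by default.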

A number of questions come to mind:

  • Is anyone aware of all the systems (social and software) in play?
  • Is anyone aware of all the interactions between these systems?
  • Are the relationships and interactions a result of intentional design or have they “just happened”?
  • Are you comfortable with the answers to the first three questions above?

Organizations as Systems and Innovation

Portrait of Gustavus Adolphus of Sweden

Over the last year or so, the concept of looking at organizations as systems has been a major theme for me. Enterprises, organizations and their ecosystems (context) are social systems composed of a fractal set of social and software systems. As such, enterprises have an architecture.

Another long-term theme for this site has been my conversation with Greger Wikstrand regarding innovation. This post is the thirty-fifth entry in that series.

So where do these two intersect? And why is there a picture of a Swedish king from four hundred years ago up there?

Innovation, by its very nature (“…significant positive change”), does not happen in a vacuum. Greger’s last post, “Innovation arenas and outsourcing”, illustrates one aspect of this. Shepherding ideas into innovations is a deliberate activity requiring structural support. Being intentional doesn’t turn bad ideas into innovations, but lack of a system can cause an otherwise good idea to wither on the vine.

Another intersection, the one I’m focusing on here, can be found in the nature of innovation itself. It’s common to think of technological innovation, but innovation can also be found in changes to organizational structure and processes (e.g. Henry Ford and the assembly line). Organization, process, and technology are not only areas for innovation; coupled with people, they form the primary elements of an enterprise architecture. It should be clear that the more these elements are intentionally coordinated toward a specific goal, the more cohesive the effort will be.

This brings us to Gustavus Adolphus of Sweden. In his twenty years on the throne, he converted Sweden into a major power in Europe. Militarily, he upended the European status quo in a very short time (after intervening in the Thirty Years’ War in 1630, he was killed in battle in 1632) by marshaling organizational, procedural, and technological innovations:

The Swedish army stood apart from its contemporaries through five characteristics. Its soldiers wore uniform and had a nucleus of native Swedes, raised from a surprisingly diplomatic system of conscription, at its core. The Swedish regiments were small in comparison to their opponents and were lightly equipped for speed. Each regiment had its own light and mobile field artillery guns called ‘leathern guns’ that were easy to handle and could be easily manoeuvred to meet sudden changes on the battlefield. The muskets carried by these soldiers were of a type superior to that in general use and allowed for much faster rates of fire. Swedish cavalry, instead of galloping up to the enemy, discharging their pistols and then turning around and galloping back to reload, ruthlessly charged with close quarter weapons once their initial shot had been expended. By analysing this paradigm it becomes apparent that the army under Gustavus emphasized speed and manoeuvrability above all – this greatly set him apart from his opponents.

By themselves, none of the innovations were original to Gustavus. Combining them, however, was – and European military practice was irrevocably changed. Inflection points can be dependent on multiple technologies catching up with one another (since the future is “…not very evenly distributed”), but in this case the pieces were all in place. The catalyst was someone with the vision to combine them, not random chance.

Emergence will be a factor in any complex system. That being said, the inevitability of those emergent events does not invalidate intentional design and planning. If anything, design and planning are more necessary to deal with the mundane, foreseeable things in order to leave more cognitive capacity for dealing with what can’t be foreseen.

Learning Organizations: When Wrens Take Down Wolfpacks

A Women's Royal Naval Service plotter at work in the Operations Room at Derby House in Liverpool, the headquarters of the Commander-in-Chief Western Approaches, September 1944.

What does the World War II naval campaign known as the Battle of the Atlantic have to do with learning and innovation?

Quite a lot, as it turns out. Early in the war, Britain found itself in a precarious position. While being an island nation provided defensive advantages, it also came with logistical challenges. Food, armaments, and other vital supplies, as well as reinforcements, had to come to it by sea. The shipping lanes were heavily threatened, primarily by the German u-boat (submarine) fleet. With Britain needing more than a million tons of imports per week, maintaining the flow of goods was a matter of survival.

Businesses may not have to worry about literal torpedoes severing their lifelines, but they are at risk due to a number of factors. Whether it’s changing technology or tastes, competitive pressures, or even criminal activity, organizations cannot afford to sit idle. In his post “Heraclitus was wrong about innovation”, Greger Wikstrand talked about the mismatch between the speed of change (high) and the rate of innovation (not fast enough). This is a recurrent theme in our ongoing discussion of innovation (we’ve been trading posts on the subject for over a year now).

The British response to the threat involved many facets, but an article I saw yesterday about one response in particular struck a chord. “The Wargaming “Wrens” of the Western Approaches Tactical Unit” told the story of a group of officers and ratings of the Women’s Royal Naval Service (nicknamed “Wrens”) who, under the command of a naval officer, Captain Gilbert Roberts, revolutionized British anti-submarine warfare (ASW). Their mandate was to “…explore and evaluate new tactics and then to pass them on to escort captains in a dedicated ASW course”.

Using simulation (wargaming) to develop and improve tactics was an unorthodox proposition, particularly in the eyes of Admiral Percy Noble, who was responsible for Britain’s shipping lifeline. However, Admiral Noble was capable of appreciating the value of unorthodox methods:

A sceptical Sir Percy Noble arrived with his staff the next day and watched as the team worked through a series of attacks on convoy HG.76. As Roberts described the logic behind their assumptions about the tactics being used by the U-Boats and demonstrated the counter move, one that Wren Officer Laidlaw had mischievously named Raspberry, Sir Percy changed his view of the unit. From now on the WATU would be regular visitors to the Operations Room and all escort officers were expected to attend the course.

Each of the courses looked at ASW and surface attacks on a convoy and the students were encouraged to take part in the wargames that evaluated potential new tactics. Raspberry was soon followed by Strawberry, Gooseberry and Pineapple and as the RN went over to the offensive, the tactical priority shifted to hunting and killing U-Boats. Roberts continued as Director of WATU but was also appointed as Assistant Chief of Staff Intelligence at Western Approaches Command.

This type of learning culture, such as I described in “Learning to Deal with the Inevitable”, was key to winning the naval war. Clinging to tradition would have led to a fatal inertia.

One aspect of the WATU approach that I find particularly interesting is the use of simulation to limit risk during learning. Experiments involving real ships cost real lives when they don’t pan out. Simulation (assuming sufficient validity of the theoretical underpinnings of the model used) is a technique that can be used to explore more without sending costs through the roof.
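As a minimal sketch of the principle (not of WATU’s actual games – the tactics, probabilities, and payoffs below are entirely invented), simulation lets you compare options across thousands of trials at the cost of compute rather than ships:

```python
import random

# Purely hypothetical model: each tactic has an assumed probability of
# the escorts prevailing in a given U-boat encounter.
TACTICS = {"raspberry": 0.6, "head-on rush": 0.4}

def simulate(tactic: str, encounters: int = 10_000) -> float:
    """Fraction of simulated encounters in which the escorts prevail."""
    p = TACTICS[tactic]
    wins = sum(random.random() < p for _ in range(encounters))
    return wins / encounters

for name in TACTICS:
    print(f"{name}: ~{simulate(name):.0%} success rate in simulation")
```

The model’s validity is the whole game, of course – garbage assumptions simulate garbage – which is why WATU paired its games with careful reasoning about how the U-boats actually operated.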

I fought the law (of unintended consequences) and the law won

Sometimes, what seemed to be a really good idea just doesn’t turn out that way in the end.

In my opinion, a lack of a systems approach to problem solving makes that type of outcome much more likely. Simplistic responses to issues that fail to deal with problems holistically can backfire. Such ill-considered solutions not only fail to solve the original problem, but often set up perverse incentives that can lead to new problems.

An article on the Daily WTF last week, “Just the fax, Ma’am”, illustrates this perfectly. In the article, an inflexible and time-consuming database change process (layered on top of the standard change management process) leads to the “reuse” of an existing, but obsolete, field in the database. Using a field labeled “Fax” for an entirely different purpose is far from “best practice”, but following the rules would mean being seen as responsible for delaying a release. This is an example of a moral hazard, such as Tom Cagley discussed in his post “Some Moral Hazards In Software Development”. Where the cost of taking a risk is not borne by the party deciding whether to take it, potential for abuse abounds. The risk is particularly acute when the person taking shortcuts can claim a “moral” rationale for doing so (such as “getting it done” for the customer).
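A hypothetical sketch of the anti-pattern (the record and field names here are invented for illustration, not taken from the article):

```python
# Invented example: the "fax" field is dead, but adding a proper column
# means weeks in the change-process queue...
customer = {
    "id": 42,
    "name": "Acme Corp",
    "fax": "",  # obsolete; nobody has faxed anything in years
}

# ...so new data gets smuggled into the dead field instead. The release
# ships on time; the data model quietly stops meaning what it says.
customer["fax"] = "preferred_contact=after_5pm"
```

Every future maintainer now has to know the folklore that “fax” isn’t a fax number. The shortcut’s cost didn’t disappear; it was transferred to someone else.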

None of this is to suggest that change management isn’t a worthy goal. In fact, the worthier the goal, the greater the danger of creating an unintended consequence because it’s so easy to conflate argument over means with disagreement regarding the ends. If you’re not in favor of being strip-searched on arrival and departure from work, that doesn’t mean you’re anti-security. Nonetheless, the danger of that accusation being made will likely resonate for many. When the worthiness of the goal forestalls, or even just hinders, examination of the effectiveness of methods, then that effectiveness is likely to suffer.

Over the course of 2016, I’ve published twenty-two posts, counting this one, with the category Organizations as Systems. The fact that social systems are less deterministic than software systems only reinforces the need for intentional design. When foreseeable abuses are not accounted for, their incidence becomes more likely. Whether the abuse results from personal pettiness, doctrinal disagreements, or just clumsy design like the change management process described above is irrelevant. In all of those cases, the problem is the same: decreased respect for institutional norms. Studies have found that “…corruption corrupts”:

Gächter has long been interested in honesty and how it manifests around the world. In 2008, he showed that students from 16 cities, from Riyadh to Boston, varied in how likely they were to punish cheaters in their midst, and how likely those cheaters were to then retaliate against their castigators. Both qualities were related to the values of the respective cities. Gächter found that the students were more likely to tolerate free-loaders and retaliate against do-gooders if they came from places whose citizens took a more relaxed view on tax evasion or fare-dodging, or had less trust in their courts and police.

If opinions around corruption and rule of law can affect people’s reactions to dishonesty, Gächter reasoned that they surely affect how honest people are themselves. If celebrities cheat, politicians rig elections, and business leaders engage in nepotism, surely common citizens would feel more justified in cutting corners themselves.

Taking a relaxed attitude toward the design of a social system can result in its constituents taking a relaxed attitude toward those aspects of the system that are inconvenient to them.

Strategic Tunnel Vision

Mouth of a Tunnel


Change and innovation are topics that have been prominent on this blog over the last year. In fact, Greger Wikstrand and I have traded a total of twenty-six posts (twenty-seven counting this one) on the subject.

Greger’s last post, “Successful digitization requires focus on the entire customer experience – not just a neat app” (it’s in Swedish, but it translates well to English), discussed the critical nature of customer experience to digital innovation. According to Greger, without taking customer experience into account:

You can make the world’s best app without gaining more, more satisfied, or more profitable customers. It’s like trying to make a boring game more exciting by spraying gold paint on the playing pieces.

Change and innovation are not the same thing. Change is inevitable; innovation is not (with a h/t to Tom Cagley for that quote). As Greger pointed out in his latest article, to get improved customer experience, you need depth. Sprinkling digital fairy dust over something is not likely to result in innovation. New and different can be really great, but new and different solely for the sake of new and different doesn’t win the prize. Context is critical.

If you’ve read more than a couple of my posts, you’ve probably realized that among my rather varied interests, history is a major one. I lean heavily on military history in particular when discussing innovation. This post won’t break with that tradition.

The blog Defense in Depth, operated by the Defence Studies Department, King’s College London, has published two posts this week dealing with the Suez Crisis of 1956, primarily in terms of the Anglo-French forces. One deals with the land operations and the other with naval operations. They struck a chord because they both illustrated how an overreaction to change can have drastic consequences from the strategic level down to the tactical.

Buying into a fad can be extremely expensive.

The advent of the nuclear age at the end of World War II dramatically transformed military and political thought. The atomic bomb was the ultimate game-changer in that respect. In time-honored tradition, the response was over-reaction. “Atomic” was the “digital” of the late 40s into the 60s. They even developed a recoilless gun that could launch a 50-pound nuclear warhead 1.25 to 2.5 miles. “Move fast and break things” was serious business back in the day.

This extreme focus on what had changed, however, led to a rather common problem: tunnel vision. Nuclear capability became such an overarching consideration that more conventional capabilities were neglected, leaving the UK’s forces seriously hampered in their ability to perform their mission effectively. Misguided thinking at the strategic level affected operations all the way down to the lowest tactical formations.

It’s easy to imagine present-day IT scenarios that fall prey to the same issues. A cloud or digital initiative given top priority without regard to maintaining necessary capabilities could easily wind up failing in a costly manner and impairing the existing capability. It’s important to understand that time, money, and attention are finite resources. Adding capability requires increasing the resources available for it, either through adding new resources or freeing up existing ones by reducing the commitment to less important capabilities. If there is no real appreciation of what capabilities exist and what the relative value of each is, making this decision becomes a shot in the dark.
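A trivial way to make that constraint concrete (the portfolio and numbers below are invented): if total capacity is fixed, funding a new capability has to come from somewhere, and making the trade explicit is what keeps the decision from being a shot in the dark.

```python
# Hypothetical capability portfolio; effort shares must sum to 1.0.
portfolio = {"run-the-business": 0.60, "conventional apps": 0.30, "slack": 0.10}

def add_capability(portfolio: dict, name: str, share: float, take_from: str) -> None:
    """Fund a new capability by explicitly reducing an existing one."""
    if portfolio[take_from] < share:
        raise ValueError(f"not enough capacity in {take_from!r} to fund {name!r}")
    portfolio[take_from] -= share
    portfolio[name] = share

add_capability(portfolio, "cloud initiative", 0.25, take_from="conventional apps")
print(portfolio)  # the cost of the new initiative is visible, not hidden
```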

Situational awareness across all levels is required. To be effective, that awareness must integrate changes to the context while not losing sight of what already was. Otherwise, to use a metaphor from my high school football days, you risk acting like a “blind dog in a meat-packing plant”.

Organizations as Systems – “Uneasy Lies the Head that Wears the Crown”

Bavarian Crown and Regalia, Royal Treasury Munich


One of the benefits of having a (very) wide range of interests is that every so often a flash of insight gets dropped into my lap. In this case, it was a matter of “We must recognise that single events have multiple causes” showing up as a suggested read from Aeon on the same day that Thomas Power retweeted this.

The image in the tweet is an excerpt from an interview with Rory Stewart, Conservative Member of Parliament for Penrith in the UK. The collision of themes between the two articles struck me.

“You get there and you pull the lever, and nothing happens.”

The behavior of a system is determined not by the structure of the components of that system, but by the relationships and interactions between those components. Moreover, those relationships and interactions are dynamic and complex, even when that’s contrary to the designer’s intent. In fact, the gap between the behavior as intended and as experienced introduces a tension. I would argue that it’s less a matter of nothing happening when the “lever” is pulled and more that something different from what’s expected happens. Rather than simple cause and effect, “if this, then that”, multiple factors are in play.

In mechanical systems, parts wear, subtly changing the physics of the mechanism. Foreign objects invading the system can impose change in a more dramatic fashion. Context, both that of the system’s internals and its environment, influences its operation.

As was noted in the Aeon article, agency adds to the complexity. In social systems, all of the “components” are individuals with agency, making those systems chaotic in at least the colloquial sense of the word. Using Tom Graves’ sense-making framework, SCAN, these interactions fall into the more uncertain quadrants, either “Ambiguous but Actionable” or “Not-known, None-of-the-above”. Attempting to deal with them as though they fell into the “Simple and Straightforward” quadrant increases the likelihood of getting unexpected results.

Learning/sense-making is critical to dealing with change, whether internal or external (or both). The manner in which change is appreciated and reacted to affects the health of the system. Consider three boilers: one where pressure is continuously monitored and adjusted, one equipped with a pressure relief valve that will open prior to a catastrophic failure, and one where problems are signaled by an explosion. It’s a trivial exercise to come up with examples of social systems, from businesses all the way up to political systems, using the third method. It’s probably a more interesting exercise to consider why that’s the case for so many.
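A toy sketch of the three regimes (pressures and thresholds invented); the only difference between them is where in the curve each one intervenes:

```python
# Toy model of three boilers; all numbers are invented for illustration.
FAILURE_POINT = 100  # pressure at which the boiler explodes

def run(strategy: str, steps: int = 50) -> str:
    pressure = 0.0
    for _ in range(steps):
        pressure += 5  # steady buildup from normal operation
        if strategy == "monitor" and pressure > 60:
            pressure -= 10  # continuous adjustment keeps things nominal
        elif strategy == "relief_valve" and pressure > 90:
            pressure = 50   # valve vents just short of failure
        elif strategy == "explosion" and pressure >= FAILURE_POINT:
            return "boom"   # the problem is "signaled" catastrophically
    return f"survived at pressure {pressure:.0f}"

for strategy in ("monitor", "relief_valve", "explosion"):
    print(strategy, "->", run(strategy))
```

The third boiler isn’t cheaper to run; it just defers the entire cost to a single, unscheduled payment.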

In a recent post, “Architecting the shadows”, Tom Graves discussed the phenomenon of ad hoc, unofficial “shadow” organizational interactions that arise in order to get work done:

In SCAN terms, we could summarise the generic positioning of all ‘shadow’ functions – shadow-IT, shadow-business-models, shadow-management and more – much as follows:

Scan Diagram: Official vs. Shadow

In other words, the ‘shadow’-world exists to deal with and resolve all the uncertainties and over-simplifications that overly-mechanistic management models tend to overlook. Even in more aware management-models, in which some exploration of the uncertain is officially sanctioned and allowed, the shadow-world will still always need to exist – particularly whenever the work gets closer towards real-time action:

Scan Diagram: Official vs. Shadow showing sanctioned Shadow Activity

In closing the post, Tom makes the following observation:

As the literal ‘the architecture of the enterprise’, a real enterprise-architecture must, by definition, cover every aspect of the enterprise – including all of the ‘shadow’-elements. And yet, also by definition, those ‘shadow’-elements cannot be brought ‘under control’ – not least because they deal with the themes and factors that are beyond the reach of conventional concepts of ‘control’.

The “conventional concepts of ‘control’”, the deluded belief that complex interactions can be managed as though they were simple, poses an immense risk to organizations. Even attempting to treat those interactions as merely complicated, rather than complex, introduces a gap between reality and perception, between “the way we do things” and the way things actually get done. When the concept and reality of the system’s interactions differ, it’s more likely that the components of the system will wind up working at cross-purposes.

In a comment on Tom’s post, I noted that where the shadow elements are a “French Resistance”, flouting the rules in order to actually get work done, that’s a red flag.

The most important thing to learn about management and governance is knowing when and how to manage or govern and, more importantly, when not to. Knowing what can actually be controlled is an important first step.

When Will We Learn?

Plato's Academy mosaic from Pompeii

We’ve all heard the sayings about history repeating. Did we pay attention? Did we actually hear what was said, or were we just in the room when it was mentioned? Did we learn anything?

Greger Wikstrand and I have been trading posts on innovation for more than seven months. His last post, “Black hat innovation”, touched on the dark side of innovation:

I think the following are good examples of black hat innovation in the digital space: credit card fraud, ransomware and identity theft. There are many other black hat innovations that do not rely on tech, such as chain letters, counterfeit money and even weighted dice.

Greger noted, “Sometimes, it might seem as if black hat innovation outpaces white hat innovation.” Certainly, a black hat innovator faces fewer barriers to innovation. Following rules is less of a consideration when breaking rules. We also have to be aware of the advantage we give them when we fail to learn from the past. None of the abuse cases that Greger mentioned are black swans. They’re merely new ways to commit old crimes. Waiting to react to a foreseeable issue means we start from behind because we failed to learn.

Failure to learn can manifest in a variety of ways. In his post, “Enterprise challenges”, Peter Murchland noted:

All enterprises face challenges (of varying magnitude and complexity). These challenges are either problems to be solved or opportunities to be pursued. One way of considering these challenges is through the following three lenses – those arising due to:

  • external change
  • internal change
  • growth and development

Peter asserts that the health of an enterprise depends on integrating coherent responses to these challenges into its architecture. This is a position shared by Aaron Dignan in “How To Eliminate Organizational Debt”:

Organizational Debt: The interest companies pay when their structure and policies stay fixed and/or accumulate as the world changes.

Let’s unpack that. As time passes, companies create roles, structures, rules, policies, and other norms that become fixed, and often, difficult to change. This is by design. For example, a company’s travel budget may balloon one year, only to be restricted by a travel policy the next — a well intentioned control designed to reduce expense. If that policy starts costing more than it’s saving (e.g. by reducing commercial success due to a lack of face time, frustrating top talent, etc.), it becomes an unacknowledged debt. The “interest” comes in the form of reduced speed, capacity, engagement, flexibility, and innovation that ultimately undermine the macro objectives of the firm: to survive, thrive, and achieve its purpose.

In other words, failure to learn leads to failure to adapt to a changing context, internal and/or external. This mismatch between the enterprise and its context represents a destructive friction. Since an enterprise is a social system and the components of a social system are people, the effects of this friction erode morale. Disengaged, demotivated employees will hinder an organization’s ability to deliver effectively. The only question is how much of a hindrance it will be.

In my post “Learning to Deal with the Inevitable”, I talked about the value of a learning culture for an organization. Effective execution requires effective decision-making, which requires learning. If we haven’t learned from the past, then there’s no rational basis for our future actions. We’re winging it solely on the basis of hope.

An organization is unlikely to develop a learning culture by accident. Just as a tornado hitting an auto parts store is unlikely to spit out a sports car, it’s unlikely that a system of sensing and adjusting to an ever-changing environment will emerge without intentional action. The social system that is the enterprise has a design and requires design, much the way an automated system has a design and requires design. Part of that design should incorporate learning.

Different organizations will have different needs and different styles of learning. Getting groups of people together to play with technology (as described by Matt Ballantine in his post “The Art of Play”) may not work for every organization. However, it is shortsighted to make no provision for learning whatsoever. Even the British Army in World War I, a conservative institution with an aristocratic officer corps in a conflict that can be described as 19th century warfare with 20th century weapons, “…developed a number of different methods to disseminate knowledge, catering for a variety of different circumstances and needs”. With a hundred years’ worth of extra experience and technology, it would be hard to justify taking a less rigorous approach.