Enterprise Architecture and the Business of IT

Turning Gears Animation

I’ve been following Tom Graves and his Tetradian blog for quite a while. His view of Enterprise Architecture (EA), namely that it is about the architecture of the enterprise and not just the enterprise’s IT systems, is one I find compelling. With some encouragement on Tom’s part, I’ve begun touching on the topic of EA, although in a limited manner. When it comes to enterprise architecture, particularly according to Tom’s definition, I consider myself more of a student than anything else. I design software systems and systems of systems, not the enterprises that make use of them.

However…

I’m finding myself drawn to the topic more and more these days. I’m finding it more and more relevant to my work. The fractal nature of social systems using software systems is a major theme of the category “Organizations as Systems” on this site. If the parts fit poorly, then the operation of the system they comprise will be impeded. A good way to ensure the parts fit poorly is to fail to understand the context they will inhabit. So while I may not be designing the social system my systems fit into, an understanding of how that system functions is invaluable, as is an understanding of how my social system (IT) interacts with my client’s social system (the rest of the business) to further the aims of the enterprise.

Tom’s latest post, “Engaging stakeholders in health-care IT”, is an excellent example of how not to do things. In the post, Tom discusses attending a conference on IT in healthcare where the players had no real knowledge about healthcare. They couldn’t even identify the British Medical Journal or the Journal of the American Medical Association, much less have an idea about what issues might be found in the pages of those publications. They didn’t feel that was a problem:

Well, they didn’t quite laugh at me to my face about that, but in effect it was pretty close – a scornful dismissal, at best. In other words, about the literally life-and-death field for which they’d now proclaimed themselves to be the new standard-bearers, they were not only clueless, but consciously clueless, even intentionally clueless – and seemingly proud of it, as well. Ouch… At that point the Tarot character of ‘The Fool’ kinda came to mind – too self-absorbed to notice that he’s walking straight over a cliff

While I recently advocated embracing ignorance, it was in the context of avoiding assumption. The architect will know less about the business than a subject matter expert who will most likely know less than the end user on the pointy end of the spear. The idea is not to remain ignorant of the domain, but to avoid relying on our own understanding and seek better sources of information. It’s unreasonable to think we can design a solution that’s fit for purpose without an understanding of the architecture of the problem, and make no mistake, providing solutions that are fit for purpose is the business of IT.

Ignorance Isn’t Bliss, Just Good Tactics

Donkey

There’s an old saying about what happens when you assume.

The fast lane to asininity seems to run through the land of hubris. Anshu Sharma’s TechCrunch article, “Why Big Companies Keep Failing: The Stack Fallacy”, illustrates this:

Stack fallacy has caused many companies to attempt to capture new markets and fail spectacularly. When you see a database company thinking apps are easy, or a VM company thinking big data is easy  — they are suffering from stack fallacy.

Stack fallacy is the mistaken belief that it is trivial to build the layer above yours.

Why do people fall prey to this fallacy?

The stack fallacy is a result of human nature — we (over)value what we know. In real terms, imagine you work for a large database company and the CEO asks, “Can we compete with Intel or SAP?” Very few people will imagine they can build a computer chip just because they can build relational database software, but because of our familiarity with building blocks of the layer up, it is easy to believe you can build the ERP app. After all, we know tables and workflows.

The bottleneck for success often is not knowledge of the tools, but lack of understanding of the customer needs. Database engineers know almost nothing about what supply chain software customers want or need.

This kind of assumption can cost an ISV a significant amount of money and a lot of good will on the part of the customer(s) they attempt to disrupt. Assumptions about the needs of the customer (rather than the customer’s customer) can be even more expensive. The smaller your pool of customers, the more damage that’s likely to result. Absent a captive customer circumstance, incorrect assumptions in the world of bespoke software can be particularly costly (even if only in terms of good will). Even comprehensive requirements are of little benefit without the knowledge necessary to interpret them.


This would seem to pose a dichotomy: domain knowledge as both something vital and an impediment. In reality, there’s no contradiction. As the old saying goes, “a little knowledge is a dangerous thing”. When we couple that with another cliché, “familiarity breeds contempt”, we wind up with Sharma’s stack fallacy, or as xkcd expressed it:

'Purity' on xkcd.com

In order to create and evolve effective systems, we obviously have a need for domain knowledge. We also have a need to understand that what we possess is not domain knowledge per se, but domain knowledge filtered through (and likely adulterated by) our own experiences and biases. Without that understanding, we risk what Richard Martin described in “The myopia of expertise”:

In the world of hyperspecialism, there is always a danger that we get stuck in the furrows we have ploughed. Digging ever deeper, we fail to pause to scan the skies or peer over the ridge of the trench. We lose context, forgetting the overall geography of the field in which we stand. Our connection to the surrounding region therefore breaks down. We construct our own localised, closed system. Until entropy inevitably has its way. Our system then fails, our specialism suddenly rendered redundant. The expertise we valued so highly has served to narrow and shorten our vision. It has blinded us to potential and opportunity.

The Clean Room pattern on CivicPatterns.org puts it this way:

Most people hate dealing with bureaucracies. You have to jump through lots of seemingly pointless hoops, just for the sake of the system. But the more you’re exposed to it, the more sense it starts to make, and the harder it is to see things through a beginner’s eyes.

So, how do we get those beginner’s eyes? Or, at least, how do we get closer to having a beginner’s eyes?

The first step is to reject the notion that our own understanding of the problem space is sufficient. Lacking innate understanding, we must then do the hard work of determining what the architecture of the problem, our context, is. As Paul Preiss noted, this doesn’t happen at a desk:

Architecture happens in the field, the operating room, the sales floor. Architecture is business technology innovation turned to strategy and then executed in reality. Architecture is reducing the time it takes to produce a barrel of oil, decreasing mortality rates in the hospital, increasing product margin.

Being willing to ask “dumb” questions is just as important. Perception without validation may be just an assumption. Seeing isn’t believing. Seeing, and validating what you’ve seen, is believing.

It’s equally important to understand that validating our assumptions goes beyond just asking for requirements. Stakeholders can be subject to biases and myopic viewpoints as well. While it’s true that Henry Ford’s customers would probably have asked for faster horses, it’s also true that, in a way, that’s exactly what he delivered.

We earn our money best when we learn what’s needed and synthesize those needs into an effective solution. That learning is dependent on communication unimpeded by our pride or prejudice: