A question on LinkedIn recently reminded me that, as the team leader for Segment Architecture in my former EA team, I was accountable for identifying a core set of deliverables for the team. The idea was that we could focus on defining standard formats and contents for these deliverables and, in doing so, we could start to measure both our output and our quality.
We only created pre-canned templates for a few of them. This is partly because the team was not mature enough in its practices to achieve consistency, and partly because Enterprise Architecture itself is not mature, or accepted, enough to have stakeholders who would notice whether our deliverables met an objective standard. Also, this list is not intended to be comprehensive. The goal was to describe deliverables where it might make sense to aim for some level of consistency. Any EA could (and often did) create deliverables that were not on the list.
Perhaps it is time to share what we came up with.
Note that this list is the result of a single team doing its work, and is not representative of any “standards” effort across other EA groups. That said, I stand behind this list. I think it is a useful start. Note that many of the deliverables are technical in nature. I did not, in making this list, differentiate between BA and EITA deliverables. So if you are someone who believes that EA = BA + EITA, then you will see both sets of deliverables, intermixed, in the list below. If you are someone offended by the inclusion of technical architecture deliverables in an EA list… tough. I was working with reality.
For each deliverable, I have listed why we created it and what it contains.

Architectural Point of View (or Technical Policy)
Why create it: Provide clear input to Business or IT leadership on issues relevant to Enterprise Architecture.
What it contains: Short document describing a problem that requires attention and the opinion of EA on solving it.

Architectural Reference Model (or Architectural Pattern)
Why create it: Provide clear input to IT project SMEs on optimal or preferable design options.
What it contains: Short document describing a set of concerns and a proven approach for addressing them.

<Segment> Current State Model and Analysis
Why create it: To demonstrate and communicate challenges inherent in current processes / systems / information.
What it contains: A collection of architectural models, including a context model, process models, and information models, as they are understood to currently exist, plus an analysis of issues and risks.

<Segment> Future State Vision and Model
Why create it: To demonstrate the design of the future processes / systems / information needed by strategic intent.
What it contains: A collection of architectural models that reflect a specific set of engineered changes.

Governance Model and Analysis
Why create it: Clarify roles, responsibilities, and decision-making processes for planning and oversight of initiatives.
What it contains: Process model, description of roles and responsibilities, and description of deliverables needed for planning, oversight, and governance, along with implementation ROI and plans.

M&A Business Case & Analysis
Why create it: To provide a rationale for the acquisition of a company for the purpose of improving operational effectiveness (M&A).
What it contains: Document containing the rationale, including competitive analysis, SWOT / TOWS analysis, and strategic alternatives analysis.

System Integration Recommendations Document
Why create it: To set a vision for how key processes and systems shall be integrated into enterprise infrastructure (primarily M&A).
What it contains: End-to-end business scenarios, process and system integration points, risks and issues for each integration concern, and an analysis of alternatives and recommendations.

Value Chain and Operating Model Analysis
Why create it: To clearly address gaps and strategic requirements for integrating or divesting a set of processes and/or systems (primarily M&A).
What it contains: Target value chain and operating model for the post-M&A future state; mappings of key processes to or from the enterprise core diagram, and analysis of changes with the intent of composing key initiatives.

Enterprise Core Diagram
Why create it: To clearly declare the processes and systems that are NOT core to the operations of the enterprise.
What it contains: A list of systems and processes grouped into “ecosystems” that are clearly indicated as “core” or “edge”, with an analysis of governance.

EARB Engagement Package
Why create it: To demonstrate project-level architectural quality to the EA Review Board.
What it contains: A pre-defined collection of project architectural models and artifacts.

Capability Model and Assessment
Why create it: Provide a clear basis for data collection for a segment.
What it contains: List of capabilities for a segment with an assessment of capability maturity, etc.

Capability Gap Analysis
Why create it: Highlight underperforming capabilities to focus investment.
What it contains: Map of capabilities needed by strategies, highlighting those needing investment, and listing relative and absolute program spend against each.

<Segment> Roadmap (a.k.a. Transition Plan)
Why create it: To clarify the scope, timing, and dependencies between initiatives needed to deliver on a strategy.
What it contains: List of proposed initiatives and the dependencies between them to deliver on strategic intent.

Strategy Map and/or Balanced Scorecard
Why create it: To clarify the strategies, goals, and objectives of a segment and allow for measurement and alignment.
What it contains: Categorized strategies, measures, and metrics for a specific timeframe and business scope.

<Segment> Process Model and Analysis
Why create it: To clarify and build consensus on the business processes (as-is or to-be), and as input to process improvement / measurement.
What it contains: Models of processes, activities, information assets, and system interaction points, with an analysis of opportunities to improve.

Enterprise Scenario and Analysis
Why create it: To get clarity on the experience of a key stakeholder (often a customer or partner).
What it contains: Textual and diagrammatic description of an experience, often with analysis to indicate opportunities.

<Segment> Information Model and Analysis
Why create it: To improve understanding of requirements and the rationalization of design.
What it contains: Well-constructed information model, at one or more well-defined levels of abstraction, covering all aspects of a segment, aligned with the EDM, along with an analysis of risks and issues.

Why create it: Capture the ability of an app or platform to meet strategic needs.
What it contains: Collection of measurements, attributes, and mappings to an app or platform.

Proof of Concept (POC) Delivery
Why create it: To create a design that demonstrates, and proves, an approach for solving difficult issues.
What it contains: A software deliverable and an architectural reference model (see above).

Record of Architectural Tradeoffs
Why create it: To clearly communicate the tradeoffs made by architects on the customer’s behalf.
What it contains: Textual description of architectural decisions and the implications for the owner of the process / tool.
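To make the "measurable deliverables" idea concrete, here is a rough sketch of how a Capability Gap Analysis could be expressed as structured data plus a simple computation. This is purely illustrative and not from our actual templates; all names, fields, and figures are invented:

```python
# Hypothetical sketch of a Capability Gap Analysis (illustrative only).
from dataclasses import dataclass


@dataclass
class Capability:
    name: str
    maturity: int        # assessed maturity today, 1 (low) to 5 (high)
    target: int          # maturity level the strategy requires
    annual_spend: float  # absolute program spend against this capability


def gap_analysis(capabilities):
    """Return capabilities falling short of their strategic target,
    worst gap first, with each one's relative share of total spend."""
    total = sum(c.annual_spend for c in capabilities) or 1.0
    gaps = [c for c in capabilities if c.maturity < c.target]
    gaps.sort(key=lambda c: c.target - c.maturity, reverse=True)
    return [(c.name, c.target - c.maturity, c.annual_spend / total)
            for c in gaps]


# Invented example data.
caps = [
    Capability("Order Management", maturity=2, target=4, annual_spend=300_000),
    Capability("Customer Analytics", maturity=3, target=3, annual_spend=200_000),
    Capability("Partner Onboarding", maturity=1, target=2, annual_spend=100_000),
]

for name, gap, share in gap_analysis(caps):
    print(f"{name}: gap={gap}, share of spend={share:.0%}")
```

The point is not the code itself but that a deliverable defined this way can be extracted from a tool, measured, and kept current, rather than frozen in a document.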
If you would like to follow the TOGAF framework, what would you modify in this list of deliverables?
Very useful list, Nick - especially with the emphasis on the 'Why' for each of the deliverables. Many thanks for this!
@Nick: "If you are someone offended by the inclusion of technical architecture deliverables in an EA list… tough. I was working with reality."
I strongly agree and accord with the "I was working with reality" comment. Yet does anyone these days think that technical-architecture deliverables should _not_ belong in an EA list? - especially for a tech-oriented company? That would be the building-architect's equivalent of saying that wiring or plumbing specifications should not be part of that architecture - which would be daft...
(I know some people might accuse me of doing so in some of my work, but all I've ever done is assert that EA should not be _solely_ about IT - which is very different from saying that EA should not include IT!)
There are a couple of deliverables in this list that are "combined" into a single deliverable in TOGAF (the Architecture Definition Document). We delivered those parts separately because our process didn't match the business processes that TOGAF assumes, which meant vastly different timings for these elements. Assuming that they would be delivered in a single package is naïve.
In addition, there is a wide range of deliverables in TOGAF that is not listed here. TOGAF assumes that there will be a formal "request for architectural services," which is a model that I have never seen work in practice. As a result, many of the elements of the ADM are described in detail that is simply irrelevant.
This is not to say that TOGAF cannot be implemented effectively. I believe that it can. I simply have not seen it.
Also, the list of deliverables above is more detailed in those areas where a specific element can be leveraged as a kind of independent opinion and can therefore be delivered, validated, used, and managed outside of an overall process.
Thanks for your kind words.
I think that there are many "kinds" of people who participate in discussions of EA. People who actually practice EA, people who think about (and teach about) EA, and people who are outside of EA and want to define the boundaries between themselves and EA.
The first group ALWAYS includes technology deliverables.
The second group argues interminably about whether technology deliverables should be included.
The third group often uses the existence of technical deliverables as a weapon to "prove" the position that they want to prove.
Clearly, I'm aiming this entry at actual enterprise architects. The rest are interesting and important, but this post is not going to answer their questions.
Interesting that so many of these artefacts are "documents" or some other sort of static artefact (and even the models in this list are not usually integrated with one another). It means that there is a huge overhead in keeping these things aligned and up to date, so that in practice, enterprise architecture becomes a massive content management project (hence the popularity of tools like Orbus iServer).
My experience is that enterprise architecture is, and always will be, swamped in administrative catch-up until we are able to construct holistic, dynamic, enduring business models that are completely integrated and aligned.
A Business Architecture Ontology springs to mind ;-)
(... and no John, the Zachman Framework is NOT an Ontology)
The list was specifically a set of measurable deliverables. The fact that something is delivered does not mean that it suddenly freezes in time. It just means that it has reached a point of maturity sufficient to be useful in a context.
In our case, the delivery of a model was expected to be done by extracting the model from our EA tool. They were expected to be integrated models that evolve over time.
For content management, we would collect deliverables in SharePoint as evidence of valuable activity, but not to demonstrate that we had a valuable cache of information. The valuable cache was the model, not the SharePoint site. The model was maintained in our EA tool, not SharePoint or Visio.
Perhaps the ontology you are thinking of can be the EBMM? (www.motivationmodel.com)