Gabriel Morgan

Sharing experience as an Enterprise Architect, Business Strategist, Business Performance Manager, Business Architect and Solution Architect. Twitter:@Gabriel_Morgan

The formula for Agility

Months ago I published a blog about how to implement system quality attributes (found here). In that article I asserted that Solution Architects should deliberately design systems for high quality, and I provided a means for doing so. Because the point of software, or so I assert, is to automate business processes and protect data quality with the goal of enabling an agile business, I'd like to propose a formula for determining System Agility.

So, how can we design systems to optimize System Agility in order to enable our agile business? Perhaps this isn't all that complicated if we use system quality attributes. Here's a stab at a formula for System Agility:

A = 2(F) + M + I + T + R - P - S

The aim of this formula is not to pretend that one could calculate System Agility with precision. Rather, it is an attempt to describe how System Agility can be achieved. The formula lets me describe the relationships among the key system quality attributes that optimize or degrade System Agility, and it leaves it to the Solution Architect to focus on those attributes and implement system designs that optimize or minimize each one.
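To make the relationships concrete (not to compute anything precise), here is a minimal Python sketch of the formula. The 0-to-10 scale and the example scores are hypothetical, invented only to show how two candidate designs might be compared:

```python
def system_agility(f, m, i, t, r, p, s):
    """Relative System Agility score: A = 2(F) + M + I + T + R - P - S.

    Each argument is a judgment call on an arbitrary scale (say 0-10);
    the result is only meaningful for comparing candidate designs,
    never as an absolute measurement.
    """
    return 2 * f + m + i + t + r - p - s

# Two hypothetical designs: one optimized for flexibility,
# one tuned for raw performance and heavy security.
design_a = system_agility(f=8, m=7, i=6, t=7, r=6, p=4, s=3)
design_b = system_agility(f=4, m=5, i=4, t=5, r=3, p=9, s=8)
```

Note how the doubled flexibility term and the subtracted performance and security terms dominate the comparison, which is exactly the intent of the formula.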

Before I explain each factor, let me cover the assumptions I've made:

  1. Agile businesses require Agile systems and the definition I use for System Agility is the ability of a system to be both flexible and undergo change rapidly [MIT ESD 2001].
  2. The list of system quality attributes listed at the end of this blog under the label System Quality Attribute Definitions is the finite set of system quality attributes. Yes, I know that this is a false assumption but humor me for a moment.
  3. I don't get into the complications of natural system quality dependencies. That is, to keep the formula very simple, I don't recognize the dependency tree of system quality attributes and their interrelationships; I let each system quality attribute stand on its own.
  4. For simplicity of my key messages in this blog, I focus on the software system only and don't address the organizational capabilities or infrastructure needed to properly deliver System Agility to an enterprise.

Let me take a moment and explain each factor in the equation.

(F) System Flexibility Factor. The most important system quality attribute for enabling System Agility is System Flexibility. System Flexibility enables systems to be used in different ways and to be modified for those different uses; it is the means for decoupling interactions between actors. In the context of System Agility, System Flexibility allows software to be reused in ways not thought of at the time of development. I published a blog on System Flexibility, found here, that directly addresses ways to optimize System Flexibility at the enterprise level. For this reason, I've assigned a multiplication factor to System Flexibility to emphasize its importance to System Agility.
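One common way to get the decoupling described above is to program against an abstraction rather than a concrete implementation. A small, hypothetical Python sketch (the notifier classes and the `alert_on_failure` function are invented for illustration):

```python
from typing import Protocol

class Notifier(Protocol):
    """Abstraction that decouples callers from any concrete delivery channel."""
    def send(self, message: str) -> str: ...

class EmailNotifier:
    def send(self, message: str) -> str:
        return f"email: {message}"

class SmsNotifier:
    def send(self, message: str) -> str:
        return f"sms: {message}"

def alert_on_failure(notifier: Notifier, job: str) -> str:
    # The caller depends only on the abstraction, so channels that were
    # never anticipated at design time plug in without changing this code.
    return notifier.send(f"job '{job}' failed")
```

A new channel (say, a chat integration) is just another class with a `send` method; the calling code is reused unmodified, which is the flexibility-to-agility link the paragraph describes.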

(M) System Maintainability Factor. System Maintainability is very important to agile software. The key here is that code optimized for System Maintainability is more likely to withstand change, whether from bugs that are found and fixed or from software that undergoes natural improvement. Software characteristics directly pertinent to System Maintainability include versioning, re-factored code, code complexity, code structure, etc.
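The re-factoring and code-complexity characteristics can be illustrated with a small, hypothetical Python sketch (the shipping functions and rates are invented; the point is the structural change, not the domain):

```python
# Before: branching logic that grows with every new case, raising
# code complexity and the cost of each future change.
def shipping_cost_v1(region: str, weight_kg: float) -> float:
    if region == "domestic":
        rate = 1.0
    elif region == "eu":
        rate = 2.5
    elif region == "international":
        rate = 4.0
    else:
        raise ValueError(f"unknown region: {region}")
    return rate * weight_kg

# After: the same behavior re-factored around a data table.
# Adding a region is a one-line change and the structure stays obvious,
# which is what "optimized for System Maintainability" buys you.
RATES = {"domestic": 1.0, "eu": 2.5, "international": 4.0}

def shipping_cost_v2(region: str, weight_kg: float) -> float:
    if region not in RATES:
        raise ValueError(f"unknown region: {region}")
    return RATES[region] * weight_kg
```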

(I) System Interoperability Factor. System Interoperability brings to the table a focus on interoperation between systems, which is arguably one of the most important concerns for enterprise application integration. Software characteristics directly pertinent to System Interoperability include service operations designed for uniqueness and extensibility, message schemas (including canonical schemas), and software design patterns that optimize for composability to support orchestration and workflow.
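The canonical-message-schema idea can be sketched in Python. This is a hypothetical example (the `OrderPlaced` message, its fields, and the wire format are invented): every producer and consumer agrees on one shape, so systems interoperate without point-to-point mappings.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class OrderPlaced:
    """Hypothetical canonical message. A schema_version field lets the
    schema extend over time without breaking existing consumers."""
    schema_version: str
    order_id: str
    amount_cents: int

def to_wire(msg: OrderPlaced) -> str:
    """Serialize to a deterministic JSON payload for the wire."""
    return json.dumps(asdict(msg), sort_keys=True)

def from_wire(payload: str) -> OrderPlaced:
    """Reconstruct the canonical message on the consuming side."""
    return OrderPlaced(**json.loads(payload))
```

Because both sides exchange only this canonical shape, a new consumer composes into an orchestration or workflow by parsing the same payload, rather than negotiating a bespoke format with each producer.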

(T) System Testability Factor. System Testability is important because it forces system designers to make deliberate system architecture decisions to ensure that the software produced can be easily tested for delivery. Especially as we see software designed from service-oriented architecture approaches and S+S needs, enabling these kinds of software requires intentional design to optimize testability. Software characteristics directly pertinent to System Testability include the ability to perform unit tests, customer tests, stress tests, exception tests, failover tests, function tests, security penetration tests, performance tests, system integration tests, regression tests, code coverage tests, etc.
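A small, hypothetical Python sketch of what "deliberate design for testability" can look like: the session manager below takes its clock as a constructor argument (the class and its behavior are invented for illustration), so expiry logic can be unit tested deterministically instead of by waiting.

```python
from datetime import datetime, timedelta
from typing import Callable

class SessionManager:
    """Accepting the clock as a dependency is a deliberate testability
    decision: tests can substitute a fake clock and drive time forward."""

    def __init__(self, ttl: timedelta,
                 now: Callable[[], datetime] = datetime.utcnow):
        self.ttl = ttl
        self.now = now
        self.sessions: dict[str, datetime] = {}

    def start(self, user: str) -> None:
        self.sessions[user] = self.now()

    def is_active(self, user: str) -> bool:
        started = self.sessions.get(user)
        return started is not None and self.now() - started < self.ttl
```

In production the default system clock is used; in a unit test, a fake clock makes "31 minutes later" a one-line assignment rather than a half-hour wait.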

(R) System Reusability Factor. System Reusability is an important system quality attribute that involves major architectural styles and patterns like the Service Layer pattern [Fowler 2003], basic software design principles such as encapsulation, and the familiar SOA tenets.
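As a hedged sketch of the Service Layer idea in Python (the repository, service, and payment domain are invented; a real Service Layer in the [Fowler 2003] sense would also handle transactions and application concerns):

```python
class AccountRepository:
    """Encapsulated storage detail; callers never touch the dict directly,
    so the backing store can change without breaking them."""
    def __init__(self) -> None:
        self._balances: dict[str, int] = {}

    def get(self, acct: str) -> int:
        return self._balances.get(acct, 0)

    def set(self, acct: str, cents: int) -> None:
        self._balances[acct] = cents

class PaymentService:
    """A coarse-grained service operation that a web UI, a batch job,
    and an external API can all reuse as-is."""
    def __init__(self, repo: AccountRepository) -> None:
        self.repo = repo

    def transfer(self, src: str, dst: str, cents: int) -> None:
        if self.repo.get(src) < cents:
            raise ValueError("insufficient funds")
        self.repo.set(src, self.repo.get(src) - cents)
        self.repo.set(dst, self.repo.get(dst) + cents)
```

Because the business rule (the funds check) lives in one encapsulated operation, every caller reuses it rather than re-implementing it, which is the reusability payoff the paragraph refers to.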

(P) System Performance Factor. System Performance often degrades System Agility, and for this reason I've asserted that System Performance subtracts from the previously noted system quality attribute factors. Bummer. I wish we could have our cake and eat it too, but the reality is that when optimizing for System Agility, System Performance is a Tradeoff Point. For example, the fastest systems often consolidate large amounts of logic, data, and processing instructions onto a single platform to reduce network latency, the packaging and unpackaging of inter-process information, data access, etc. I don't want to ignore performance-enhancing techniques such as caching and patterns such as Data Transfer Object [Fowler 2003]. These are great ways to lessen the impact that System Performance can place on System Agility. It's just that they add a bit of complexity to the rest of the system, can degrade System Maintainability and System Testability, for example, and still don't, from a purist's perspective, optimize System Performance.

(S) System Security Factor. Like System Performance, System Security often degrades System Agility, and for this reason I've asserted that System Security has a negative relationship to the equation. For example, adding software characteristics such as role-based security into a system at the data layer, application layer, host layer, and network layer significantly degrades System Performance, System Testability, and System Maintainability. I'm not at all suggesting that a Solution Architect should read this as a suggestion to put little System Security design into their system. The trick, as a colleague once told me, is to have just enough security.
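A hypothetical Python sketch of role-based security at the application layer (the decorator, role name, and guarded function are invented; a real system would enforce checks at several layers, which multiplies the cost):

```python
from functools import wraps

def requires_role(role: str):
    """Guard a function with a role check. Every guarded function now
    needs role fixtures in its tests and a caller that supplies roles,
    which is the testability and maintainability cost the text describes."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(user_roles: set[str], *args, **kwargs):
            if role not in user_roles:
                raise PermissionError(f"requires role '{role}'")
            return fn(*args, **kwargs)
        return wrapper
    return decorator

@requires_role("auditor")
def read_ledger() -> str:
    return "ledger contents"
```

Even this one-layer check changes every call site and every test of `read_ledger`; "just enough security" is about deciding how many such layers the system genuinely needs.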

System Quality Attribute Definitions

Agility

Agility is the ability of a system to be both flexible and undergo change rapidly. (MIT ESD 2001)

Flexibility

Flexibility is the ease with which a system or component can be modified for use in applications or environments other than those for which it was specifically designed. (Barbacci 1995)

Maintainability

Maintainability is:

· The aptitude of a system to undergo repair and evolution. (Barbacci 2003)

· (1) The ease with which a software system or component can be modified to correct faults, improve performance or other attributes, or adapt to a changed environment. (2) The ease with which a hardware system or component can be retained in, or restored to, a state in which it can perform its required functions. (IEEE Std. 610.12)

Interoperability

Interoperability is the ability of two or more systems or components to exchange information and to use the information that has been exchanged. (IEEE 1990)

Testability

Testability is the degree to which a system or component facilitates the establishment of test criteria and the performance of tests to determine whether those criteria have been met (IEEE 1990).

Reusability

Reusability is the degree to which a software module or other work product can be used in more than one computing program or software system. (IEEE 1990).

This typically takes the form of reusing software that is an encapsulated unit of functionality.

Performance

Performance is the responsiveness of the system – the time required to respond to stimuli (events) or the number of events processed in some interval of time. Performance qualities are often expressed by the number of transactions per unit time or by the amount of time it takes to complete a transaction with the system. (Bass 1998)

Security

Security is a measure of the system’s ability to resist unauthorized attempts at usage and denial of service while still providing its services to legitimate users. Security is categorized in terms of the types of threats that might be made to the system. (Bass 1998)

Reliability

Reliability is the ability of the system to keep operating over time. Reliability is usually measured by mean time to failure. (Bass 1998)

Supportability

Supportability is the ease with which a software system is operationally maintained.

Scalability

Scalability is the ability to maintain or improve performance while system demand increases.

Usability

Usability is:

· The measure of a user’s ability to utilize a system effectively. (Clements 2002)

· The ease with which a user can learn to operate, prepare inputs for, and interpret outputs of a system or component. (IEEE Std. 610.12)

· A measure of how well users can take advantage of some system functionality. Usability is different from utility, a measure of whether that functionality does what is needed. (Barbacci 2003)

Sources

· [Bachmann 2000] Bachmann, F.; Bass, L.; Chastek, G.; Donohoe, P. & Peruzzi, F. The Architecture Based Design Method (CMU/SEI-2000-TR-001 ADA375851). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, 2000. Available WWW: <URL: http://www.sei.cmu.edu/publications/documents/00.reports/00tr001.html>.

· [Barbacci 1995] Barbacci, M.; Klein, M.; Longstaff, T.; Weinstock, C. Quality Attributes (CMU/SEI-95-TR-021, ESC-TR-95-021). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, 1995.

· [Barbacci 2003] Barbacci, M. Software Quality Attributes and Architecture Tradeoffs. Software Engineering Institute, Carnegie Mellon University. Pittsburgh, PA.

· [Bass 1998] Bass, L.; Clements, P.; & Kazman, R. Software Architecture in Practice. Reading, MA; Addison-Wesley.

· [Bass Kazman 1999] Bass, L.; Clements, P.; & Kazman, R. Architecture-Based Development.

· [Fowler 2003] Martin Fowler. Patterns of Enterprise Application Architecture, Boston, MA. Addison-Wesley.

· [Gamma 1995] Gamma, E.; Helm, R.; Johnson, R.; & Vlissides, J. Design Patterns: Elements of Reusable Object-Oriented Software. Addison-Wesley, 1995.

· [Hohpe Woolf 2004] Gregor Hohpe and Bobby Woolf. Enterprise Integration Patterns. Boston, MA. Addison-Wesley.

· [IEEE 1990] Institute of Electrical and Electronics Engineers. IEEE Standard Computer Dictionary: A Compilation of IEEE Standard Computer Glossaries. New York, NY.

· [IEEE 1992] IEEE Std 1061-1992: IEEE Standard for a Software Quality Metrics Methodology. Los Alamitos, CA: IEEE Computer Society Press.

· [Kazman 2000] Kazman, R.; Klein, M. & Clements, P. ATAM: Method for Architecture Evaluation CMU/SEI-2000-TR-004 ADA382629. Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University. Available WWW: <URL: http://www.sei.cmu.edu/publications/documents/00.reports/00tr004.html>

· [MIT ESD 2001] Tom Allen, Don McGowan, Joel Moses, Chris Magee, Dan Hastings, Fred Moavenzadeh, Seth Lloyd, Debbie Nightingale, John Little, Dan Roos, Dan Whitney. ESD Terms and Definitions (Version 12); Massachusetts Institute of Technology. Engineering Systems Division. ESD-WP-2002-01, October.
