I’ve been having conversations like this a lot recently. I work amongst a lot of IT people who have specialized in developing or deploying custom solutions. They could just as well be developing them (as in: building a software product) as deploying them (as in: deploying, say, Active Directory or a Lync solution). The issues are the same.
Often, the conversations get around to the struggles with how to be more efficient and consistent across the many projects they have to deliver as they learn more and more about the solution implementation details, the technology intricacies and the customer’s needs that guide the solution implementations. Ultimately, by serving multiple customers with the same kinds of solutions, sooner or later supportability, maintainability and consistency of the solution become pretty damn important quality attributes. Never mind that from a business perspective, being able to reproduce proven, supportable and maintainable solutions over and over again consistently is very lucrative for custom delivery margins.
After a while in the game of custom solution delivery, and for whatever motivations (which can be discussed later), the experts start to invest in repeatability and automation and in building custom tools of one sort or another, whether those tools are scripts, batch files, utilities, custom web sites, etc.
Not many get far down that track though, and not many custom tools that are built are maintainable and supportable enough for continuing reuse. Why is that? The payoff in consistency and productivity with automation is indisputable.
The primary reason for the lack of prevalence of maintainable custom automation tools in custom solution development/deployment is that building custom automation is technically difficult because the available automation frameworks are still very rudimentary.
And using them for solution repeatability often leads the solution domain experts off track and down deep rat holes in learning the intricacies and constraints of these automation technologies. More importantly however, most experts just don’t have the time to do that kind of experimentation, and still remain on top of optimizing and evolving their solutions at the same time.
Inevitably, most of the automation does not get built, and the fallback is manual labor, jeopardizing the consistency and efficiency of future implementations. Many of the (expensive) innovations and technology advances made in these implementations are lost over time, simply because only a few people can possibly carry the intellectual know-how, details, experiences and manifestations of them forward in their own heads. The mistakes made, and learned from, in optimizing the solution implementations are bound to be repeated over and over again when the solution is repeated by others. It’s good business for consultancy businesses of course, but it is very stifling for the advancement of the industry, and expensive and frustrating for the customers of the solutions.
Like most scientific disciplines, documentation has become the prevalent medium for communicating learnings and advancing knowledge. Word processing software is readily available to everyone today. But this form of media lacks the ability to describe well enough the detailed context of the solution, the technologies used and the assumptions built upon, for other practitioners to pick up, consume and reapply successfully.
There are all too often far too many technical details to document in a custom solution. A typical starting point is to document a baseline solution: one that is well-known, that can be reproduced consistently and easily, and that is presumed to be the base platform upon which the custom solution is built. Upon that, there are a myriad of details, assumptions, architectural choices, technology selections and custom requirements implemented on the baseline solution, and the fine details of those critically need to be captured in this document too. Discussing how the implementation details could vary based on an unbounded set of different requirements and constraints is extremely tedious and exhausting to capture in written form, and is therefore more often than not omitted. And then there is the detailed guidance, and the instructional steps (complete with screenshots of the current tools), required to reproduce the solution consistently, which also need to be captured.
All it takes to thwart reuse of the critical knowledge and learnings from this kind of documented experience is for the next implementation to have one or more slightly different-enough requirements, technology constraints, or differences (e.g. a different product version), and the engineering team building the new solution will dismiss this entire volume of knowledge and opt to rebuild (and relearn) it from scratch again.
If you have ever tried to build a repository of such intellectual property (IP) provided by solution delivery experts based upon documented assets like this, you will know from experience that consumption and practical reuse of that IP by other practitioners is pathetically low. Why is this?
For the reasons stated above, and:
Without any of these things, the practitioner must have the experience and knowledge of the domain expert to begin with. Today, with a distinct lack of standards and standardization in the custom software and solution industry, it’s no wonder delivery practitioners and solution experts alike favor a ‘rebuild’ over a ‘reuse’ approach when faced with solution descriptions captured in documentation.
It is not that capturing the intricate details of a custom solution implementation for reuse is practically too hard. After all, every detail, and all the learnings and optimizations, are already captured and embodied accurately in the original proven custom solution. It’s just that written documentation, constrained only to text and illustrations, is a very poor form of media for capturing and communicating the details required for consistent repeatability of the solution.
A far better form for capturing this kind of information for repeatability is a templated media form, one that can communicate the following:
This form of templated media could not be complete without detailed knowledge of how the solution would be reproduced, or reconfigured from any configuration of the solution. Without this level of detail, it merely exists as high-level conceptual architectural documentation.
In order to be complete, and to provide the ability to reproduce the solution consistently, correctly and rapidly, this kind of media form needs to be able to also capture the following artifacts:
Finally, this form of media would not be complete without the ability for a practitioner to adapt/tailor the templated form to meet a set of unknown requirements or constraints discovered during their solution delivery.
This templated media form would manifest itself as custom tooling. Custom tooling that can be generated from information captured from a domain expert, with reusable harvested assets and templates, described with a suitable schema, explained with appropriate instructional guidance and driven with efficient automation. This kind of tooling would be packaged as an installable ‘toolkit’ specifically designed to be reused by practitioners to recreate solutions of this type consistently. It could be configured to recreate any number of permutations of the known variability of the solution to implement a set of known requirements and constraints.
Why have you not been able to achieve this before? It is not just that building automation is expensive and time-consuming. Automation technologies are rudimentary and difficult to learn. Tool building with these technologies has become a very specific skill, one that takes time to acquire. Domain experts are too busy solving problems in their domain space, and learning these new skills is not an investment many make. But even if you do, there is another, more insidious problem lurking. How do you maintain it moving forward? What happens when the next rev of the technology/platform you are implementing on comes along, or a new version is released and something changes or is improved? How can this custom tooling be adapted?
Have you ever written a script or program, then gone back to it only weeks or months later to adapt it slightly for another solution, and got frustrated because you forgot how it worked in the first place, or because the assumptions you made then didn’t apply now? And what about picking up someone else’s automation scripts and trying to figure out how they work before embarking on customizing them for your solution? The tools to do all this are just painful to use. They are painful to adapt, because they have no understanding of the solution you are tackling with them, and are not organized in any way meaningful to the solution you are addressing. The automation they provide is not standardized or contextualized to your specific solution or its components. They don’t use the language or vocabulary of your solution. These automation frameworks are far too generic to help you quickly solve problems specific to your solution domain.
Here is an alternative experience, one that you may well see more value investing in. What if you could just pick up that collection of automation scripts for your particular solution and make assertions like: I like this part – I’ll keep it. That part – I want to change it slightly. This part – I don’t want it anymore. And I need to add a few additional parts here that are specific to this implementation of the solution.
What automation framework knows enough about your domain to help you do that quickly and easily?
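The keep/change/remove/add experience described above can be sketched in code. This is a purely illustrative sketch (all step names are hypothetical, not from any real toolkit) of what it might look like if automation were organized as named, solution-specific steps rather than opaque scripts:

```python
# Illustrative sketch: a solution's automation expressed as named steps
# that a practitioner can keep, change, remove, or extend.

base_steps = {
    "provision_database": lambda: print("creating database"),
    "deploy_web_tier":    lambda: print("deploying web tier"),
    "configure_backup":   lambda: print("configuring backup"),
}

# Tailor the baseline for this particular implementation of the solution:
steps = dict(base_steps)                                                      # keep what works
steps["deploy_web_tier"] = lambda: print("deploying web tier to two nodes")   # change slightly
del steps["configure_backup"]                                                 # don't want anymore
steps["register_dns"] = lambda: print("registering DNS")                      # add a new part

# Run the tailored solution automation in order.
for name, run in steps.items():
    run()
```

The point is not the implementation detail, but that the steps carry the vocabulary of the solution, so tailoring becomes a matter of assertions rather than reverse-engineering.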
These are exactly the scenarios and problems that Pattern Automation is now solving today for many domain experts in solution development and deployment.
Instead of writing detailed documentation that tries to describe and establish the context of a specific solution, to define the constraints and rules embodied in a specific instance of it, to explain the specific assumptions, rationales and architectural concepts used in it, and perhaps to identify and document a few cases of variance in the implementation – wouldn’t it be better to be able to create actionable tools that define and embody all of those things, generalized for any implementation of the solution? And to have those tools integrated with and contextualized by the solution being built – automatically?
Sounds good, but how?
The method is actually quite straightforward. You (the domain expert) design and build a custom ‘Pattern Toolkit’ for practitioners. A toolkit that implements a ‘solution design/deployment pattern’.
You first define the solution in terms of its basic architectural concepts, and then define relationships between those concepts in terms of what varies between one instance of the solution and another. You don’t define the whole thing in fine gory detail, just the parts that could be different for each implementation – the variability. You effectively define a language using the vocabulary of the domain. A language that can be configured.
The stuff in the solution that does not vary (the commonality) becomes a constant (a given) – it is assumed in the implementation. It can be applied with fixed templated assets and automated rapidly, consistently and correctly.
The stuff that does vary between implementations (the variability) will require that a practitioner provides a set of configuration for it that defines their choices based upon their requirements and constraints for this instance of the solution. These choices are then transformed or mapped during the solution development to an appropriate implementation by the toolkit using automation.
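The commonality/variability split above can be made concrete with a minimal sketch. All names here (the template, `SolutionConfig`, the auth modes) are hypothetical illustrations, not part of any actual toolkit: the fixed template stands in for the commonality, the configuration dataclass captures the variability, and the transform turns a practitioner's choices into an implementation artifact:

```python
from dataclasses import dataclass
from string import Template

# Commonality: a fixed templated asset, assumed in every implementation.
WEB_CONFIG_TEMPLATE = Template("server=$server_name\nauth_mode=$auth_mode\n")

# Variability: the choices a practitioner must make per solution instance.
@dataclass
class SolutionConfig:
    server_name: str   # varies per customer environment
    auth_mode: str     # hypothetical example: "windows" or "forms"

def generate(config: SolutionConfig) -> str:
    """Transform the practitioner's choices into an implementation artifact."""
    if config.auth_mode not in ("windows", "forms"):
        raise ValueError(f"unsupported auth mode: {config.auth_mode}")
    return WEB_CONFIG_TEMPLATE.substitute(
        server_name=config.server_name,
        auth_mode=config.auth_mode,
    )
```

For instance, `generate(SolutionConfig("sql01", "windows"))` yields a concrete artifact for one permutation of the variability; a real toolkit would do this across many assets at once.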
You would then define a set of instructions to guide a practitioner in how to create the concepts of the solution, how to relate them together, and what configuration choices to make about the variability of the solution for their specific solution instance.
Based on all this information – templates, assets, guidance and automation – a set of standardized custom tools is generated for you. That custom toolkit is packaged into a distributable, installable package that can be given to a practitioner to redeliver the solution. The toolkit is installed into the practitioner’s development/deployment environment. They create a new instance of that solution, build it up guided by the toolkit, and make the choices presented to them (through configuration models, wizards, etc.) about how the solution could be different for them. The toolkit provides contextualized guidance to help them make those choices, and automation to transform those choices into a solution implementation.
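The practitioner-facing part of that flow – contextual guidance attached to each point of variability, with choices validated before automation runs – might look something like this sketch (the guidance text and variability points are invented for illustration):

```python
# Hypothetical sketch: a toolkit's variability model with contextual guidance.

GUIDANCE = {
    "auth_mode": "Choose 'windows' for intranet deployments, "
                 "'forms' for internet-facing sites.",
}

VARIABILITY = {
    "auth_mode": ["windows", "forms"],  # allowed values per variability point
}

def configure(choices: dict) -> dict:
    """Validate a practitioner's choices against the toolkit's variability model."""
    config = {}
    for point, allowed in VARIABILITY.items():
        value = choices.get(point)
        if value not in allowed:
            # Surface the toolkit's own guidance instead of a generic error,
            # so the practitioner learns the solution's vocabulary and rules.
            raise ValueError(f"{point}: {GUIDANCE[point]} (got {value!r})")
        config[point] = value
    return config
```

A generated toolkit would present these same choices through wizards or configuration models, but the principle is identical: the variability, its allowed values, and the guidance all come from the domain expert's definition, not from a generic framework.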
OK, so where do you get tools like this from? What tools build the pattern toolkits?
Watch this space.