Common sense software engineering – Part IV: Life cycles and Agile
In Part IV of his ongoing series on common sense software engineering, blogger Steve Naidamast shares some wisdom he’s accumulated regarding agile methodologies. Newsflash for the young guns out there: what you’re doing isn’t new or revolutionary.
In the current vernacular we no longer speak about development life-cycles but instead about Agile and its variants. Yet if one were to review the documents promoting Agile and the components that make up its framework, one would find that Agile is nothing more than a variant on existing life-cycles as defined by software engineering practitioners. Steve McConnell of Construx Software is aware of this, since he wrote the book on standardised software engineering practices. And though he proposes the use of Agile techniques, his interpretation of these techniques is well founded upon long-standing software engineering principles.
All a life-cycle represents is a way to get from the start of a project to a successful conclusion. It is a fairly straightforward concept. Yet today, when reading anything about current variations on such techniques, we are provided instead with a wealth of arcane terminology that really doesn’t mean much except to those who are using it. Sprints, stand-up meetings, Scrum and the rest appear to hide the simplicity of Agile’s foundations instead of allowing newcomers to easily understand its potential.
Life-cycles are also not to be taken as hard and fast rules of development as there are a number of standardised models that can be applied as the development situation warrants. Agile is just one among many such life-cycles but it appears increasingly that its promoters believe that it is more or less a panacea for all software development related issues. And while it can certainly help in improving development efforts it can only do so under the right circumstances since, like any life-cycle model, Agile works best for those projects that it was designed for.
For example, if reducing the risk of project failure is a very high priority, something that Agile is not designed to incorporate, then one may consider the “Spiral” life-cycle model, which can be illustrated with the graphic below:
When would risk management be a priority in a software development project? When life and death are critical concerns for a development team. Thus, for example, such a model would be considered, and has been used, by development efforts in the aviation and aerospace industries.
The Spiral model primarily comprises incremental phases of the Waterfall model and is characterised by the following attributes:
- Concurrent rather than sequential determination of artefacts, such as design documents, data models, workflow diagrams, test matrices and plans, setup scripts and source code modules. In this way, the Spiral model does anticipate some of what current Agile techniques describe.
- Consideration in each spiral cycle of the main spiral elements: critical-stakeholder objectives and constraints, product and process alternatives, risk identification and resolution, stakeholder review, commitment to proceed
- Using risk considerations to determine the level of effort to be devoted to each activity within each spiral cycle
- Using risk considerations to determine the degree of detail of each artefact produced in each spiral cycle
- Use of the anchor-point milestones to manage stakeholder commitment: Life Cycle Objectives (LCO), Life Cycle Architecture (LCA), Initial Operational Capability (IOC)
- Emphasis on activities and artefacts for system and life cycle rather than for software and initial development
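The risk-driven structure of these cycle elements can be sketched in code. The following is a minimal illustration only, not an implementation of the Spiral model itself; the function, the 0-to-1 risk scale and all names are hypothetical:

```python
# A minimal sketch of the Spiral model's risk-driven cycle structure.
# All names, the risk scale and the threshold are hypothetical illustrations.

def run_spiral(cycles, risk_threshold=0.7):
    """Walk a sequence of spiral cycles; risk drives effort and detail."""
    history = []
    for cycle in cycles:
        # Risk identification: take the worst identified risk for this cycle.
        risk = max(cycle["risks"].values()) if cycle["risks"] else 0.0
        # Risk considerations determine the level of effort and the degree
        # of detail of the artefacts produced in this cycle.
        effort = "high" if risk >= risk_threshold else "normal"
        detail = "formal artefacts" if risk >= risk_threshold else "light artefacts"
        # Each cycle ends with a stakeholder review and a commitment to proceed.
        if not cycle["stakeholders_commit"]:
            history.append((cycle["name"], effort, detail, "halted"))
            break
        history.append((cycle["name"], effort, detail, "proceed"))
    return history
```

The point of the sketch is only that risk, re-evaluated in every cycle, is the input that sizes everything else, which is precisely what distinguishes the Spiral model from a fixed sequence of phases.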
Again, the Spiral model as shown is for situations where the risk of project failure must be the overriding priority, since project failure at its worst could mean the deaths of innocents and an impact on the organisation that could lead to its bankruptcy. No one would fly a Boeing aircraft, for instance, if such aircraft had a habit of falling out of the sky on a regular basis as a result of poor risk management in hardware and software component development.
The mention of the Waterfall model here must have raised the ire of some readers already, but where risk management is less of a concern yet a project requires long-term, in-depth planning, it would be the appropriate method to use. For example, projects that have industry-standardised practices and processes as their primary criteria, such as accounting systems, could make use of a standardised Waterfall model quite successfully, as has already been done.
For example, though everyone in the accounting industry already knows what goes into the development of a good General Ledger product, the actual requirements and planning stages are far more important than the development phase, since such complexities must be ironed out beforehand so that the proper set of components can be effectively designed and coded. As such requirements tend to be so complex, using a life-cycle model that implements small, incremental developments probably would not make much sense.
Despite popular opinion, and the massive marketing and public relations that support it, the Agile life-cycle model is not the appropriate choice for every set of development criteria. Where it tends to be appropriate is in organisations that have low project risk, low requirements for in-depth planning and design, and a need to move development into production quickly. Most such projects would also have low levels of specialised, unique complexity, and the areas that would be deemed complex would be based on known development capabilities such as database access.
Good examples of where Agile could be successfully implemented would be projects that reside at the departmental level of development, since most teams at this level have an inherent knowledge of the needs of their departments, whereby actual in-depth planning and design can be minimised.
What business changes?
This again has probably stirred the ire of believers, but think about it. How would you develop a complex application with high risk potential using modularised sprint processes that base design upon ongoing requirements modification to a larger end-product in an iterative process of evolutionary development? It simply cannot be done effectively, and this is one of the reasons why, in recent years, Agile techniques have been called into question at the enterprise level as a result of their seeming inability to scale beyond their original implementation definition.
A counter-argument to such a contention would be that the nature of business is changing in the 21st century. However, such an argument is utter nonsense. It doesn’t matter how large or small a business is, certain fundamental processes and criteria will always be in play. In addition, no one has adequately defined exactly how business is in fact changing. It is true that, due to the recent integration of technology into daily life, businesses are speeding up their internal production processes, but the underlying requirements for those operations are still basically the same as when substantial business concerns first began to develop in the 19th century.
The speed-up in business processes does not mean that the basic requirements for such operations have somehow changed. And speeding them up has in many cases caused undue harm while bringing little benefit to many of the respective customer bases.
Let’s take a good example of this: the daily use of debit and credit cards for most of our purchasing transactions. Until the last few years, when presenting your debit or credit card to a cashier, they would in turn produce a paper document with your card number for signing to provide consent for the transaction. Very little of this information transfer was online or processed in such a manner. And there was really very little reason for this process to change, since it did quite an adequate job of reducing the attack surface for the average consumer. Debit/credit card information was primarily limited to the people who handled the transaction, the documentation later being sent to the financial and retail institutions that provided and accepted these consumer cards. And people at the merchant would rarely abuse the process, since they would be the first to be suspected of any wrongdoing.
Applying digital technology to everything turns it into an arms race
Today, such critical personal information is flying all over the place on WiFi, merchant card readers, and the like, making the attack surface much larger and causing far greater personal injury than in the past, since capable technologists can literally break into or through such mediums and capture large caches of data to be decrypted, as has often been done with bank ATMs. The same mindset that moved such daily transactions into the digital realm for profit purposes (ease-of-use is a non-starter when you think about it) has also taken severe hold within the companies that store such data, where security breaches are increasingly becoming a daily occurrence; the successful ones are the result of poorly implemented security policies, ignored probably due to concerns over those very profits.
It has been known for many years, and is still accepted as an axiom by security specialists, that sending data over the air-waves is very precarious and subject to sniffing and subsequent theft. It has also been proven many times over that competent technicians can gain access to just about anything sent over the public air-waves. If our strongest security systems in the intelligence and military organisations can be hacked, it is far easier to do the same with profit-oriented business concerns.
This is not to say there wasn’t such abuse with the older technique but it was not anywhere near as rampant. One would have to conclude here that due diligence towards risk prevention with the newer technologies was not adequately considered.
Instead, organisations of all types rushed to take advantage of emerging technologies, and in doing so entered the same realm of weapons development that military analysts must contend with, while moving away from the more secure tenets of logistical supply-line security, which is paramount in the military sciences.
In weapons development cycles, the developer of a system always knows that a weapon under development will eventually be superseded by variations in the designs of its opponents. The same is true with the modification of business processes that increasingly rely on technology, which is often extraordinarily fragile. An electronic process designed by one entity will foster designs by another to break it, which is exactly what happened with debit and credit card transaction processes.
One cannot hack paper, making the original method far safer than current ones.
This tangent was provided as a way to gauge the reality of the promotion of changing business requirements, which are increasingly coming from those who only see profit increases through the reduction of other processes by making them electronic.
To say it again, business requirements are not changing in the 21st century but the people who are running them are. They have become greedier, more arrogant, more demanding and far less competent as many recent examples of such personnel have constantly demonstrated. Professional Managers who have no vested interest in the development of a business rarely have the best interests of that concern as a priority; and this has been proven conclusively in many studies on the subject. Thus it is unlikely that many of these managers will make protecting such organisations and their data priorities in their planning no matter how much has been written deeming it necessary that they do so.
Much of this change has fallen upon the software development community in these same business organisations. The sociological constraints within hyper-capitalistic societies prevented that community from taking a united step back to analyse such changes; instead it colluded with business operators in order to satisfy incoherent demands based solely upon trumped-up perceptions of technological speed as some new wave of business operations unique to the 21st century. In the end, the only thing that has changed in the new century is a massive increase in stupidity.
Enter the Agile Life-Cycle Model: A new sociological meme?
Enter the Agile life-cycle model as an attempt to accommodate this sociological change in business environments. Businesses wanted to be faster so software developers tried to find ways to be similarly faster not taking into consideration that a human being can only go so fast when involved in scientific/artistic pursuits and yield quality outcomes.
Thus, an increasingly popularised meme in business environments today is that one way to develop software deliverables faster is to include all stakeholders in project development, as if this were something new that had never been considered in software development previously. One only has to research the amount of documentation developed over the years by software engineering analysts describing the best and most useful ways to include all project stakeholders in the development process.
And interestingly enough, a good bit of popular Agile philosophy is targeted at supporting this new meme by promoting increased interaction among the various groups involved in software development projects over the concept of “process being the priority”. What this double-talk exactly means is anyone’s guess, at least to those who still consider themselves rationally thinking individuals. No matter how you slice it, software development is first and foremost built around processes, while human interaction between the various groups involved is most often a vital part of the planning and design phases of any such project. However, you cannot have ongoing planning that constantly impedes the processes of software development. Everyone who has experienced this knows exactly where it leads: feature creep, which has often been categorically shown to be the bane of any software development effort.
The idea that any team of software developers can develop a quality deliverable when they have already set themselves up for the destructive process of feature creep is literally irrational, which unfortunately is becoming more a part of daily life these days. And yet, this is exactly how the Agile life-cycle is defined in many organisations. However, that is not what it actually is. Walk into most medium or small organisations today and one quickly finds that Agile is a term that few have ever reconciled with its realities.
The Agile life-cycle is very much based upon the concept of short, iterative steps that produce small parts of a large project while at the same time providing working implementations that may or may not be put into production at the conclusion of such steps. This is the most coherent part of the Agile framework, and it has been demonstrated that such a development process can be quite successful given the right circumstances.
Forget the rest of the nonsense about better inter-group relations and the like, because this is simply not going to happen except as an exception to the rule. First off, most general business managers are incompetent or they would be doing something else; at least in a society that emphasises mathematics, the sciences, and technology disciplines such as engineering, which the US no longer does substantially.
In American society specifically, this is not happening right now, since so many businesses successfully fostered the notion of the service economy back in the late 1990s and early 2000s. This tenet meant that we could eliminate all our basic disciplines and send them off to the hinterlands in other countries, while we back in the US exalted ourselves with advanced, intellectual contrivances. It was steadfastly believed that we had the Internet, so who needed qualified American workers in the sciences and technology sectors when communications with such people thousands of miles away could be had much less expensively? Of course, after doing this for so many years while also making a mockery of original US capabilities, these very same businesses are now screaming that talent shortages abound and that we have to import more qualified workers when so many of us are already right here. Go figure, human incompetence.
Given this sociological environment, I wouldn’t count on too many successful group interactions, as someone, somewhere is sure to screw it up for any project team. Concentrate on the process as you should be doing. You will get much further by doing so than by having too many cooks in the kitchen. It’s a nice idea, but as one television character said a long time ago, “The committee is the most evil thing Humanity has ever developed.” It’s a good idea to keep that in mind the next time someone is promoting the interaction of groups over the process.
In a word, it is complete nonsense.
Agile as nothing more than iterative/incremental development
Though Agile was as much a reaction to changing sociological constructs in the business world as anything else, in reality it is far from new. The modern incarnation of the Agile concept can be attributed directly to the failure of the Extreme Programming paradigm, which rose and fell during the early 2000s like a lead balloon. Eliminating planning and design processes while incorporating the idea of pair programming even failed the original project it was conjured up in, the Chrysler C3 Payroll System, which collapsed in the late 1990s (a good example of what happens when the wrong life-cycle model is applied to a project). And whoever came up with the idea that two people could act as one while producing more at a faster rate, while incorporating minimised code reviews at the same time, should have a net thrown around him (or her) and be taken off to a loony bin far, far away from planet Earth.
Though Agile rose out of the ashes of Extreme Programming, it is hardly a new concept. In fact, it was actually proposed as far back as 1970 when Dr. Winston Royce presented a paper entitled “Managing the Development of Large Software Systems”, which in essence became the foundation for modern-day Agile software engineering techniques.
However, the Agile framework that everyone is touting today has become largely overwhelmed with so much philosophical, marketing buzz that it sounds more like an ideology or a cult than an actual, credible life-cycle model as originally envisioned.
If you clear away all the nonsense that now surrounds Agile you will find a relatively clear view of what it really is all about and how simple it really is. And to do this we simply have to look at the graphics below, which demonstrate exactly what Winston Royce was proposing.
To begin with, Dr. Winston Royce first described what we have come to know as the Waterfall life-cycle, which was actually conceived in the early automobile industry’s adoption of the assembly line. And as many of us who have used this life-cycle concept in the past know, it is vulnerable to serious and costly failures, though such failures were also a result of how the approach was applied. If you research this much-maligned model, it does have strong points that can be used within certain types of projects, such as those noted previously. In fact, it is the basis for the Agile life-cycle model being promoted today, as shown in the subsequent graphic where Dr. Royce proposed Agile’s foundations as an alternative.
The original Agile proposal, as can be seen below, is extraordinarily simple in its definition and to this day basically comprises what Agile promoters are promising as a panacea to all development woes.
As can be seen in the graphic above, the Agile proposal is nothing more than what has become known in software engineering practice as iterative/incremental development.
As the following excerpt from the Wikipedia page for this life-cycle describes, this model provides for basically what Agile is supposed to be and what in fact at its core it actually is.
“The basic idea behind this method is to develop a system through repeated cycles (iterative) and in smaller portions at a time (incremental), allowing software developers to take advantage of what was learned during development of earlier parts or versions of the system. Learning comes from both the development and use of the system, where possible key steps in the process start with a simple implementation of a subset of the software requirements and iteratively enhance the evolving versions until the full system is implemented. At each iteration, design modifications are made and new functional capabilities are added.
The procedure itself consists of the initialisation step, the iteration step, and the Project Control List. The initialisation step creates a base version of the system. The goal for this initial implementation is to create a product to which the user can react. It should offer a sampling of the key aspects of the problem and provide a solution that is simple enough to understand and implement easily. To guide the iteration process, a project control list is created that contains a record of all tasks that need to be performed. It includes such items as new features to be implemented and areas of redesign of the existing solution. The control list is constantly being revised as a result of the analysis phase.
The iteration involves the redesign and implementation of a task from the project control list, and the analysis of the current version of the system. The goal for the design and implementation of any iteration is to be simple, straightforward, and modular, supporting redesign at that stage or as a task added to the project control list. The level of design detail is not dictated by the iterative approach. In a light-weight iterative project the code may represent the major source of documentation of the system; however, in a critical iterative project a formal Software Design Document may be used. The analysis of an iteration is based upon user feedback, and the program analysis facilities available. It involves analysis of the structure, modularity, usability, reliability, efficiency, & achievement of goals. The project control list is modified in light of the analysis results.”
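The procedure the excerpt describes, an initialisation step that builds a base version, a project control list, and iterations that implement one task at a time while analysis feeds new tasks back into the list, can be sketched as follows. This is a toy illustration under assumed names; the "analysis" step is simulated by a simple naming convention:

```python
# A minimal sketch of iterative/incremental development: an initialisation
# step, a project control list, and an iteration step whose analysis revises
# the list. All names and the "(rough)"/"(redesign)" convention are hypothetical.

def initialise():
    """Create a base version of the system the user can react to."""
    return {"features": ["core skeleton"]}

def iterate(system, control_list):
    """Implement one task from the control list, then analyse the result."""
    task = control_list.pop(0)
    system["features"].append(task)
    # Analysis (here simulated): a rough implementation generates a
    # redesign task, which is added back onto the project control list.
    if task.endswith("(rough)"):
        control_list.append(task.replace("(rough)", "(redesign)"))
    return system

# The project control list records all tasks still to be performed.
control_list = ["login (rough)", "reports"]
system = initialise()
while control_list:
    system = iterate(system, control_list)
```

Each pass through the loop is one iteration: the system grows by one increment, and the analysis of that increment decides what the list contains next, which is the whole mechanism the excerpt attributes to the model.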
Despite the hoopla, this is what Agile is all about. And if implemented properly it will be as successful as any other well-implemented life-cycle model. It should also be noted that iterative development does not entirely do away with our much-maligned friend, the Waterfall approach, which surprisingly is the basic foundation, in some variant form, of all other approaches. In fact, iterative and incremental development are essential parts of the modified waterfall models, the Rational Unified Process, Extreme Programming and generally the various agile software development frameworks, all of which rely on the original Waterfall model to some degree. The reason it is found in such models is that all the Waterfall approach is, is a set of definitive, sequential steps whereby each step is completed before moving on to the next. Thus, this approach is miniaturised in order to accommodate the iterative phases of such development processes, as these phases in and of themselves all require definitive, sequential steps that must be completed before moving on to the next stage.
Nonetheless, the approach of iterative development has also been expanded into variants over the years, which can be found in the following life-cycle models.
- Evolutionary Prototyping: Evolutionary prototyping uses multiple iterations of requirements gathering and analysis, design and prototype development. After each iteration, the result is analysed by the customer. Their response creates the next level of requirements and defines the next iteration.
- Staged Delivery: Although the early phases cover the deliverables of the pure waterfall, the design is broken into deliverable stages for detailed design, coding, testing and deployment.
- Evolutionary Delivery: Evolutionary delivery straddles evolutionary prototyping and staged delivery.
As has been shown, the Agile life-cycle model is an evolutionary outgrowth of other comparable models and cannot stand on its own as something new that can be used across the spectrum of software development criteria. It is this effort to mould the Agile model into some form of silver bullet for every situation that explains why software development continues to fail to deliver credible results. No life-cycle can accomplish such a feat, since each has been designed for a different set of requirements belonging to uniquely different types of projects.
Again, it should be noted that Agile has had problems scaling up to accommodate large projects in the enterprise. However, this inability has little to do with the foundations of the Agile methodology but instead with what professionals have attempted to eliminate from it.
Even within a highly iterative development process, requirements analysis, planning, and design processes are ultimately required in order to make each iterative phase work successfully and meld properly into the whole. Agile can do this as long as these initial phases are considered as necessary to the degree that any large scale project requires.
Recently, it was announced that Microsoft follows an Agile style of methodology across its various divisions, as if to again promote the idea that Agile can handle any level of software development. Yet if one were to read the article, there is nothing describing how each product team that has supposedly adopted Agile is using it. We can, however, make some educated guesses.
The article emphasises specific product teams that have moved to the Agile development style, such as “Team Foundation Server”. This product, like many others in the Microsoft repertoire, is a very solid tool that defines and handles processes to manage a team’s source control requirements. The product, even in its re-written versions, has been around for quite a while and is the result of in-depth planning and design processes; meaning that Agile did not build the original product (especially considering that major parts of it came from Borland’s StarTeam software). However, Agile is perfectly suited, if properly implemented, to maintaining and refining the product’s base code. And this holds true for many of the other product areas where Agile is being adopted or has already been implemented, since an iterative life-cycle model works quite well with the ongoing maintenance and refinement of existing core application bases.
Adding a refinement to such an application, which already has known attributes for its design tenets, requires little in the way of upfront design except for the individual refinement under consideration. And there is little actual planning required if, one, the refinement has already been agreed upon and, two, it has passed muster with the required analysis of the application to ensure that it fits well within the overall scheme of the design. So having morning stand-up meetings and then quickly moving on to a development sprint should work quite well.
However, try adopting this same technique to the development of a new operating system. Not so easy.
In closing, software professionals should not delude themselves into believing that, because many of them may be from a younger generation, everything they are experiencing or even creating is somehow brand new. It isn’t. No matter the era in computing, quality software is still created using well-tested paradigms that have been around for many years; many from before you were even born (e.g. hypertext’s underlying conceptual foundations were laid out in the 1940s).
Nothing in the Information Technology field suddenly popped up overnight. Much of what you are working with today has been done and refined in many ways for many years. Many of the new software techniques being promoted as the new in-thing are merely rehashes of existing paradigms that have been used in software development projects over the years. Agile as a life-cycle model is no different; it has just been given a lot of new bells and whistles, many of which are entirely unnecessary.
It is also not important which life-cycle model is chosen to develop a new project, whether built from the ground up or as a refinement to an existing application, as long as the choosing is done wisely. For large complex projects, use proven models that have been found to work with large-scale development, even if it means the project may require the Waterfall approach due to its requirements for in-depth planning and design. The same goes for small projects, where an over-reliance on upfront planning and design stages will probably hinder the results.
As an example of the efficacy of different life-cycle models, please review the chart below, which shows a number of different software engineering models whose primary attributes have been researched over the course of many years, involving statistical analysis of how well each attribute serves each specific model. As will be noted, each life-cycle model has its advantages and disadvantages. And yes, the chart is from a 1996 publication. Guess what? That publication is “Still the One!”
Organisations that proudly promote that they are Agile are promoting themselves beyond the realm of rationality, since one size in this case cannot and does not fit all; unless, of course, the only projects being developed are ones that fit the Agile methodology. It is, in reality, supposed to be the other way around.
A life-cycle model selection is very much a result of a careful analysis of what an individual project actually requires to bring it to a successful completion. There is no getting around this and those that are attempting to do so are seeing the undesired results.
This article first appeared on Tech Notes, Black Falcon Software’s technical articles for .NET Development.