
CMMI Principles and Values

By David J. Anderson. David J. Anderson is the author of two books, “Agile Management for Software Engineering: Applying the Theory of Constraints for Business Results” [1], published in 2003, and “Kanban: Successful Evolutionary Change for Your Technology Business” [2], published in 2010. He was a member of the team that created the agile software development method Feature-Driven Development in Singapore between 1997 and 1999. He is one of the authors of the agile project management principles defined in the 2005 Declaration of Interdependence. David created MSF for CMMI Process Improvement and co-authored the Software Engineering Institute technical note “CMMI or Agile: Why Not Embrace Both!” [3] He is Vice President of the Lean Software & Systems Consortium, http://www.leanssc.org, and leads an international management training and consulting firm, David J. Anderson & Associates Inc., http://www.agilemanagement.net, which helps technology businesses improve their performance through better management policies and decision making.

January 2012

Anderson describes how looking at organizations through a CMMI lens provides valuable insights for managers, process engineers, and all external stakeholders, including customers, investors, governance bodies, and auditors.



Introduction

The original Capability Maturity Model (CMM) was first published in 1991. It evolved into the Capability Maturity Model Integration (CMMI) a decade later. CMMI is now a family of three constellations (as they are known), and this article specifically refers to the CMMI for Development constellation (CMMI-DEV). The CMM was developed to help the United States Department of Defense better understand the costly failures in software projects within large-scale government procurement programs. An assessment based on the CMM was used to determine “fitness” for government contracting. Later, this evolved into a defined appraisal scheme based on CMMI.

The concept of organizational maturity remains controversial. How, for example, can the maturity of an organization be assessed? And does an organization truly have behavior that is distinct from the behavior and actions of its individuals? The concept that an organization can be appraised at a particular maturity level, and that this is an indicator of its capability to deliver reliable work to the government, is a matter of ongoing debate. However, I remain a believer in and supporter of the CMMI, and I believe that looking at organizations through a CMMI lens provides valuable insights for managers, process engineers, and all external stakeholders, including customers, investors, governance bodies, and auditors.

The Meaning of Organizational Maturity

Maturity in CMMI is intended to reflect an approach to, and ability in, assessing and managing risk, and the judgment used when making decisions. The term “maturity” is actually used in the sense of its common usage with respect to individuals. For example, insurers can tell us that 18-year-old males are more likely to have an automobile accident than 55-year-old women. The 18-year-old male is likely to exhibit poorer judgment when making decisions regarding the handling of a vehicle and is likely to have insufficient experience to adequately assess the risks involved in a particular course of action. This can result in accidents that a 55-year-old woman might avoid. Insurance firms are particularly good at assessing risks because they collect statistical data and correlate evidence.

One issue with CMMI is the perceived pejorative nature of the term “maturity.” It is always assumed that more maturity is better than less, and that greater maturity should always be pursued. If we think of this in individual terms, it does not always make sense. An insurance firm might price car insurance more cheaply for the more mature; however, an open-wheel racing team is likely to value youthful exuberance and risk taking. For the racing team, car crashes are part of the business; in fact, a driver who never once crashed a vehicle would be fired.

The message here is that required levels of maturity should match the circumstances and context. More is not necessarily better, but being able to correctly identify and assess organizational maturity helps in assessing whether a business is capable of managing risk and exercising appropriate judgment when making project and product decisions.

There should be a strong correlation between level of maturity and likelihood of achieving or delivering a desired outcome. A high-maturity organization should have a very strong chance of delivering an outcome that is close to the desired outcome. This includes having the maturity and capability to assess possible, likely, and achievable outcomes and to set goals accordingly. A lower-maturity organization is less likely to set achievable goals and is less likely to deliver a desirable outcome against an expectation. The flip side of this coin is that a high-maturity organization may become risk averse and only ever set easily achievable goals, while an exuberant lower-maturity organization may achieve exceptional performance through a combination of luck and hard work.

Inspiration for the CMMI Model

The original CMM was developed by Watts Humphrey and first appeared (without the CMM name) in his book Managing the Software Process [4]. It was inspired by the 20th-century manufacturing quality assurance movement and the work of Joseph Juran, W. Edwards Deming, and Philip Crosby. The term “maturity model” and the five levels within it were inspired by Crosby’s Quality Management Maturity Grid. However, the CMM must be seen as a true synthesis of ideas. The term “capability” is almost certainly inspired by Deming.

Deming used the term “capability” with a very specific meaning that runs much deeper than the common understanding of the word. More accurately, a “capability” might be considered the “natural philosophy” of a system, or of an operation within a system. Deming encouraged managers to “study capability” and to analyze it statistically. He developed his System of Profound Knowledge [5], which is intended for use as a decision framework to help managers make appropriate system design interventions. Deming was a true systems thinker. In this case, the word “system” implies a process performed by people; there is no intent to imply a technology or any automation.

Deming believed, and had compiled evidence suggesting, that as much as 95% of the performance of a system comes from the design of the system and not from the capabilities of the individuals working within the system. In other words, to create improvement there must be a focus on changing the process, the system within which the people work, rather than on improving individual performance. As a result, he did not believe in targets, management by objectives, motivational posters, or punishing individuals for poor performance.
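Deming’s advice to “study capability” statistically has a concrete flavor that a short sketch can illustrate. The following is a hypothetical example, not something from the article: it computes the natural process limits of an XmR (individuals and moving range) control chart over invented weekly lead-time figures. Points outside the limits indicate special-cause variation worth investigating individually; variation inside the limits is common-cause and, in Deming’s view, can only be reduced by redesigning the system itself.

```python
# Hypothetical illustration of studying process capability statistically,
# in the spirit of Deming. The lead-time figures are invented.
lead_times = [12, 14, 11, 13, 15, 12, 13, 30, 12, 14, 13, 12]  # days per work item

mean = sum(lead_times) / len(lead_times)
moving_ranges = [abs(b - a) for a, b in zip(lead_times, lead_times[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

# Natural process limits for an XmR chart: mean +/- 2.66 * average moving range.
upper = mean + 2.66 * avg_mr
lower = max(0.0, mean - 2.66 * avg_mr)

# Anything outside the limits is special-cause variation: investigate the event
# itself. Everything inside is common-cause: change the system, not the people.
special_causes = [(i, x) for i, x in enumerate(lead_times) if not lower <= x <= upper]
print(f"process limits: {lower:.1f} to {upper:.1f} days")
print("special-cause points (index, value):", special_causes)
```

With these invented figures, only the 30-day outlier falls outside the limits; the rest is the ordinary variation of the system as designed, which no amount of exhortation of individuals would remove.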

So Deming had provided a capability model with his System of Profound Knowledge, and Crosby had provided a maturity model. Humphrey sought to synthesize these concepts and apply them to the field of software engineering, and the result was the Capability Maturity Model.

Given the focus on assessment and the pursuit of maturity levels to qualify for government contracts, the capability modeling concepts and the influence of Deming largely disappeared from much of the literature on the CMM and CMMI. However, Deming’s influence can be clearly seen in the process areas defined in the higher maturity levels.

CMMI was always intended to be a framework to encourage the emergence of a culture of continuous improvement within an organization, and it was always intended to be a systems thinking approach. There are very evident parallels to Lean in the roots of CMMI.

CMMI is a Model

The CMMI is a model for understanding organizational maturity and capability. It is not a standard, nor is it a software development or project management process definition. The generic practices described in this article refer to process capability, not to any given project or product under development. For example, where planning is referred to in the following table, it means planning for the process implementation, not for any project or product delivery.

The CMMI model consists of 22 process areas, plus three generic goals that all organizations are expected to pursue.

The 3 generic goals, with their generic practices and the purpose of each, are as follows:

GG 1 – The process supports and enables achievement of the specific goals of the process area by transforming identifiable input work products to produce identifiable output work products.

GP 1.1 – Perform the specific practices of the process to develop work products and provide services to achieve the specific goals of the process area.
Purpose: The underlying assumption is that predictable outcomes come from following a process.

GG 2 – The process is institutionalized as a managed process.

GP 2.1 – Establish and maintain an organizational policy for planning and performing the process.
Purpose: Management supports the creation, maintenance, and usage of a process. There is an identifiable management policy or memo stating that a process should be followed to ensure predictable outcomes.

GP 2.2 – Establish and maintain the plan for performing the process.
Purpose: There is a plan for ensuring that new activities adopt the process and follow it.

GP 2.3 – Provide adequate resources for performing the process, developing the work products, and providing the services of the process.
Purpose: Management truly supports the following of a process and sets projects up for success by resourcing them appropriately.

GP 2.4 – Assign responsibility and authority for performing the process, developing the work products, and providing the services of the process.
Purpose: Senior management delegates power and sets roles and responsibilities on the project to ensure that the process is followed and the anticipated work product is produced.

GP 2.5 – Train the people performing or supporting the process as needed.
Purpose: A training program is in place to ensure that project staff are adequately skilled to perform the tasks asked of them and to deliver the desired process area capability.

GP 2.6 – Place selected work products of the process under appropriate levels of control.
Purpose: There is configuration management and document management for all essential artifacts in the creation of the work product, e.g., requirements management and tracking, source code version control, and environment configuration control.

GP 2.7 – Identify and involve the relevant stakeholders as planned.
Purpose: All necessary stakeholders are involved, and foreseeable risks are identified as a result.

GP 2.8 – Monitor and control the process against the plan for performing the process and take appropriate corrective action.
Purpose: This is related to GP 2.2 and asks that the plan for following the process be monitored to show that the plan was carried out. For example, if the plan requires a process engineer to meet with the project manager to tailor the process definition, did that meeting happen?

GP 2.9 – Objectively evaluate adherence of the process against its process description, standards, and procedures, and address noncompliance.
Purpose: Monitor that the process is being followed rather than circumvented or ignored. Consider modifying the defined process if it doesn’t match operational realities.

GP 2.10 – Review the activities, status, and results of the process with higher-level management and resolve issues.
Purpose: Maintain senior management involvement and support. Conduct a form of operations review with senior management and compare capability with expectations and requirements. Consider whether resourcing and training are adequate, and take actions to resolve process definition or process area capability problems as necessary.

GG 3 – The process is institutionalized as a defined process.

GP 3.1 – Establish and maintain the description of a defined process.
Purpose: In order for the process to be repeatable and followed as intended, there needs to be a written description.

GP 3.2 – Collect work products, measures, measurement results, and improvement information derived from planning and performing the process to support the future use and improvement of the organization’s processes and process assets.
Purpose: Manage the appropriateness of the process through quantitative means, and evolve it as necessary to meet needs as they unfold.

The 22 process areas are arranged into 4 categories: Engineering, Project Management, Process Management, and Support. Each process area consists of one to three specific goals plus the three generic goals. For each goal, a number of practices are commonly expected in order for that goal to be realized, and within a practice there may be suggested subpractices. The CMMI requires or prescribes only the goals. The practices defined within the goals of the CMMI model are expected but not mandatory; if a practice is not present, it must be replaced by an equivalent alternative practice. The following table shows the grouping of process areas by organizational focus:

Engineering: Requirements Development; Technical Solution; Product Integration; Verification; Validation

Project Management: Project Planning; Project Monitoring & Control; Integrated Project Management; Supplier Agreement Management; Requirements Management; Risk Management; Quantitative Project Management

Process Management: Organizational Process Focus; Organizational Process Definition; Organizational Training; Organizational Process Performance; Organizational Performance Management

Support: Configuration Management; Process and Product Quality Assurance; Measurement and Analysis; Decision Analysis and Resolution; Causal Analysis and Resolution

So the principle is simple: if an organization can exhibit the capability to achieve the goals within a process area, then it can be said to have capability in that particular process area.

The process areas are also grouped into maturity levels, which provide a shorthand method for describing maturity. Although the grouping of process areas into levels remains controversial on several levels, my observation of organizations over the last two decades is that the current version 1.3 of the model (considering the CMMI is effectively version 2 of the CMM) is broadly correct. Low-maturity, chaotic organizations tend to develop capability at process areas defined in maturity level 2 before they develop capability in process areas defined at higher levels.

The following table shows the grouping of process areas into levels.

Maturity level 5: CAR – Causal Analysis & Resolution; OPM – Organizational Performance Management

Maturity level 4: OPP – Organizational Process Performance; QPM – Quantitative Project Management

Maturity level 3: RD – Requirements Development; TS – Technical Solution; PI – Product Integration; VER – Verification; VAL – Validation; IPM – Integrated Project Management; RSKM – Risk Management; OPF – Organizational Process Focus; OPD – Organizational Process Definition; OT – Organizational Training; DAR – Decision Analysis & Resolution; PPQA – Process & Product Quality Assurance

Maturity level 2: CM – Configuration Management; MA – Measurement & Analysis; SAM – Supplier Agreement Management; PP – Project Planning; PMC – Project Monitoring & Control; REQM – Requirements Management

Maturity level 1: There are no process areas within model level 1. Level 1 represents an undefined process with no introspection and no capability to define a process or repeat an outcome through understanding of the process that produced it. Technically, in a CMMI appraisal, an organization that does not meet the goals for the process areas in model level 2 is still model level 1. So organizations with emerging processes will technically still be viewed as model level 1 even though they have matured a long way from undefined chaos.

Understanding CMMI Made Simple

The following table presents an overview, in lay terms, of the subject or purpose of each process area:

CAR – Causal Analysis and Resolution: Investigate the root cause of exceptional process problems (special cause variations, to use W. Edwards Deming’s term), and suggest and implement process changes to prevent a recurrence. Attention is directed to unusual behavior of quantitatively understood, stable processes. Everyday surprises would probably be considered part of Risk Management (RSKM) rather than CAR.

CM – Configuration Management: More than just source code version control, this process area covers all administration relating to system environments, configurations of components, platforms, middleware, applications, and documentation. The ability to successfully build and deploy working code falls within this process area.

DAR – Decision Analysis and Resolution: For all key decisions within a project or product development, show that a set of alternatives or options was considered and that contextual factors were used to assess the suitability of different options. Record the decision and the reasons for the choice.

IPM – Integrated Project Management: This second level of project management within the CMMI model implies that an organization is capable of managing multiple, potentially dependent projects simultaneously. This is often achieved through the use of a program or portfolio management office.

MA – Measurement and Analysis: Collect data on process, project, and product performance. Produce metrics and indicators in the form of reports based on the data.

OPD – Organizational Process Definition: The organization should have one or more process definitions that are defined within a context, and a context will describe a risk profile. Each project can be assessed for its risks, and a process definition can then be selected from the organizational catalog and tailored appropriately.

OPF – Organizational Process Focus: The organization should believe that process definition defines and affects capability and that improving capability is primarily driven through improved processes. Consequently, the organization proactively manages its process definitions and monitors (using the PPQA process area) to ensure that these definitions are followed.

OPM – Organizational Performance Management: This process area encapsulates the concept of a statistical understanding of how well a process delivers against its expected capability. Changes to the process intended to improve capability can be assessed, and the underlying model for the process can be reconsidered if observed results do not reflect those predicted when a change to the process definition was made. The organization manages its performance via its processes in order to meet its business needs.

OPP – Organizational Process Performance: This process area encapsulates the concept of comparison of performance, often referred to as “benchmarking.” OPP creates process models from baseline data to enable comparison. This gives an organization the ability to answer questions such as “Which of our three product teams should we choose for [this specific project]?”

OT – Organizational Training: Individual capability at specific practices is important for process performance and system capability. A well-behaved system with strong performance will have a strong training capability to reduce variability in capability at the local practice level.

PI – Product Integration: The ability to integrate multiple components to form a complete product and to manage the elements required to make that possible.

PMC – Project Monitoring and Control: Gather data about ongoing projects, compare it against plans, projections, and simulations, and take appropriate actions based on the data.

PP – Project Planning: Plan projects based on estimates, simulations, and analysis of requirements.

PPQA – Process and Product Quality Assurance: Primarily a process conformance audit function, intended to demonstrate that the system is operating as designed. It helps management avoid the mistake of changing the process to correct a problem when the current process is actually not being followed as intended.

QPM – Quantitative Project Management: This is the third and highest level of project management within the CMMI model. It implies that statistically sound, quantitative methods are used to plan, monitor, and manage projects.

RD – Requirements Development: There is a defined, repeatable process for soliciting, negotiating, analyzing, and documenting requirements.

REQM – Requirements Management: Requirements are tracked throughout the project lifecycle, and there is, ideally, end-to-end traceability between a delivered configuration and the original requirement request.

RSKM – Risk Management: Although the whole CMMI model can be seen as a framework for managing risk, this process area specifically addresses “event-driven risk,” or the likelihood and impact of special cause variations and everyday surprises. This process area requires risk reduction, mitigation, contingency planning, issue management, and resolution.

SAM – Supplier Agreement Management: The ability to manage external vendors, define agreements, manage the contract, and take delivery of the desired product or service.

TS – Technical Solution: All the skills required with respect to software architecture, design, and coding.

VAL – Validation: Acceptance testing against customer requirements.

VER – Verification: Testing against a design (from Technical Solution). Seeks to ensure that the product being built is as envisioned and designed, and is put together in such a way as to meet the users’ needs and work in their environment.

For each process area, a capability level can be assessed. Four levels of capability are defined in v1.3 of CMMI:

0. Incomplete

1. Performed

2. Managed

3. Defined

If each specific goal is met and some of the generic goals are met, the capability for a specific process area will be appraised as at least level 1 – performed. Capability level 1 implies that team members know what to do and are doing it; however, the specific practice is unlikely to show stability if analyzed statistically. Practices are being followed, but there is still a lack of consistency.

Capability level 2 implies that the team understands how something works and has a level of skill that makes the performance of a practice predictable and likely to exhibit statistical control. It is likely that training or collaborative working practices are in place to produce greater consistency across team members.

Capability level 3 implies a mastery of the skill and an ability to develop new and improved techniques that will achieve the goal. Level 3 capability implies that any statistical analysis would require re-baselining after each explicit change in the underlying technique or practice.

So the CMMI model is basically stating that new and immature organizations will, at first, develop capabilities in practices for managing configurations and source control, gathering data about projects and the work they are undertaking, planning projects, tracking requirements, monitoring project progress, and taking actions based on comparing actual data against a plan. This is the essence of maturity level 2.

As maturity level 2 capabilities fall into place, the organization and its people turn their attention to other concerns, and capabilities begin to emerge in defining requirements, testing, architecture and design, integration, and defining processes so that they can be repeated. As things stabilize further, an understanding of how culture and management style affect performance emerges and, hopefully, a comprehension that a systems-thinking approach is required to deliver further performance improvements. With things becoming more stable and day-to-day issues such as planning and project monitoring becoming second nature, there is time to consider risk management and to analyze alternatives and options before making decisions. Coordination of multiple dependent projects and improved governance of shared resources may emerge. Perhaps a training program, a mentoring scheme, a master-apprentice tutoring scheme, or simply ritualized forms of collaborative working will emerge to improve capability and raise the overall level of performance. If necessary, an internal audit or process quality assurance function might emerge. All of this is the essence of maturity level 3.

When an organization runs at a solid maturity level 3, things run like clockwork. The organization delivers on its promises and is seen as very reliable and dependable. A high level of trust emerges in relationships with customers. Senior managers start to ask questions such as “Where should I invest for further improvement?” and “Which team produces the best economic performance?” Managers start to develop more advanced insights into capability and performance and realize that they can use simulation and statistical analysis to improve product quality, customer delivery, and satisfaction. Management decisions are now entirely objective and defensible with statistical data. This is the essence of a maturity level 4 organization. For most senior managers, maturity level 4 represents the ideal state: everything runs like clockwork, they have comparative performance data, and they are able to deliver against promises with a high level of accuracy. Economic performance is greatly improved, and the organization’s performance is highly predictable.
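As a sketch of what “simulation and statistical analysis” can mean in practice (a hypothetical illustration, not part of the CMMI model), the following Monte Carlo forecast resamples invented historical weekly throughput to answer the kind of question a maturity level 4 organization can answer defensibly: how many weeks will a 40-item backlog take, at 50% and 85% confidence?

```python
# Hypothetical illustration: Monte Carlo delivery forecast from historical
# throughput data. All numbers are invented for the example.
import random

random.seed(7)  # fixed seed so the sketch is repeatable
weekly_throughput = [3, 5, 4, 6, 2, 5, 4, 3, 6, 5]  # items completed per week
backlog = 40  # items remaining to deliver

def weeks_to_finish(backlog, samples):
    """Simulate one possible future by resampling historical throughput."""
    done, weeks = 0, 0
    while done < backlog:
        done += random.choice(samples)
        weeks += 1
    return weeks

trials = sorted(weeks_to_finish(backlog, weekly_throughput) for _ in range(10_000))
p50 = trials[len(trials) // 2]         # median outcome
p85 = trials[int(len(trials) * 0.85)]  # 85th-percentile outcome
print(f"50% confidence: {p50} weeks; 85% confidence: {p85} weeks")
```

The answer is a distribution of outcomes rather than a single date, which is what makes commitments derived from it statistically defensible.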

Maturity level 5 behaviors often emerge long before an organization actually reaches level 5. Root cause analysis is often seen in maturity level 3 organizations. What makes it a level 5 capability is whether the root cause analysis is done using quantitative data and is statistically defensible. The emergence of a formalized process for process innovation and deployment of improvements may also occur before the organization could truly be considered to be maturity level 5. At level 5, process improvement has been institutionalized and inculcated into the culture of the organization. The culture is one of always challenging the status quo and looking for improved capability, improved product quality, and improved economic performance.

CMMI Appraisals

A CMMI maturity rating is established by an appraisal. There is a standard process for performing appraisals: SCAMPI, the Standard CMMI Appraisal Method for Process Improvement. This was introduced to bring some repeatability to the process and some trust in its outcome. The three levels of rigor in appraisals are referred to as Class A, B, and C, where Class A is the most rigorous. Class A appraisals are required for a model level rating that is acceptable for public record or for United States Department of Defense requirements.

All Class A appraisals and most class B and C appraisals are conducted by CMMI Lead Appraisers, who are authorized by the Software Engineering Institute to conduct appraisals. These consultants have been through a thorough training program before being licensed to practice. Some appraisers have been through additional training and are designated as CMMI High Maturity Lead Appraisers. Organizations that seek a model level 4 or 5 appraisal must work with a High Maturity Lead Appraiser.

Appraisals seek evidence that practices have been conducted to achieve the goals within the process areas of the CMMI. Within an organization running a portfolio of projects and perhaps with several business divisions, a complex formula is used to determine how many projects of which scope must be appraised. The objective of this is to ensure fair coverage of a sample set of projects that demonstrate that the organization has an institutionalized capability within each required process area. The Lead Appraiser will determine the projects to be appraised based on this formula.

Within each project being appraised, evidence must be collected that demonstrates that the practices required to show sufficient capability within the process area have been performed. For each practice, the appraiser looks for hard, tangible evidence, known as artifacts, often found in documentary form such as plans, source code, designs, and architecture documents. In addition, the appraiser looks for affirmations. An affirmation is generally hearsay evidence, such as staff members talking about the conduct of a practice, for example, anecdotes describing attendance at a planning meeting. Affirmations are collected by interviewing staff involved in the projects being appraised. Affirmations may reinforce the documentary evidence or refute it, leading the appraiser to question the validity of the documentation.

CMMI appraisals are not required for the CMMI model to be useful. The CMMI helps software development organizations understand their capability and maturity and compare them against the expectations of their customers and other external stakeholders. Having a rough idea of where an organization maps against the CMMI model provides a way of assessing how it might react under stress and its ability to deliver against expectations. Organizations observed to be performing higher-maturity activities without a solid foundation of lower-maturity behaviors can often be unpredictable. That is, while high-maturity behaviors are present, and this is commendable, these high-maturity practices are not reliable because they are not built on a solid foundation.

CMMI appraisals are often used as a way of validating an organization-wide process improvement initiative. This creates a pressure to “pass the test.” The focus becomes one of demonstrating that each practice within each process area is followed and of showing evidence of such. There can be a loss of focus on what is really important: showing capability against customer expectations and improving that capability through explicit management actions. This focus on “passing the test” has often led to significant side effects and dysfunction in organizations. As a result, the CMMI has developed a strong body of detractors in the industry. This is a pity, as I strongly believe that the CMMI model is valid and that it provides valuable insights for the managers within an organization – insights that should lead to improved capability, improved performance, and improved customer satisfaction.

[1] Anderson, David J., Agile Management for Software Engineering: Applying the Theory of Constraints for Business Results, Prentice Hall PTR, 2003

[2] Anderson, David J., Kanban: Successful Evolutionary Change for Your Technology Business, Blue Hole Press, 2010

[3] Glazer, Hillel, Jeff Dalton, David J. Anderson, Michael D. Konrad, and Sandra Shrum, CMMI or Agile: Why Not Embrace Both!, Software Engineering Institute, November 2008

[4] Humphrey, Watts S., Managing the Software Process, Addison-Wesley Professional, 1989

[5] Deming, W. Edwards, The New Economics for Industry, Government, Education, 2nd Edition, The MIT Press, 2000
