11. July 2014 · 2 comments · Categories: CMMI

Continuing from the earlier article Easily ignored certain key points in CMMI implementation, this article says more about misconceptions in CMMI implementation. Misusing the model results in unproductive or unnecessary processes, wasted resources, loss of credibility for upcoming process improvement initiatives, and finally frustration and dissatisfaction. These outcomes can be avoided through proper senior management attention and a clear understanding of the standard/model. The common misconceptions observed during CMMI implementation are explained below.

1. Treating CMMI as a fast answer to short-term problems

We all want to reach Maturity Level 'N' quickly, but adopting the CMMI will not suddenly double productivity. In fact, during the initial period of implementation the improvements will be negligible; productivity may even dip temporarily. Once the improved practices are incorporated into the current development process and sustained, however, visible improvements are observed with CMMI adoption.

2. Treating CMMI as a standard rather than as a process improvement guide

Days and nights of hard work result in the successful completion of a CMMI appraisal. What next? Parties and get-togethers. We hear people say 'let us relax now', but sadly the relaxation period often extends until the next SCAMPI A. Companies are frequently seen to decline rather than improve after an appraisal: everyone relaxes, the next audit is far away, and people drift away from the processes that were created with so much hard work. This happens because improvement effort was invested only in the projects in the appraisal's focus, rather than in meaningful, long-lasting, organization-wide improvements. Senior management must pay special attention to the value of process improvement rather than the value of the rating.

3. PA implementation team focusing merely on specific PAs

Work is commonly divided among the CMMI implementation team by Process Area (PA), and these mini-teams tend to do exactly that: focus on specific PAs. They ignore the relationships with other PAs, which they should not. They define processes without any connection to other processes, written as if a person or team would perform one process in total isolation from the rest of the organization. Another problem is that these PA-focused teams either ignore the sub-practices or implement them literally, using CMMI language itself in procedures and policies and building process audit checklists directly from CMMI sub-practices and terms. When these are later deployed in the organization, people and customers find it hard to tie them to their actual work. PA-focused teams also tend to ignore the institutionalization aspects of process improvement; the CMMI provides explicit guidance for institutionalization through its Generic Practices.

4. Treating Requirements Management (REQM) the same as Requirements Development (RD)

Requirements Management, an ML 2 PA, focuses on establishing and maintaining an agreement between the customer and the project team on the requirements. It also covers documenting requirements changes and maintaining bidirectional traceability between source requirements, all product and product component requirements, and other specified work products. However, this PA does not deal with requirements development: the tasks of eliciting requirements, analysing them, documenting them, and so forth are handled by RD.
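The bidirectional traceability REQM calls for can start as something very small. The sketch below (all requirement IDs, file names, and the dictionary structure are invented for illustration, not part of any CMMI artifact) keeps forward links from source requirements to work products and derives the reverse links and untraced items automatically.

```python
# Minimal sketch of a bidirectional traceability matrix.
# All IDs and file names below are invented for illustration.
forward = {
    "CUST-1": ["SYS-1", "SYS-2"],          # customer req -> system reqs
    "SYS-1":  ["design.md", "mod_a.py"],   # system req -> work products
    "SYS-2":  ["mod_b.py", "test_b.py"],
}

def backward(links):
    """Derive reverse traceability (work product -> its source requirements)."""
    rev = {}
    for src, targets in links.items():
        for t in targets:
            rev.setdefault(t, []).append(src)
    return rev

def orphans(links, all_items):
    """Items with no incoming trace: candidates for a traceability gap."""
    traced = {t for targets in links.values() for t in targets}
    return sorted(set(all_items) - traced - set(links))

rev = backward(forward)
print(rev["mod_a.py"])                               # -> ['SYS-1']
print(orphans(forward, ["SYS-1", "SYS-2", "SYS-3"])) # -> ['SYS-3']
```

In practice a requirements management tool does this bookkeeping, but the point stands: if forward links are maintained as requirements change, the reverse view and the gap report come for free.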

5. Measuring everything, but not what is required

Any new model implementation starts with high spirits, energy, and good support from senior management. Later, however, management involvement often drops: they say they are getting no benefit from the system; in other words, process improvement is not visible to them. The implementation team cannot answer their questions about return on investment or cost-benefit ratio, and a qualitative answer will not satisfy them. What is needed is a reply in terms of 'useful numbers'. We might be measuring something, but is it what senior management needs? Is it useful to the organization? Most probably we ignored those aspects and simply measured whatever was measurable.

6. Considering tailoring as a way to escape a defined process

People often apply the organizational standard process uniformly to different projects without tailoring; in such cases the organization does not offer projects a choice of process elements with a range of capabilities to select from. In the opposite case, people think that having a 'tailorable' process means one can do whatever one wants. At the Defined level, a standard process is developed for the organization, along with guidelines and criteria for tailoring that standard process for each project. Each project's defined process must be adapted in a structured way from the broader organizational standard process.

7. ML 5 PAs interpreted qualitatively and not quantitatively

It is easy to interpret the Maturity Level 5 Process Areas qualitatively: one may assume that Causal Analysis and Resolution (CAR) is all about brainstorming causes and that Organizational Performance Management (OPM) is about making qualitative improvements. In fact, qualitative improvement is the focus of the ML 3 Organizational Process Focus (OPF) PA. Maturity Level 5 practices are quantitative and must be built upon stable ML 4 practices.

8. Hoping a metrics or statistics wiz can do it all

Managers often assume that a metrics person can do all the Quantitative Project Management (QPM), allowing project managers to focus on their regular day-to-day tasks. But it is primarily the managers' job to run projects quantitatively and manage sub-processes statistically. It is the managers who have to base decisions on the data and predict ahead; the metrics team can certainly support them.

9. Senior managers merely focussing on ordinary project status reviews

Getting full support from senior management is crucial to the success of any new model implementation. Company executives need to be clear on the advantages, requirements, costs, and so on. In the majority of cases, however, senior managers focus merely on ordinary project status reviews rather than involving themselves in process issues, and they often fail to see the value of incremental improvements. The CMMI implementation effort must receive high priority and continuous senior management attention.

10. Assumption: No need for training, our team can handle it

To save short-term costs, employees (the implementation team) are often made to self-study the new model instead of receiving formal training. This can lead to wrong interpretations and unwanted results (exceptions exist). Implementing a change is never a small thing, and it happens only when the team implementing it is trained to handle the change, so investing in appropriate training is always the better move. Managers at all levels, leaders of CMMI efforts, and the implementation team must have a basic understanding of what the CMMI is and how it should be used.

11. Assumption: The QA team can take it forward

A common trend observed over the years is that the project manager does not really take responsibility for process activities in the project; they are left to a QA manager or process team. These managers forget that 'it is team effort that brings in real benefits'. If they are not serious about process improvement, how can they expect the team to be?

12. Assumption: The CMMI mandates wasteful paperwork

The CMMI often speaks of carrying out an activity 'according to a documented procedure'. This does not mean we need a large number of procedures. Processes are defined to ensure repeatable, consistent, and effective output; defined processes are not intended to be barriers to productivity. Rather than dogmatically applying everything in the CMMI to every project, apply the activities that will add the most value to each project.

13. Appraisal Preparation: Cooking up evidence

As the appraisal date approaches, people get busy creating evidence and waste a lot of time chasing down documents. If practices are institutionalized properly, the evidence needed already exists. Evidence should never be created to please an appraiser; the artifacts examined must be the real work of the organization.

14. Appraisal Interview Preparation: Conducting mock interviews

In many cases, some mastermind interviews the appraisal participants/project team to equip them for the real appraisal interview. In fact, these mock interview sessions spoil the genuine responses of the team members who actually practice the processes, and time spent rehearsing for an appraisal takes the team away from real work. Participants should be able to answer the questions because the answers describe how they do their jobs.

15. Thinking that CMMI Institute certifies an organization.

The CMMI Institute does not issue any certification as part of a CMMI appraisal; it maintains a database of SCAMPI A appraisal results. An organization can receive a certificate issued by the transition partner organization with which it is associated, not by the CMMI Institute.

04. July 2014 · 2 comments · Categories: CMMI

Proper interpretation and understanding of a model is a vital part of any new model implementation in an organisation. In the course of implementation, we tend to oversimplify things and thus ignore certain key aspects, and in some cases misconceptions arise. This article tries to clarify such ignored aspects; the next article will say more about misconceptions in CMMI implementation. There may be some overlap between the two articles.

The aspects below are easily skipped, not because they are too tough to implement, but because they are too simple.

  • GPs – Ignored
    • The Generic Practices (GPs) represent the most important part of the model, and poor implementation of GPs will always lead to process failure. Usually we try to map all Specific Practices (SPs) of each PA and simply ignore the GPs. In some cases GPs are mapped automatically through SP implementation itself, but not in all cases. The GPs are the most valuable aspect of the CMMI model, as they address institutionalising the process.
  • CM – Oversimplification
    • The Process Area (PA), Configuration Management (CM), appears clearly
30. June 2014 · 2 comments · Categories: CMMI

CMMI assessment is the process of evaluating compliance with, and measuring the effectiveness of, the Specific Practices (SPs) and Generic Practices (GPs) of the Process Areas (PAs) in the CMMI framework. Every assessment starts with scope definition and finally leads to assessment and sustenance. These common procedures are well explained in Stages in an assessment/certification process.

In addition to these common procedures, there are certain specific points to be ensured if an organization is seeking compliance with the CMMI model. These are detailed below.



The organization needs two separate teams: a Functional Area Representative (FAR) team and an assessment team. Their members are selected from, and by, the organization.

The FAR team is further subdivided into multiple sub-teams based on

27. June 2014 · Write a comment · Categories: TL 9000

TL 9000 is a quality management standard set up by the QuEST Forum in 1998 for the telecom industry. TL 9000 is based on ISO 9001 with industry-specific adders: it adds telecom-specific hardware (H), software (S), and service (V) requirements to the more generic practices of ISO 9001. The QuEST Forum administers the TL 9000 certification process (www.questforum.org). The forum was founded by a number of telecommunication companies, e.g. Bell Atlantic, BellSouth, Pacific Bell, and Southwestern Bell. QuEST stands for 'Quality Excellence for Suppliers of Telecommunications Leadership'.

TL 9000 is defined by two documents:

  • TL 9000 Quality Management System Requirements Handbook
  • TL 9000 Quality Management System Measurements Handbook

In addition to the common procedures explained in Stages in an assessment/certification process, there are certain specific points to be ensured if an organization is seeking compliance with this standard. They are explained below.

1. Registration Options

The process is the same as other registrations (e.g. ISO 9001) except that the scope of registration includes reference to product categories chosen and adders (H, S, V) involved. The registration options available are Hardware (H), Software (S), and Services (V). An organization can choose all or any combination of the options that apply to it. The registration option determines which of the TL 9000 specific requirements apply to the registration. Once an organization finalizes the registration scope and options, it starts the process of mapping its products to the TL 9000 product categories. TL 9000 has over a hundred product categories covering hardware, software and services.

2. Setting the company up in Quest Forum Portal

The TL 9000 Administrator needs to be contacted to initiate setting up the company in the QuEST Forum Portal. Any person authorized by the organization can do this through the TL 9000 administrator. Once access is gained, a Registration ID Profile is created and maintained using the given Registration ID.

3. Data Submission

One of the requirements of TL 9000 registration is the reporting of measurement data specific to that registration. The Registration Management System (RMS) facilitates online submission of this data on a monthly basis through its web-based interface in the QuEST Forum Portal. Based on the product categories selected, decide which measurements should be collected, as per the latest version of the Measurements Handbook, and start uploading them to the QuEST website against the organization. At least three iterations of measurement data are required before the final TL 9000 assessment can start.

4. Assessment Process

Once the process of determining, collecting, and analysing measurement data has been formalized and initiated, and the quality management system has been updated to reflect all TL 9000 requirements, the company is ready for the TL 9000 audit/assessment. The assessor/auditor needs to review the data confirmation reports received from UTD (the University of Texas at Dallas), which maintains the RMS. If the assessment is successful, a TL 9000 certificate is issued, the information is sent to QuEST, and TL 9000 registration follows. The information is available within the QuEST Forum Portal (which includes access to the RMS) on the QuEST Forum website.

Organizations often go for certifications/assessments such as ISO 9001, ISO 27001, or CMMI. A company may decide to seek certification for many reasons, as certification can:

  • Meet Customer Requirements
  • Result in more revenue and business from new customers
  • Improve Company and Product Quality

The assessment process is a continuous cycle, with stages/steps leading to certification and sustenance. For organizations that are new to the implementation process, attaining certification can be a somewhat troublesome activity. This article helps make the implementation stress-free through the ten points explained below.

1. Determine scope of registration

Determine whether the entire organization or only a part of it is going for certification. Sometimes only a particular product in the organization seeks certification.


2. Get quotes from accredited third-party certifying bodies

The certifying bodies must be accredited to conduct audits. After evaluating several certification bodies (transition partners, in the case of CMMI) based on their quotes and other factors, the organization selects the best-suited body. Once the quote is accepted by both parties, client and certification body, an auditor contacts the client to schedule the assessment audits. It is vital to check for hidden costs such as registration and travel fees when obtaining quotes.

3. Study of standard/model requirements

The first step in any certification/assessment process is to gain a clear understanding of the standard/model. If people are not comfortable with the new standard, the first step could be training on it from industry experts. If required, the organization can opt for external consultancy to help shape the implementation strategy; a good consultant can increase the value of the process.


4. Gap Analysis

Evaluate how far the present management system or product is from compliance with the new standard. Gap analysis, pre-assessment, internal audits, etc. can be used for this evaluation. For more details, please refer to Performing gap analysis. The gap analysis documentation provides the input to the subsequent phases.

5. Establish an implementation plan

An implementation team, work division, milestones, etc. need to be set up, and training has to be provided to the implementation team. Implementing the new management system needs to be an organization-wide goal driven by senior management. ('Organization' here refers to the entire organization, a part of it, or a project team, as per the defined scope.)


6. Ensure the implementation as per plan

The steps include preparation and review of procedures, manuals, and other supporting documents; training of the affected parties on the new/changed system; and deployment of the new/changed system.


7. Practice and live with the new system

During this period, observe and evaluate the new/changed system for effectiveness. Audits need to be conducted to evaluate the changed system, and auditors must be trained to conduct them. Existing loopholes, inefficiencies, etc. are corrected and corrective actions deployed, leading to continuous improvement of the system. After a few months, the new system and the organization should be ready for the registration audit.


8. Third-party Assessment/Certification

The number of auditors needed and the time involved in a registration audit vary with the size and complexity of the organization. Pre-assessments/Stage 1 audits are conducted before the final assessment. During the pre-assessment, the auditor reviews the existing systems and provides a report identifying further actions required to meet the standard's requirements. Once the organization is ready and has fixed the gaps reported in the pre-assessment, the auditor performs the registration/final audit in accordance with the audit plan. Upon completion, the auditor issues an audit report identifying any non-conformances. The client resolves these non-conformances; once the auditor approves their closure, the organization (or client) is recommended for certification. The auditor's report is then verified via an approval process and, if no anomalies are identified, certification is officially granted. The auditor then works with the client to set up subsequent surveillance audits/health checks to ensure continued adherence to the standard.


9. Sustaining the standard/model

Attaining a certification is not a one-time job; sustaining it is equally important, so proper attention must be paid to ensure the certification level does not degrade. To realize the improvement benefits of the new/changed system, the organization must be committed to maintaining and amending the system over time to best suit its requirements. The hard work really starts with maintaining the new/changed system, and continued buy-in from everyone is important for the implementation to succeed and for the organization to obtain the true advantages of being certified. Proper training needs to be carried out regularly to ensure ongoing awareness, and internal audits must be conducted to ensure compliance with the requirements of the standard/model.


10. Get Buy-In

Getting full support from management and employees is crucial to the success of any certification/assessment program. Company executives need to be clear on the advantages, requirements, costs, and so on. It is also important that employees are confident in the new system.

13. June 2014 · Write a comment · Categories: CMMI

Some natural synergies exist between the generic practices and their related process areas, as explained in Evidences supporting implementation of CMMI GPs.

Here the recursive relationships between generic practices and their closely related process areas are explained.

For more information on the evidence required for each generic practice, please refer to Evidences supporting Implementation of CMMI GPs.

13. June 2014 · Write a comment · Categories: CMMI

Many process areas address institutionalization by supporting the implementation of the generic practices. An example is the Project Planning process area and GP 2.2: to implement this generic practice we need to implement the Project Planning process area, in whole or in part. The table below shows the related process areas (which support the GPs) as well as the required artefacts (which can serve as evidence of GP implementation).

In addition to this normal GP-PA relationship, there are some recursive relationships between generic practices and their closely related process areas. This is explained in How does the Generic Practice recursively apply to its related Process Area(s)?

11. June 2014 · 1 comment · Categories: CMMI

CMMI originated at the SEI (Software Engineering Institute), sponsored by the US Department of Defense. The SEI later transferred CMMI-related products and activities to the CMMI Institute, a 100%-controlled subsidiary of Carnegie Innovations, Carnegie Mellon University's technology commercialization enterprise.

The picture below illustrates the evolution of CMMI.

Evolution of CMMI 1

  • When the era of computerized information systems began in the 1960s, there was significant demand for software development. Even though the software industry was growing rapidly, many software development processes were amateurish and project failure was common.


Sub-processes are components of a larger defined process. For example, a typical development process may be defined in terms of sub-processes such as requirements development, design, build, review, and test. The sub-processes themselves may be further decomposed into other sub-processes and process elements. Measurable parameters are defined for these sub-processes to analyse their performance, and the sub-processes are studied to identify the critical ones that influence the Process Performance Objectives (PPOs). Measurable objectives are set for the critical sub-process measures as well. PPOs are derived from Business Objectives (BOs).

In the above paragraph, a linkage is established from sub-processes up to BOs. In fact, in an organization

03. June 2014 · 1 comment · Categories: CMMI, Statistics

Baselines are derived statistically from performance data collected over a period of time; they indicate the current performance of an organization. Hence proper attention must be paid while deriving baselines, as an error can even cause the loss of business. There are some critical but common mistakes observed in the baselining process, explained below, and steps must be taken to avoid them.

Pitfall #1: Inappropriate parameter for baselining.

The organization must plan and define measures that are tangible indicators of process performance. Baselining does not simply mean gathering and baselining the entire set of data available in the organization. Based on the business objectives, the critical processes whose performance needs to be analyzed are selected; then process parameters for monitoring them are defined, data is collected, and finally baselining is done. There is no harm in collecting and baselining every parameter defined in the organization, but why waste time collecting data that will not be used?

Pitfall #2: Non-chronological data.

For baselining with control charts, it is essential that the data be chronological. Hence the timestamp of each data point must be noted during data collection itself.

Pitfall #3: Too few data points.

In the software industry we often hear complaints from the baselining team about a deficiency of data points, and when the question is put to the project team, the reply is 'we just don't have time' or 'it is too difficult'. To derive baselines there must be a minimum number of data points, say 10 or so; only then can all four stability rules be applied to the data. Baselines built with 8 or fewer data points will not indicate the correct performance level of the process under investigation. Where the number of data points is insufficient, baselining needs to be postponed, or the organization can plan to collect more samples by increasing the frequency of data collection.
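The minimum-data-point check is easy to make explicit in whatever script derives the limits. The sketch below (the threshold of 10 points comes from the text above, not from any standard; the sample values are invented, and mean ± 3 sigma is a simplification of the moving-range limits a real individuals chart would use) refuses to baseline sparse data.

```python
import statistics

MIN_POINTS = 10  # threshold taken from the text; organizations set their own

def baseline(data):
    """Compute simple mean +/- 3-sigma control limits (a simplification;
    a real individuals chart derives limits from the moving range)."""
    if len(data) < MIN_POINTS:
        raise ValueError(
            f"only {len(data)} points; postpone baselining or collect more data")
    mean = statistics.mean(data)
    sigma = statistics.stdev(data)
    return {"CL": mean, "UCL": mean + 3 * sigma, "LCL": mean - 3 * sigma}

# Invented effort-variance data, in chronological order
effort_variance = [4.1, 5.0, 3.8, 4.6, 5.2, 4.4, 4.9, 4.0, 4.7, 5.1]
limits = baseline(effort_variance)
print(limits["CL"])   # centre line = sample mean = 4.58
```

Calling `baseline` with fewer than ten points raises an error instead of silently producing limits that do not reflect the real process performance.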

Pitfall #4: Being inconsistent.

While collecting as well as baselining data, one must use consistent methods and processes. What is measured in the post-baseline data must be the same as what was measured during baseline data collection.

Pitfall #5: Taking non-homogeneous data.

Data taken for baselining needs to be homogeneous; otherwise the baselining output will not give a correct indication of process performance. Instead of clubbing everything together and losing homogeneity, the data can be categorized based on qualitative parameters such as type of project, complexity of the work, nature of development, and programming language.
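Segmenting the data before baselining can be as simple as grouping records by their qualitative attributes. In this sketch (the field names and sample values are invented for illustration) productivity samples are split by programming language so that each homogeneous group can be baselined separately.

```python
from collections import defaultdict

# Invented sample records: (programming language, productivity value)
samples = [
    ("java", 12.0), ("python", 18.5), ("java", 11.2),
    ("python", 17.9), ("java", 12.8), ("python", 19.1),
]

# One homogeneous data set per category, preserving collection order
groups = defaultdict(list)
for lang, value in samples:
    groups[lang].append(value)

for lang, values in sorted(groups.items()):
    print(lang, len(values))  # baseline each group separately
```

The same grouping key could combine several attributes (project type, complexity, etc.); the point is that the split happens before any statistics are computed.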

Pitfall #6: Absence of data verification.

It is a common mistake to take data blindly from the organizational database and start the baselining process. Data must be verified for completeness, correctness, and consistency before any statistical processing.

Pitfall #7: Non-representative sample.

Processes that permit self-selection by respondents are not random samples and are often not representative of the target population. To obtain a random, representative sample, it must be ensured that the sampling is truly random and covers the target population.

Pitfall #8: Basing the baseline value on assumptions, not real data.

People tend to believe that collected data follows a normal distribution; sometimes they do not even check normality statistically. In other cases, even after the data is found statistically non-normal, people try to make it normal by removing data points. It is reasonable to remove one or two points out of 15-20 if there are assignable reasons; beyond that, simply removing points to make the distribution look normal is bad practice. It is essential to check the actual distribution of the data before going ahead with baselining, because standard control charts assume an approximately normal data set. One can check the distribution visually using histograms and confirm it statistically using other tools (there are plenty of Excel add-ins to check distributions).
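A first screen for non-normality can be done even without a statistics package. The sketch below (the |skewness| < 1 threshold is a common rule of thumb, not a formal test, and the data sets are invented) computes sample skewness; a proper test such as Anderson-Darling or Shapiro-Wilk should confirm the result before baselining.

```python
import statistics

def skewness(data):
    """Fisher-Pearson sample skewness (uncorrected). A crude screen only;
    confirm with a real normality test before relying on it."""
    n = len(data)
    mean = statistics.mean(data)
    sd = statistics.pstdev(data)
    return sum((x - mean) ** 3 for x in data) / (n * sd ** 3)

def roughly_normal(data, limit=1.0):
    """Rule-of-thumb screen: |skewness| below `limit` (assumed threshold)."""
    return abs(skewness(data)) < limit

symmetric = [4, 5, 6, 5, 4, 6, 5, 5, 4, 6]   # invented, symmetric data
skewed = [1, 1, 1, 1, 2, 2, 3, 9, 14, 20]     # invented, right-skewed data
print(roughly_normal(symmetric))  # True
print(roughly_normal(skewed))     # False
```

A data set that fails even this crude screen should never be forced onto a control chart by deleting inconvenient points.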

Pitfall #9: Ignoring past data when there is no process change.

Suppose an organization baselines yearly. At the start of 2013, baselines were derived using the previous year's (2012) data points. The objective was set to 'maintain the current process performance', with no higher targets, so no improvement initiatives were triggered to raise the performance level. The next year, the 2013 data points were collected for baselining and it was confirmed statistically, say by a two-sample t-test, that the two data sets (2012 and 2013) were equal. Which data set should the organization use for 2014 baselining? It is a common mistake to ignore the 2012 data and baseline with the 2013 data points alone. Since the two data sets are statistically equal, both must be combined in chronological order for baselining.
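The combine-or-discard decision can be sketched with Welch's t statistic. In the example below (the yearly samples are invented, and the critical value of roughly 2.1 for about 18 degrees of freedom at alpha = 0.05 is an approximation; in practice a statistics package would give the exact p-value) equal means lead to the two years being concatenated in chronological order.

```python
import statistics

def welch_t(a, b):
    """Welch's two-sample t statistic. The p-value lookup is omitted here;
    use a statistics package to get it from the t distribution."""
    va, vb = statistics.variance(a), statistics.variance(b)
    return (statistics.mean(a) - statistics.mean(b)) / (va / len(a) + vb / len(b)) ** 0.5

# Invented yearly samples, chronological within each year
y2012 = [4.2, 4.8, 4.5, 4.9, 4.4, 4.6, 4.7, 4.3, 4.5, 4.6]
y2013 = [4.4, 4.7, 4.6, 4.8, 4.3, 4.5, 4.6, 4.4, 4.7, 4.5]

t = welch_t(y2012, y2013)
# ~2.1 approximates the critical value for ~18 df at alpha = 0.05
if abs(t) < 2.1:
    combined = y2012 + y2013   # statistically equal: baseline both, in order
else:
    combined = y2013           # process shifted: baseline recent data only
print(len(combined))
```

Here the two years have identical means, so all twenty points feed the 2014 baseline rather than the 2013 half alone.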

Pitfall #10: Blindly taking the significance level as 0.05.

The null hypothesis is rejected if the p-value is less than the significance level. In industry, the significance level is usually taken as 0.05, but this is an arbitrary convention: the higher the significance level, the greater the risk of rejecting a null hypothesis that was actually true. (Refer to the blog hypothesis test for more details on p-values.) It is up to the organization to decide on the significance level.

Pitfall #11: Removing out-of-turn points when there are no assignable causes.

Out-of-turn points cannot be removed if there are no assignable reasons behind them. If there is no reason for an out-of-turn point, it implies that the process is not stable and one cannot go ahead with baselining.

Pitfall #12: Placing unfeasible values as control limits

Sometimes the control limits derived statistically during baselining are unworkable. For example, a review-effectiveness baseline (in %) cannot have an upper control limit (UCL) of 120%, even though it is statistically correct; similarly, a coding-speed baseline cannot have a lower control limit (LCL) of -15 lines of code/hr. Such values are unusable, so the organization needs a policy to handle these situations. For example, it can use the 25th and 75th percentiles of the stable data as control limits, or it can replace the LCL/UCL with the minimum/maximum permissible value of the parameter: in the above examples, change the LCL of coding speed to zero instead of a negative value, and the UCL of review effectiveness to 100%.
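The replace-with-permissible-value policy described above can be encoded in a few lines. This sketch uses the two examples from the text (review effectiveness bounded at 0-100%, coding speed bounded below at zero); the function name and signature are illustrative, not from any tool.

```python
def clamp_limits(lcl, ucl, feasible_min, feasible_max):
    """Replace infeasible statistical control limits with the parameter's
    permissible minimum/maximum, per the organization's policy."""
    return max(lcl, feasible_min), min(ucl, feasible_max)

# Review effectiveness is a percentage, so limits must stay within 0..100
print(clamp_limits(35.0, 120.0, 0.0, 100.0))         # -> (35.0, 100.0)
# Coding speed cannot be negative
print(clamp_limits(-15.0, 60.0, 0.0, float("inf")))  # -> (0.0, 60.0)
```

The alternative percentile policy would compute the 25th and 75th percentiles of the stable data instead; either way, the chosen rule should be written into the organization's measurement policy so it is applied consistently.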

Pitfall #13: Stating the baseline without contextual information

Stating the context enables a consistent understanding of the measurement result. Contextual information is the additional data describing the environment in which a process is executed; it includes the timestamp, context, measurement units, and so on.

Pitfall #14: Inappropriate communication mode.

Computer software nowadays supports a wide range of graphs, and people try to use them all at once, ultimately hiding or complicating the real message. One must select the right graph to communicate the processed data: run charts, pie charts, control charts, and bar charts are all good means of communication, but the best fit must be chosen.

Pitfall #15: Not beginning with the end in mind.

Determine in advance how the processed data is going to be used. This helps in making good choices about what data to collect (never waste time collecting data that will not be used) and which tools to use. One must also plan to measure everything needed to calculate the effect of the change; it is usually too late to go back and correct things if something was left out.