How can we optimize the time and effort of a full-time QA person (process consultant) on a project? In the majority of cases, at least 50% of the auditing work done by QA could be automated. How, then, can the QA work be made cost effective?

A traffic policeman controls the traffic on a road. Nowadays we see traffic signals at most junctions, automated systems that regulate traffic according to the situation. Is there really a need for a full-time traffic policeman? Of course, human intervention is required when there is a critical issue the machines cannot handle, or when the machines are down. Other than that, is it not a waste of effort and money to employ a person full time just to regulate traffic? Perhaps at peak hours, at some busy junctions, a person can be posted in addition to the machines. Or, if a full-time person is still required, his time should be used more fruitfully, not limited merely to traffic regulation: helping the highway police, acting as a travel guide for whoever needs one, and so on.

Similarly, how can we optimize the time and effort of a full-time QA person (process consultant) on a project? In the majority of cases, at least 50% of the auditing work done by QA could be automated; what remains is the proper interpretation and analysis of the findings. So, wherever possible, routine work should be automated, and the QA person should focus mostly on preventive measures and risk identification. Prevention is always better than detection, and it can be driven through day-to-day project-level activities, data examination, statistical analysis and so on. Now, what more can be done so that a full-time QA person is cost beneficial to the project and the organization? Or, what do you think about the presence of a full-time QA person on a project?

What could be the role of models/standards in the coming days?

Nowadays many big organizations set up their processes and procedures without formal compliance to any model or standard such as CMMI or ISO. This does not mean they are not adhering to standard practices; without process adherence and quality products they could not be "big organizations" in the first place. In such cases the process may or may not be compliant with those standards, but it stays out of the certification loop. In some instances there is a business requirement for the organization to be certified against, or compliant with, a specific standard. Otherwise, can an organization sustain itself without these standards/models implemented? If the number of organizations that set up their own processes keeps increasing, then, thinking forward, what could be the role of such models/standards in the coming days?

Process composition, process performance models (PPMs) and so on are terminologies from the CMMI High Maturity area. A process needs to be composed quantitatively, considering various alternatives, to achieve the project's performance objective. There needs to be a linkage between the composed process and the PPM; or rather, there need to be alternative methods for the critical subprocesses in the PPM, which in turn means there may need to be a number of PPMs. For example, if customer feedback score is the objective, and the number of acceptance-testing bugs, code review defect density and schedule variance in requirement analysis are the subprocess parameters in the PPM, then to compose the code review or requirement analysis process there need to be various alternatives: code review could be done as a peer review or as an expert review, and similarly for the other subprocesses. Different PPMs would then be built with different alternatives in an organization, unless the data from all the alternatives taken together forms a stable process. So is there a chance to have a single PPM, given that the data pertaining to each alternative would be different?
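To make the composition idea concrete, here is a minimal sketch in Python. All names, baselines and coefficients are assumed purely for illustration: each code review alternative is simulated from its own hypothetical historical distribution and propagated through a hypothetical feedback-score PPM, so the alternative that best supports the objective can be chosen.

```python
# Minimal sketch: composing a process by comparing alternatives through a PPM.
# All distributions and coefficients below are assumed, not real baselines.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical baselines per code review alternative:
# (mean, std dev) of code review defect density, defects per review hour
alternatives = {
    "peer_review":   (0.8, 0.20),
    "expert_review": (0.5, 0.15),
}

def ppm_feedback_score(crdd):
    # Hypothetical PPM: feedback score as a linear function of defect density plus noise
    return 9.0 - 1.5 * crdd + rng.normal(0, 0.3, size=crdd.shape)

target_score = 8.0
for name, (mu, sigma) in alternatives.items():
    crdd = rng.normal(mu, sigma, size=10_000)   # simulate the subprocess
    score = ppm_feedback_score(crdd)            # propagate through the PPM
    print(f"{name}: P(feedback score >= {target_score}) = {(score >= target_score).mean():.2f}")
```

Under these assumed numbers, one alternative gives a clearly higher probability of meeting the objective, so the process would be composed with that alternative; each alternative still needs its own baseline (or its own PPM) for the comparison to be valid.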

PPMs are valid only within the range of the data used for building them; we cannot extrapolate beyond that data for prediction. Yet we use these PPMs to check the probability (confidence) of achieving targets, which means we may be simulating subprocess parameters that fall outside the valid range of the PPM. In that case, how can we expect the PPM to give a convincing result?

Let me illustrate this with an example.

Suppose I have built a productivity PPM in which the Xs are coding speed, expertise index and requirements stability index, and Y is productivity. The data comes from the current performance baselines of the organization (a stable process). In the PPM, the adjusted R-squared, the VIF and the individual correlation factors are all as required. When a simulation was performed, the probability of achieving above the mean of the productivity PPB was observed to be greater than 50%.
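As an illustration of how such a PPM could be built and exercised, here is a minimal sketch in Python on synthetic data (every number and coefficient is assumed, not an organizational baseline): fit the regression, inspect adjusted R-squared and VIF, then run a Monte Carlo simulation to estimate the probability of exceeding the baseline mean productivity.

```python
# Minimal sketch of a productivity PPM on synthetic data; all values are assumed.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(1)
n = 120  # hypothetical number of historical data points

# Synthetic "baseline" data standing in for the organizational PPB
coding_speed  = rng.normal(35, 5, n)       # LOC/hr
expertise     = rng.normal(3.0, 0.6, n)    # index, 1..5
req_stability = rng.normal(0.85, 0.05, n)  # index, 0..1
productivity  = (0.6 * coding_speed + 4.0 * expertise
                 + 10.0 * req_stability + rng.normal(0, 2.0, n))  # LOC/PD

X = pd.DataFrame({"coding_speed": coding_speed,
                  "expertise": expertise,
                  "req_stability": req_stability})
model = sm.OLS(productivity, sm.add_constant(X)).fit()
print("Adjusted R-squared:", round(model.rsquared_adj, 3))
print("VIF:", [round(variance_inflation_factor(sm.add_constant(X).values, i), 2)
               for i in range(1, 4)])

# Monte Carlo: draw the Xs from their baseline distributions and propagate
# them through the fitted PPM, including residual noise
sims = 20_000
Xs = pd.DataFrame({"coding_speed": rng.normal(35, 5, sims),
                   "expertise": rng.normal(3.0, 0.6, sims),
                   "req_stability": rng.normal(0.85, 0.05, sims)})
pred = model.predict(sm.add_constant(Xs)) + rng.normal(0, np.sqrt(model.mse_resid), sims)
print("P(productivity > PPB mean):", round((pred > productivity.mean()).mean(), 2))
```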

The organizational PPO is to improve productivity from 30 LOC/PD to 40 LOC/PD through improvements identified in the coding process, on the premise that an increase in coding speed will lead to an increase in productivity. Suppose the current PPB of coding speed is 25 to 45 LOC/hr and the improved expected range is 35 to 55 LOC/hr (proven through piloting/validation). Now project A wants to use this productivity PPM. Here the subprocess performance data actually lies outside the stable limits, that is, outside the valid range of the PPM. In such a scenario the project team cannot use the previous PPM at all. So what is the use of the organizational PPM in that case?

To summarize: if the PPO is targeted higher than the current PPB and, to achieve it, the organization comes up with improvement initiatives that shift the subprocess limits to an improved range (outside the valid range), then the previously defined PPM definitely cannot be used. In such a scenario the organization should also come up with a recalibrated PPM.
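A simple guard for this situation is to check, before simulating, whether the planned subprocess ranges still lie inside the range of the data the PPM was built on. A minimal sketch using the hypothetical coding-speed ranges from the example above:

```python
# Minimal sketch: detect when planned subprocess ranges fall outside the
# PPM's valid (baseline) range; all numbers are from the hypothetical example.

valid_range   = {"coding_speed": (25, 45)}   # LOC/hr range in the historical PPB
planned_range = {"coding_speed": (35, 55)}   # improved range after the initiative

for param, (lo, hi) in planned_range.items():
    v_lo, v_hi = valid_range[param]
    if lo < v_lo or hi > v_hi:
        print(f"{param}: planned range ({lo}-{hi}) extends beyond the PPM's "
              f"valid range ({v_lo}-{v_hi}); recalibrate the PPM with pilot data "
              f"before using it for prediction.")
```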

For the productivity PPM, code review effectiveness (CRE) is a subprocess parameter. CRE is itself a dependent parameter, and hence a sub-PPM is built with CRE as the predicted (Y) variable. So while using the productivity PPM, the process shall be composed with the CRE PPM first, and then the expected range of CRE, based on the simulated values, shall be fed into the productivity PPM.
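A minimal sketch of this chained composition, with all drivers and coefficients assumed purely for illustration: the CRE sub-PPM is simulated first, and its simulated values are then fed into the productivity PPM.

```python
# Minimal sketch: chaining a CRE sub-PPM into a productivity PPM.
# All drivers, coefficients and baselines below are assumed for illustration.
import numpy as np

rng = np.random.default_rng(7)
sims = 10_000

# Sub-PPM for CRE (hypothetical drivers: reviewer experience and review speed)
reviewer_exp = rng.normal(3.0, 0.5, sims)    # years (assumed baseline)
review_speed = rng.normal(150, 25, sims)     # LOC/hr reviewed (assumed baseline)
cre = 0.4 + 0.08 * reviewer_exp - 0.0005 * review_speed + rng.normal(0, 0.03, sims)

# Productivity PPM (hypothetical): CRE is one of its subprocess parameters
coding_speed = rng.normal(35, 5, sims)       # LOC/hr
productivity = 5.0 + 0.7 * coding_speed + 12.0 * cre + rng.normal(0, 2.0, sims)

print("Simulated CRE range (95%):", round(np.percentile(cre, 2.5), 2),
      "to", round(np.percentile(cre, 97.5), 2))
print("P(productivity > 40 LOC/PD):", round((productivity > 40).mean(), 2))
```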

A PM says to his senior manager, "There is only a 50-50 chance of delivering the project on time." A reply like "You take the risk and go ahead as it is" would be rare, especially for a project where on-time delivery is a critical requirement. Is this probability acceptable to proceed?

Similarly, improvement initiatives triggered by the organization would normally lead to achieving the specification limits (range) of the PPO with a probability of about 99.7% (or roughly greater than 90%, considering data variability in the prediction models); the probability of achieving above (or below) the central value of the PPO, however, would be only about 50%. Now, if the projects target the favourable side (above the mean, where productivity is the PPO), the organizational objectives can ultimately be attained; otherwise the reverse will happen. So in the normal case a project team would target above the central value. This means that the probability of success is only about 50%, even with the organizational improvement initiatives.
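The 50% figure follows from the symmetry of the predicted distribution: if the outcome is roughly normal around the PPO value, exceeding that value has a probability of about 0.5, and higher confidence requires shifting the process mean. A small sketch with assumed numbers:

```python
# Minimal sketch: why targeting the mean gives ~50% probability of success.
# The distribution parameters are assumed for illustration.
from scipy.stats import norm

mean, sd = 40.0, 4.0   # hypothetical productivity PPO distribution (LOC/PD)
target = mean

print("P(productivity > target at the mean):",
      round(norm.sf(target, loc=mean, scale=sd), 2))        # ~0.50

# To be 90% confident of exceeding the same target, the process mean must move up
required_mean = target + norm.ppf(0.90) * sd
print("Mean needed for 90% confidence:", round(required_mean, 1))
```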

Back to the question we started with: is this probability acceptable to proceed?
Certainly not, in the eyes of senior management. If a project targets the mean as its PPO, the probability of success is only about 50%. So either there need to be additional improvement initiatives within the project, beyond those triggered by the organization, or the 50-50 risk of not meeting the target has to be accepted by the senior management of the organization.
If additional improvement initiatives are triggered to raise the probability of success, the current mean will certainly shift, and in such a scenario the project team needs to derive its internal estimates from the revised mean (see more on setting internal estimates).

There could also be another case: some projects target below the mean for certain genuine reasons. Their probability of success would then be higher. But if this is repeated across many projects, the organizational objective will ultimately not be attained. So there needs to be a mechanism to ensure that if any project targets below the mean of the PPO, there is another project whose PPO is above the mean. Normally the Engineering Process Group (EPG) needs to take care of this.