Analytical Quality Ranking of Equipment under Procurement: An Improvement of Contemporary Practice

  • Mahendra Prasad
    Lt. Col. Mahendra Prasad is a Research Fellow at the Institute for Defence Studies and Analyses, New Delhi.

One of the sore points in the Quality Assurance (QA) of equipment being procured by the Indian Army is that in quite a few cases equipment cleared during user field trials is rejected by the Directorate General of Quality Assurance (DGQA) during technical and environmental evaluation. A case in point is the procurement of the Truck Mounted Lifting Device (TMLD). This acquisition was unduly delayed because the user's trial team and the DGQA team, which carried out the field trials and technical evaluation respectively, interpreted the physical parameters enumerated in the General Staff Qualitative Requirements (GSQR) of the equipment differently. Subsequently, a number of collegiate meetings had to be held merely to settle the correct interpretation of the parameters listed in the GSQR, and a limited re-trial of the equipment was ordered. The main cause of this imbroglio was that certain parameters were not clearly and objectively spelt out in the GSQR and therefore left room for varied interpretations. There may be many other cases in which technical testing results are at variance with those of user trials. It is pertinent to mention that a re-trial, as in the case of the TMLD, not only causes avoidable delay in acquisition but also discourages vendors, since the Defence Procurement Procedure (DPP) stipulates that the complete trial evaluation be conducted at their expense.

What can be done to obviate the problem?

Ensuring that no ambiguity is left in measurable and tangible parameters while formulating the GSQR would not only obviate such embarrassing delays in procurement but also facilitate timely and accurate trial evaluation. For this, an expert establishment for GSQR formulation, on the lines of the Request for Proposal (RFP) cell, is the need of the hour. Raising a new establishment, however, may take time because of the financial sanction required, and is fraught with the apprehension of bureaucratic resistance and delay.

In the interim, it would be prudent to modify the trial evaluation process to minimize such delays. A suggested modification is that the parameters, dimensions and operational requirements specified by the user be evaluated exclusively by the user trial team, while DGQA concentrates only on testing quality, encompassing the product design, the material used and the manufacturing process, in addition to environmental testing of the product under simulated conditions. In other words, the domains of user trial and DGQA evaluation must be mutually exclusive. Further, DGQA should evaluate samples of only those vendors that have been cleared by the user trial team; its evaluation must, therefore, always succeed the user's trial. If there is a pressing requirement to conduct user and DGQA trials concurrently to save time, the trial methodology must explicitly spell out the parameters to be evaluated by each, thereby eliminating the possibility of conflicting reports at a later stage.

Role of DGQA

What DGQA can do, in turn, is carry out an analytical assessment of the quality of the equipment it inspects and give a quality ranking to the various alternatives submitted as samples by the qualified vendors. ‘Qualified vendors’ here means only those vendors whose products have been cleared not only in the paper Technical Evaluation but also during the user field trials. This would considerably reduce the workload of the DGQA, since it would be evaluating fewer samples. It would also prevent rejections at the Quality Assurance stage and would, instead, provide a quality ranking of the various samples or alternatives, supported by a strong scientific and analytical method.

Suggested Methodology

Though a number of scientific tools are available for such an analysis, the most potent and time-tested among them is the Analytic Hierarchy Process (AHP). The process was developed by Dr. Thomas L. Saaty in the early 1970s and has since been in extensive use as a decision support tool for numerous corporate and government decisions. The steps involved in quality ranking using AHP are as under:

  • Decompose the ranking problem into a hierarchy of criteria and sub-criteria, with the vendor samples forming the alternatives. For example, the criteria could be the quality of material used, the design of the product and the manufacturing process.
  • Use expert judgment to determine the ranking of criteria, e.g. material quality could be twice as important as the manufacturing process for a particular product.
  • Express the relative importance of one criterion over another using pairwise comparison. Put the results in matrix form and square the matrix. Calculate the row sums and normalize them. Repeat this squaring and normalization until the results of two successive iterations no longer change. The final column matrix, the principal eigenvector, gives the local weight of each criterion.
  • In a similar manner, obtain the local weight of each alternative under each criterion, e.g. alternative 1 may have twice the material strength of alternative 2.
  • Finally, we shall have two matrices: one for the alternatives’ weights, with the number of rows equal to the number of alternatives and the number of columns equal to the number of criteria; and a column matrix with the number of rows equal to the number of criteria, holding the criteria weights. Multiplying these two matrices yields the final weight of each alternative, and thus the relative quality ranking of the alternatives is obtained (see the C++ sketch after this list).
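
To make these steps concrete, the following minimal sketch implements them in C++, the language suggested below for a customized programme. It is an illustration only: the pairwise judgments and the local weights of the two hypothetical vendor samples are assumed values for demonstration, and in practice would come from DGQA's expert assessment.

    #include <algorithm>
    #include <cmath>
    #include <cstdio>
    #include <vector>

    using Matrix = std::vector<std::vector<double>>;

    // Multiply a square matrix by itself.
    Matrix square(const Matrix& a) {
        const size_t n = a.size();
        Matrix r(n, std::vector<double>(n, 0.0));
        for (size_t i = 0; i < n; ++i)
            for (size_t k = 0; k < n; ++k)
                for (size_t j = 0; j < n; ++j)
                    r[i][j] += a[i][k] * a[k][j];
        return r;
    }

    // Priority vector by the iteration described above: square the pairwise
    // comparison matrix, take normalized row sums, and repeat until two
    // successive results agree. This approximates the principal eigenvector.
    std::vector<double> priorityVector(Matrix a, double tol = 1e-9) {
        const size_t n = a.size();
        std::vector<double> w(n, 1.0 / n);
        for (int iter = 0; iter < 50; ++iter) {
            a = square(a);
            // Rescale by the largest entry so repeated squaring cannot
            // overflow; a uniform scale leaves the row-sum ratios unchanged.
            double maxEntry = 0.0;
            for (const auto& row : a)
                for (double x : row) maxEntry = std::max(maxEntry, x);
            for (auto& row : a)
                for (double& x : row) x /= maxEntry;
            std::vector<double> next(n, 0.0);
            double total = 0.0;
            for (size_t i = 0; i < n; ++i) {
                for (size_t j = 0; j < n; ++j) next[i] += a[i][j];
                total += next[i];
            }
            for (double& x : next) x /= total;  // normalize row sums to 1
            double diff = 0.0;
            for (size_t i = 0; i < n; ++i)
                diff = std::max(diff, std::fabs(next[i] - w[i]));
            w = next;
            if (diff < tol) break;  // two successive iterations agree
        }
        return w;
    }

    int main() {
        // Assumed pairwise judgments for three criteria: material quality,
        // product design and manufacturing process. Entry [0][2] = 2 encodes
        // "material quality is twice as important as the process".
        const Matrix criteria = {{1.0,     3.0, 2.0},
                                 {1.0 / 3, 1.0, 0.5},
                                 {0.5,     2.0, 1.0}};
        const std::vector<double> cw = priorityVector(criteria);

        // Assumed local weights of two vendor samples under each criterion;
        // each column would come from its own pairwise comparison matrix.
        const Matrix local = {{0.67, 0.40, 0.55},   // vendor sample 1
                              {0.33, 0.60, 0.45}};  // vendor sample 2

        // Final weight of each alternative = local-weight matrix multiplied
        // by the criteria weight column vector.
        for (size_t i = 0; i < local.size(); ++i) {
            double score = 0.0;
            for (size_t j = 0; j < cw.size(); ++j) score += local[i][j] * cw[j];
            std::printf("Vendor sample %zu: final weight %.3f\n", i + 1, score);
        }
        return 0;
    }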

The greatest advantage of this method is its simplicity: it requires only a basic knowledge of matrix algebra. While a number of software packages are available for solving such problems, a customized programme can also be written in an object-oriented programming language such as C++.
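
The sketch above, for instance, compiles with any standard C++11 compiler (e.g. g++ -std=c++11 ahp_ranking.cpp, where the file name is arbitrary) and prints the final weight of each illustrative vendor sample; substituting DGQA's actual pairwise judgments for the assumed matrices would yield the quality ranking directly.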

Conclusion

Quality ranking of the equipment samples submitted by vendors during technical and environmental evaluation would have three major benefits. First, no equipment would be rejected on quality grounds. Second, the General Staff (GS) would be empowered to select or reject equipment in the GS evaluation on the basis of the user trial report, the quality ranking and the maintainability trials, and would have to apply itself intelligently in arriving at a selection decision rather than merely collating information. Last but not least, vendors would not have to incur any additional expenditure on re-trials, since there would be no scope for variance between the user trial report and the QA report, the two evaluations being mutually exclusive. The biggest benefit would be the time saved by eliminating re-trials and avoiding collegiate discussions on points where the user and DGQA differ in their perception.

Keywords: Defence Procurement