A global conformance quality model. A new strategic tool for minimizing defects caused by variation, error, and complexity

The performance of Japanese products in the marketplace points to the dominant role of quality in product competition. Our focus is motivated by the tremendous pressure to improve conformance quality by reducing defects to previously unimaginable limits in the range of 1 to 10 parts per million. Toward this end, we have developed a new model of conformance quality that addresses each of the three principal defect sources: (1) variation, (2) human error, and (3) complexity. Although the role of variation in conformance quality is well documented, errors occur so infrequently that their significance is not well known. We have shown that statistical methods are not useful in characterizing and controlling errors, the most common source of defects. Excessive complexity is also a root source of defects, since it increases both errors and variation defects. A missing link in defining a global model has been the lack of a sound correlation between complexity and defects. We have used Design for Assembly (DFA) methods to quantify assembly complexity and have shown that assembly times can be described by the Pareto distribution, a clear exception to the Central Limit Theorem. Within individual companies we have found defects to be highly correlated with DFA measures of complexity in broad studies covering tens of millions of assembly operations. Applying the global concepts, we predicted that Motorola's Six Sigma method would reduce defects by only roughly a factor of two rather than by orders of magnitude, a prediction confirmed by Motorola's data. We have also shown that the potential defect rates of product concepts can be compared in the earliest stages of development. The global Conformance Quality Model has demonstrated that the best strategy for improvement depends upon a company's quality control strengths and weaknesses.
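The heavy-tailed character of Pareto-distributed assembly times, which the abstract contrasts with the Central Limit Theorem, can be sketched in a few lines. This is a minimal illustration, not the authors' data: the tail index `alpha` and sample size are hypothetical values chosen only to show how a small fraction of operations can dominate total assembly time (and hence defect exposure).

```python
import random

# Hypothetical illustration of Pareto-distributed assembly operation times.
# alpha is an assumed tail index, not a value from the study.
random.seed(42)
alpha = 1.2
n = 10_000

# Draw operation times from a Pareto distribution (heavy right tail).
times = sorted((random.paretovariate(alpha) for _ in range(n)), reverse=True)

# Under a heavy tail, the top 20% of operations carry most of the total
# time, unlike a normal distribution where shares are nearly proportional.
total = sum(times)
top_20_share = sum(times[: n // 5]) / total
print(f"Share of total time in the top 20% of operations: {top_20_share:.2f}")
```

For a Pareto tail this concentrated, the sample mean converges slowly and the variance may not exist, which is why averaging arguments that rely on the Central Limit Theorem break down for such data.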
