An error-oriented test methodology to improve yield with error-tolerance

The main objective of error-tolerance is to increase the effective yield of a process by identifying defective but acceptable chips. In this paper, we propose an error-oriented test methodology to support error-tolerance in scan-based digital circuits. Error rates of defective chips are first estimated and then compared with application-specific acceptable error-rate thresholds to determine the suitability of each chip. A theoretical basis for estimating chip error rates with a specified degree of confidence is presented, and an appropriate upper bound is derived on the number of test patterns needed to achieve a given estimation accuracy. To quantify the yield improvement offered by the proposed test methodology, we present a method for determining the error-rate distribution of defective chips, and thus predicting the fraction of defective chips that are acceptable. The proposed test methodology also supports product grading, i.e., chips can be classified by their actual error rates so that appropriate pricing can be determined for products used in different applications. Experimental results show that the proposed method accurately estimates the error rates of faulty chips, and that the estimation results can be applied to increase the effective yield of a VLSI part for various values of acceptable error rate.
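
The abstract's idea of estimating an error rate to a specified accuracy and confidence, and bounding the number of test patterns required, can be sketched with a standard concentration inequality. The sketch below uses a Hoeffding-style bound as an illustrative stand-in for the paper's theoretical basis (which may derive a different bound); the function names and the simulated chip are hypothetical, not from the paper.

```python
import math
import random

def patterns_needed(epsilon, delta):
    """Hoeffding-style bound: smallest n such that the empirical error-rate
    estimate from n random test patterns lies within +/- epsilon of the true
    error rate with probability at least 1 - delta."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * epsilon ** 2))

def estimate_error_rate(chip_fails, n, rng):
    """Apply n randomly chosen test patterns to a (simulated) chip and return
    the observed fraction of erroneous responses.  `chip_fails` is a
    hypothetical predicate mapping a random draw to pass/fail."""
    failures = sum(1 for _ in range(n) if chip_fails(rng.random()))
    return failures / n

# Illustration: a defective chip whose true error rate is 5%.
true_rate = 0.05
n = patterns_needed(epsilon=0.01, delta=0.05)   # +/-1% accuracy, 95% confidence
rng = random.Random(0)
estimate = estimate_error_rate(lambda u: u < true_rate, n, rng)
# The chip is accepted if `estimate` is below the application-specific
# acceptable error-rate threshold.
```

Under this model, tightening the accuracy requirement quadratically inflates the pattern count (halving epsilon quadruples n), which is why an explicit upper bound on test length matters for test-cost planning.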