Breaking the simulation barrier: SRAM evaluation through norm minimization

With process variation a growing concern in deep submicron technologies, efficiently obtaining an accurate estimate of the failure probability of SRAM components has become a central issue. In this paper we present a general methodology for fast and accurate evaluation of the failure probability of memory designs. The proposed statistical method, which we call importance sampling through the norm minimization principle, reduces the variance of the estimator to produce quick estimates. It builds upon importance sampling, combined with a novel norm minimization principle inspired by the classical theory of Large Deviations. The method applies to a wide class of problems; our illustrative examples are the data retention voltage and the read/write failure tradeoff for a 6T SRAM cell in 32 nm technology. For the SRAM failure probability estimation problems considered in this paper, the method yields computational savings on the order of 10,000x over the standard Monte Carlo approach.
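To illustrate the two ingredients the abstract names, importance sampling and a norm-minimization shift toward the most likely failure point, the sketch below works a toy example. It assumes a hypothetical two-parameter standardized Gaussian variation model and a linear failure boundary; the threshold `t`, the re-centered Gaussian proposal, and all variable names are illustrative assumptions, not the paper's actual SRAM simulation setup.

```python
import numpy as np

# Minimal sketch of importance sampling with a norm-minimization shift
# (hypothetical toy example, not the paper's SRAM circuit model).
# Model: two standard-normal variation parameters x = (x1, x2);
# "failure" occurs when x1 + x2 exceeds a large threshold t, so the
# failure probability is tiny and plain Monte Carlo rarely sees it.

rng = np.random.default_rng(0)
t = 8.0                                   # failure threshold (assumed)
n = 100_000                               # simulation budget

def fails(x):
    """Indicator of the rare failure event."""
    return x[:, 0] + x[:, 1] > t

# Norm-minimization step: the most likely failure point is the point of
# the failure region closest to the origin in the standardized space.
# For this linear boundary it is simply (t/2, t/2); in general it would
# come from an optimization over the failure boundary.
x_star = np.array([t / 2.0, t / 2.0])

# Importance sampling: draw from a Gaussian re-centered at x_star and
# reweight each sample by the likelihood ratio N(0, I) / N(x_star, I).
x = rng.standard_normal((n, 2)) + x_star
log_w = -x @ x_star + 0.5 * x_star @ x_star
p_is = np.mean(fails(x) * np.exp(log_w))

# Plain Monte Carlo with the same budget (almost surely reports 0 here).
x_mc = rng.standard_normal((n, 2))
p_mc = np.mean(fails(x_mc))

print(f"importance sampling estimate: {p_is:.3e}")
print(f"plain Monte Carlo estimate:   {p_mc:.3e}")
```

With this toy setup the importance-sampling estimate lands near the exact tail probability (about 8e-9), while plain Monte Carlo with the same budget typically observes no failures at all, which is the variance-reduction effect the paper exploits at much larger scale.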
