Analysis and Simulation of a Low Leakage Conventional SRAM Memory Cell at Deep Sub-micron Level

High leakage current in the deep sub-micrometer regime is becoming a significant contributor to power dissipation in CMOS circuits as threshold voltage, channel length, and gate oxide thickness are reduced, and the standby current of memories is especially critical in low-power design. By lowering the supply voltage (VDD) to its standby limit, the data retention voltage (DRV), SRAM leakage power can be reduced substantially; however, the DRV increases with transistor mismatch. In this paper, we demonstrate the drowsy cache technique, which reduces leakage power dissipation in deep sub-micron memory cells and embedded memories. The focus of this work is to simulate an effective scheme for SRAM leakage suppression in battery-powered mobile applications.
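To illustrate why lowering VDD toward the DRV suppresses leakage power, the sketch below evaluates a simple first-order subthreshold leakage model with drain-induced barrier lowering (DIBL). This is not the simulation setup used in the paper; the function, parameter values (`i0`, `vth0`, `dibl`, `n`), and the chosen active and drowsy voltages are illustrative assumptions only.

```python
import math

def leakage_current(vdd, i0=1e-9, vth0=0.4, dibl=0.15, n=1.5, vt=0.026):
    """First-order subthreshold leakage estimate (illustrative parameters).

    DIBL lowers the effective threshold voltage as the drain (supply)
    voltage rises, so leakage grows roughly exponentially with VDD.
    """
    vth = vth0 - dibl * vdd                      # effective threshold with DIBL
    return i0 * math.exp(-vth / (n * vt)) * (1 - math.exp(-vdd / vt))

active_vdd = 1.0   # assumed nominal supply (V)
drowsy_vdd = 0.3   # assumed standby supply near the DRV (V)

# Leakage power = supply voltage x leakage current
p_active = active_vdd * leakage_current(active_vdd)
p_drowsy = drowsy_vdd * leakage_current(drowsy_vdd)
print(f"leakage power ratio (drowsy/active): {p_drowsy / p_active:.3f}")
```

Under these assumed parameters the drowsy supply cuts leakage power by more than an order of magnitude, since both the current (exponentially, via the DIBL term) and the voltage factor shrink together.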