Deep Stalling using a Coverage Driven Genetic Algorithm Framework
Stalling data paths and virtual channels has become increasingly important for uncovering corner-case, timing-critical bugs in complex hardware designs. Generating stimulus that achieves congestion at various virtual channels across numerous interfaces is a hard problem, so verification engineers typically mimic such congestion by artificially stalling the interfaces. Artificial stalling often exposes deadlock bugs, violations of ordering rules, and credit overflow/underflow issues that escape traditional random simulation. To this end, random First-In First-Out (FIFO) and pipe stalling has played a critical role in Nvidia’s simulation environment. While random stalling is useful, it is often insufficient to hit certain coverage metrics within the ever-shrinking time window for verification closure. In this paper, we demonstrate the application of a Genetic Algorithm (GA) framework that improves the efficacy of random stalling as measured by a set of coverage metrics. The GA-based framework learns from existing stalling regressions and makes intelligent decisions about the Stall Parameters (SP) to achieve better coverage. Finally, we introduce DeepStall, an accelerated GA framework that uses a Deep Learning based model of the relation between SPs and coverage metrics to tune SPs for better coverage.
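To make the coverage-driven GA concrete, the sketch below shows one way such a loop could look: each individual is a vector of per-interface stall probabilities (one possible SP encoding), and the fitness of an individual is the coverage obtained from a stalling regression run with those parameters. The interface count, the `run_regression` stub, and the GA hyperparameters are illustrative assumptions, not the paper's actual setup.

```python
import random

# Hypothetical SP encoding: one stall probability in [0, 1] per FIFO/pipe
# interface. NUM_INTERFACES and all hyperparameters below are placeholders.
NUM_INTERFACES = 8
POP_SIZE = 20
GENERATIONS = 10
MUTATION_RATE = 0.1

def run_regression(sp):
    """Stand-in for a simulation regression: returns a coverage score for a
    given SP vector. In practice this would launch stalling regressions and
    merge the coverage metrics of interest."""
    # Toy fitness used only so the sketch runs end to end.
    return sum(p * (1.0 - p) for p in sp)

def mutate(sp):
    # Perturb each stall probability with small Gaussian noise, clamped to [0, 1].
    return [min(1.0, max(0.0, p + random.gauss(0, 0.1)))
            if random.random() < MUTATION_RATE else p for p in sp]

def crossover(a, b):
    # Single-point crossover between two parent SP vectors.
    cut = random.randrange(1, NUM_INTERFACES)
    return a[:cut] + b[cut:]

population = [[random.random() for _ in range(NUM_INTERFACES)]
              for _ in range(POP_SIZE)]

for gen in range(GENERATIONS):
    scored = sorted(population, key=run_regression, reverse=True)
    parents = scored[:POP_SIZE // 2]  # truncation selection on coverage
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = max(population, key=run_regression)
print("best SP vector:", [round(p, 2) for p in best])
```

In a DeepStall-style setup, the expensive `run_regression` call could be replaced for most candidates by a learned surrogate model that predicts coverage from an SP vector, reserving real regressions for the most promising individuals; the details of that model are described in the paper, not in this sketch.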