10.1184/R1/7448096.v1
Shupeng Sun
Fast Statistical Analysis of Rare Failure Events for SRAM Circuits in High-Dimensional Variation Space
2018
Carnegie Mellon University
Fast Statistical Analysis
SRAM Circuits
2018-12-13 22:18:48
article
https://kilthub.figshare.com/articles/Fast_Statistical_Analysis_of_Rare_Failure_Events_for_SRAM_Circuits_in_High-Dimensional_Variation_Space/7448096
SRAM (static random-access memory) is embedded in a large number of semiconductor chips, so the yield of most semiconductor chips is dominated by the yield of SRAM. SRAM consists of a considerable number of replicated components (e.g., SRAM bit-cells, SRAM arrays, sense amplifiers), and the failure event for each individual component must be rare in order to achieve sufficiently high yield. Accurately estimating the rare failure rates of these replicated circuit components is a challenging task, especially when the variation space is high-dimensional. In this thesis, three novel approaches are proposed to efficiently estimate rare failure probabilities for SRAM circuits.<br>First, we propose a subset simulation (SUS) technique to estimate the rare failure rates of circuit blocks that have continuous performance metrics. The key idea of SUS is to express the rare failure probability of a given circuit as the product of several large conditional probabilities by introducing a number of intermediate failure events. These conditional probabilities can be efficiently estimated with a set of Markov chain Monte Carlo (MCMC) samples generated by a modified Metropolis algorithm. To quantitatively assess the accuracy of SUS, a statistical methodology is further proposed to accurately estimate the confidence interval of SUS based on the theory of MCMC simulation.<br>Second, to efficiently estimate the rare failure rates of circuit blocks that have discrete performance metrics, scaled-sigma sampling (SSS) is proposed. SSS generates random samples from a distorted probability distribution whose standard deviation (i.e., sigma) is scaled up.
The failure rate at the nominal sigma is then accurately estimated from these scaled random samples by using an analytical model derived from the theory of the “soft maximum”.<br>Finally, to further reduce the simulation cost, we propose a Bayesian scaled-sigma sampling (BSSS) approach that can be considered an extension of SSS. The key idea of BSSS is to exploit the “similarity” between SSS models fitted at different design stages and encode it as prior knowledge. Bayesian model fusion is then adopted to fit the SSS model at the current design stage with this prior knowledge taken into account.
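The SUS idea above — chaining intermediate failure events and sampling each conditional level with a modified Metropolis algorithm — can be sketched on a toy problem. Everything below (the metric `g`, the threshold, the sample counts) is an illustrative assumption, not the thesis' circuit benchmarks; the toy metric is chosen so the true failure probability, 1 − Φ(4) ≈ 3.2e-5, is known and the estimate can be sanity-checked.

```python
import numpy as np

def g(x):
    # Toy "performance metric": a standard normal regardless of dimension,
    # so P(g > 4) = 1 - Phi(4) ~ 3.2e-5 is known exactly.
    return x.sum(axis=-1) / np.sqrt(x.shape[-1])

def subset_simulation(g, dim, threshold, n=2000, p0=0.1, seed=0):
    """Estimate P(g(x) > threshold), x ~ N(0, I), as a product of large
    conditional probabilities over intermediate failure levels."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n, dim))
    y = g(x)
    prob = 1.0
    for _ in range(50):  # guard against pathological non-convergence
        b = np.quantile(y, 1.0 - p0)   # next intermediate failure level
        if b >= threshold:             # final level reaches the true failure event
            return prob * np.mean(y > threshold)
        prob *= np.mean(y > b)         # large conditional probability (~p0)
        # Modified Metropolis: evolve the seeds already inside {g > b}.
        cur, cur_y = x[y > b], y[y > b]
        xs, ys = [], []
        for _ in range(int(np.ceil(n / len(cur)))):
            cand = cur.copy()
            prop = cand + rng.standard_normal(cand.shape)
            # Component-wise accept/reject under the N(0,1) prior.
            acc = np.log(rng.uniform(size=cand.shape)) < 0.5 * (cand**2 - prop**2)
            cand[acc] = prop[acc]
            cand_y = g(cand)
            ok = cand_y > b            # keep only moves that stay in the level set
            cur = np.where(ok[:, None], cand, cur)
            cur_y = np.where(ok, cand_y, cur_y)
            xs.append(cur)
            ys.append(cur_y)
        x = np.concatenate(xs)[:n]
        y = np.concatenate(ys)[:n]
    raise RuntimeError("did not reach the failure threshold")

# Illustrative run: the crude-Monte-Carlo cost for this probability would be
# millions of samples; SUS uses a few thousand per level.
p_hat = subset_simulation(g, dim=50, threshold=4.0)
```

Each level multiplies in a probability of roughly `p0`, so a 1e-5 failure rate is reached in about five levels of ordinary-sized conditional estimates.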
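The SSS step can likewise be sketched. The sketch assumes the commonly cited SSS model form log P(s) ≈ α + β·log s − γ/s², which is linear in (α, β, γ) and can be fitted by least squares, then extrapolated back to the nominal sigma s = 1; the failure indicator, scale set, and sample sizes are illustrative assumptions.

```python
import numpy as np

def sss_estimate(fail, dim, scales=(2.0, 3.0, 4.0, 5.0), n=20000, seed=1):
    """Scaled-sigma sampling sketch: draw samples at inflated sigma s,
    fit log P(s) = alpha + beta*log(s) - gamma/s**2, extrapolate to s = 1."""
    rng = np.random.default_rng(seed)
    log_p = []
    for s in scales:
        x = s * rng.standard_normal((n, dim))     # sigma scaled up by s
        log_p.append(np.log(np.mean(fail(x))))    # failures are no longer rare
    s = np.asarray(scales)
    # The model is linear in (alpha, beta, gamma): design matrix [1, log s, -1/s^2].
    A = np.column_stack([np.ones_like(s), np.log(s), -1.0 / s**2])
    alpha, beta, gamma = np.linalg.lstsq(A, np.asarray(log_p), rcond=None)[0]
    return np.exp(alpha - gamma)                  # log P(1) = alpha - gamma

# Toy failure event with known probability 1 - Phi(4) ~ 3.2e-5 at nominal sigma.
p_hat = sss_estimate(lambda x: x.sum(axis=-1) / np.sqrt(x.shape[-1]) > 4.0, dim=50)
```

Because each scaled distribution makes failures common (a few percent), every P(s) is cheap to estimate by plain Monte Carlo; the rarity is handled entirely by the model-based extrapolation.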
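For the BSSS step, a minimal stand-in for Bayesian model fusion is a MAP fit of the SSS coefficients under a Gaussian prior centered on the coefficients fitted at an earlier design stage, which reduces to a ridge-style closed form. The prior weight `lam`, the synthetic coefficients, and the two-scale design below are all illustrative assumptions; the thesis' fusion procedure is more elaborate.

```python
import numpy as np

def bsss_fit(A, y, theta0, lam=0.01):
    """MAP estimate of SSS coefficients theta minimizing
    ||A @ theta - y||^2 + lam * ||theta - theta0||^2,
    where theta0 encodes the earlier-stage SSS model as prior knowledge."""
    k = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(k), A.T @ y + lam * theta0)

# Synthetic check: with only two scaled-sigma observations the 3-parameter
# SSS fit is underdetermined, but the prior from an earlier stage pins it down.
theta_true = np.array([-2.3, 1.0, 8.0])           # hypothetical (alpha, beta, gamma)
s = np.array([3.0, 5.0])
A = np.column_stack([np.ones_like(s), np.log(s), -1.0 / s**2])
y = A @ theta_true                                 # noise-free log P(s) observations
theta0 = theta_true + np.array([1.0, -0.5, 2.0])   # earlier-stage fit, slightly off
theta = bsss_fit(A, y, theta0, lam=0.01)
```

The prior lets BSSS reuse fewer fresh simulations at the current design stage: the data correct the earlier-stage model only along the directions the new observations actually constrain.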