eScholarship
Open Access Publications from the University of California

Markov Chain Monte Carlo Stimulus Generation for Constrained Random Simulation

  • Author(s): Kitchen, Nathan
  • Advisor(s): Kuehlmann, Andreas
Abstract

As integrated circuits have grown in size and complexity, the time required for functional verification has become the largest part of total design time. The main workhorse in state-of-the-art practical verification is constrained random simulation. In this approach, a randomized solver generates solutions to declaratively specified input constraints, and the solutions are applied as stimuli to a logic simulator. The efficiency of the overall verification process depends critically on the speed of the solver and the distribution of the generated solutions. Previous methods for stimulus generation achieve speed at the expense of quality of distribution or rely on techniques that do not scale well to large designs.
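To make the setup above concrete, the sketch below (illustrative only, not from the dissertation) shows the ideal stimulus generator: enumerate every assignment satisfying a declaratively specified constraint and sample uniformly from the solution set. This gives a perfect distribution but scales exponentially in the number of input bits, which is what motivates randomized solvers.

```python
import itertools
import random

# Hypothetical illustration: enumerate all assignments of an n-bit stimulus
# that satisfy a declarative constraint, then draw one uniformly at random.
# Exact, but the enumeration is exponential in n_bits.

def satisfying_assignments(n_bits, constraint):
    return [bits for bits in itertools.product([0, 1], repeat=n_bits)
            if constraint(bits)]

# Example constraint over a 6-bit stimulus: at most two bits set.
solutions = satisfying_assignments(6, lambda b: sum(b) <= 2)
stimulus = random.choice(solutions)  # uniformly distributed over solutions
```

Practical designs have far too many input bits for this enumeration, so scalable generators trade some distribution quality for speed; the dissertation's MCMC approach aims to recover that quality without giving up scalability.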

In this dissertation, we propose a new method for stimulus generation based on Markov chain Monte Carlo (MCMC) methods. We describe the basic principles of MCMC methods and one of the most common of these methods, Metropolis-Hastings sampling. We present our approach, which combines the Metropolis-Hastings algorithm with stochastic local search. We show with experimental results that it surpasses existing stimulus-generation methods in speed, robustness, and quality of distribution.
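A minimal sketch of Metropolis-Hastings sampling in this setting might look as follows. The constraint, the bit-flip proposal, and the temperature parameter are illustrative assumptions, not the dissertation's actual formulation: the target distribution weights an assignment by exp(-violations/T), so satisfying assignments dominate as T shrinks.

```python
import math
import random

# Hypothetical constraint on an 8-bit pair (a, b): a + b < 200, and
# a XOR b must have even parity. The cost is the number of violations.
def violations(a, b):
    v = 0
    if not (a + b < 200):
        v += 1
    if bin(a ^ b).count("1") % 2 != 0:
        v += 1
    return v

def metropolis_hastings(steps=10000, temp=0.5, seed=0):
    rng = random.Random(seed)
    a, b = rng.randrange(256), rng.randrange(256)
    samples = []
    for _ in range(steps):
        # Propose: flip one random bit of one variable (a symmetric proposal,
        # so the Hastings correction cancels).
        na, nb = a, b
        bit = 1 << rng.randrange(8)
        if rng.random() < 0.5:
            na ^= bit
        else:
            nb ^= bit
        # Metropolis acceptance: always accept downhill moves; accept
        # uphill moves with probability exp(-delta / temp).
        delta = violations(na, nb) - violations(a, b)
        if delta <= 0 or rng.random() < math.exp(-delta / temp):
            a, b = na, nb
        if violations(a, b) == 0:
            samples.append((a, b))
    return samples
```

The collected samples are approximately distributed over the satisfying set; the uphill-move acceptance is what lets the chain escape local minima, which is where the combination with stochastic local search comes in.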

After presenting our basic algorithm, we describe several refinements and variations of it. These refinements include the addition of control variables to handle dependencies on external data and the elimination of variables to increase efficiency and improve the quality of the distribution. In addition, we present a parallel version of our algorithm and give theoretical analysis and experimental evidence of the speedup achieved by parallelization.
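The parallelization idea can be sketched by running independent chains, one per worker, and pooling their samples; since the chains need no synchronization, the expected sample yield grows roughly linearly with the number of workers. The toy chain and constraint below are illustrative assumptions, not the dissertation's algorithm (a real implementation would use process-level parallelism rather than Python threads).

```python
import random
from concurrent.futures import ThreadPoolExecutor

# Toy chain: a greedy Metropolis-style walk over 8-bit values targeting
# the hypothetical constraint x % 3 == 0. Each chain gets its own seed.
def run_chain(seed, steps=5000):
    rng = random.Random(seed)
    x = rng.randrange(256)
    out = []
    for _ in range(steps):
        y = x ^ (1 << rng.randrange(8))  # flip one bit (symmetric proposal)
        # Accept if the proposal satisfies the constraint, or if the
        # current state does not (move freely until a solution is found).
        if (y % 3 == 0) or (x % 3 != 0):
            x = y
        if x % 3 == 0:
            out.append(x)
    return out

def parallel_samples(n_chains=4):
    # Independent chains are embarrassingly parallel: no shared state,
    # so results are simply concatenated.
    with ThreadPoolExecutor(max_workers=n_chains) as pool:
        results = pool.map(run_chain, range(n_chains))
    return [s for chain in results for s in chain]
```

Because each chain explores the solution space from a different random starting point, pooling also tends to improve coverage, not just throughput.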
