Title: Annealing and Tempering for Sampling and Counting
Advisors: Drs. Dana Randall and Eric Vigoda
Committee:
  Dr. Dana Randall, College of Computing
  Dr. Eric Vigoda, College of Computing
  Dr. Prasad Tetali, School of Mathematics
  Dr. Robin Thomas, School of Mathematics
  Dr. Santosh Vempala, College of Computing
Reader: Dr. Prasad Tetali, School of Mathematics
The Markov Chain Monte Carlo (MCMC) method has been widely used in practice since the 1950s in areas such as biology, statistics, and physics. However, it is only in the last few decades that powerful techniques have been developed for obtaining rigorous guarantees on its running time. Today, with only a few notable exceptions, most algorithms for approximate sampling and counting rely on the MCMC method. This thesis focuses on algorithms for sampling and counting problems that combine the MCMC method with simulated annealing, an algorithm from optimization.
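For readers unfamiliar with the method, the core of MCMC is a Metropolis-style chain whose stationary distribution is the target. The following is an illustrative sketch on a finite state space (not an algorithm from the thesis); the state space, weights, and step counts are arbitrary choices for the example.

```python
import random
from collections import Counter

def metropolis_sample(weights, steps, seed=0):
    """Minimal Metropolis chain on states 0..n-1 whose stationary
    distribution is proportional to `weights` (illustrative sketch)."""
    rng = random.Random(seed)
    n = len(weights)
    state = rng.randrange(n)
    for _ in range(steps):
        proposal = rng.randrange(n)  # symmetric proposal: uniform state
        # Accept with probability min(1, w(proposal) / w(state))
        if rng.random() < min(1.0, weights[proposal] / weights[state]):
            state = proposal
    return state

# Empirical check: over many runs, state frequencies should approach
# weights / sum(weights), i.e. roughly 1/6, 2/6, 3/6 here.
counts = Counter(metropolis_sample([1, 2, 3], 200, seed=s) for s in range(3000))
```

After enough steps the chain forgets its starting state, which is why a single long run (or many independent runs, as above) yields approximate samples from the target distribution.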
Annealing is a heuristic for finding the global optimum of a function over a large search space. It has recently emerged as a powerful technique used in conjunction with the MCMC method for sampling problems, for example in the estimation of the permanent and in algorithms for computing the volume of a convex body. We make progress towards understanding when annealing can be applied to sampling problems, as well as identifying scenarios in which it fails to converge in polynomial time.
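In its original optimization form, annealing runs a Metropolis chain while gradually lowering a temperature parameter, so that early on the chain explores freely and later it settles into low-cost states. A generic sketch (the cooling schedule, toy cost function, and parameter values below are illustrative assumptions, not taken from the thesis):

```python
import math
import random

def simulated_annealing(cost, neighbors, init, t0=10.0, cooling=0.99,
                        steps=2000, seed=0):
    """Generic simulated-annealing loop (illustrative sketch): always accept
    improving moves, accept worsening moves with probability exp(-delta / T),
    and cool the temperature T geometrically."""
    rng = random.Random(seed)
    state, temp = init, t0
    best = state
    for _ in range(steps):
        cand = rng.choice(neighbors(state))
        delta = cost(cand) - cost(state)
        if delta <= 0 or rng.random() < math.exp(-delta / temp):
            state = cand
        if cost(state) < cost(best):
            best = state
        temp *= cooling
    return best

# Toy example: minimize (x - 7)^2 over the integers in [0, 100]
best = simulated_annealing(
    cost=lambda x: (x - 7) ** 2,
    neighbors=lambda x: [max(0, x - 1), min(100, x + 1)],
    init=50,
)
```

The sampling analogue replaces the single cooling run with a sequence of distributions interpolating between an easy "high-temperature" distribution and the target, using samples at each temperature to bootstrap the next.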
We consider the problem of randomly generating 0-1 contingency tables. This is a well-studied problem in statistics, as well as in the theory of random graphs, since it is equivalent to generating a random bipartite graph with a prescribed degree sequence. Previously, the only algorithm known to work for all degree sequences was by reduction to approximating the permanent of a 0-1 matrix. We give a direct and more efficient combinatorial algorithm which relies on simulated annealing. An interesting aspect of the annealing algorithm is that the high-temperature distribution for the annealing is defined algorithmically.
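To make the degree-preserving moves concrete, a standard local chain for this state space (the classical "switch" chain, shown here for illustration; it is not the annealing algorithm of the thesis) swaps the endpoints of two random edges whenever the swap keeps the graph simple:

```python
import random
from collections import Counter

def switch_step(edges, rng):
    """One move of the classical 'switch' chain on bipartite graphs with a
    fixed degree sequence: pick two edges (u1,v1), (u2,v2) and replace them
    with (u1,v2), (u2,v1) if the result is still a simple graph.
    `edges` is a set of (left, right) pairs; every vertex keeps its degree
    because each endpoint still appears in exactly as many edges."""
    (u1, v1), (u2, v2) = rng.sample(sorted(edges), 2)
    if u1 != u2 and v1 != v2 and (u1, v2) not in edges and (u2, v1) not in edges:
        edges -= {(u1, v1), (u2, v2)}
        edges |= {(u1, v2), (u2, v1)}
    return edges

# Toy run on a 3x3 bipartite graph where every degree is 1
rng = random.Random(1)
g = {(0, 0), (1, 1), (2, 2)}
for _ in range(100):
    g = switch_step(g, rng)
```

Each move preserves the degree sequence by construction, so the chain walks over exactly the tables (equivalently, bipartite graphs) with the prescribed margins.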
Simulated tempering is a variant of annealing in which the state space of the Markov chain is extended to a polynomial number of progressively smoother distributions, parametrized by temperature. During the simulation, the temperature is allowed to vary randomly. We show that simulated tempering takes exponential time to converge when sampling from configurations of the 3-state ferromagnetic Potts model on the complete graph. Moreover, tempering converges more slowly than the corresponding fixed-temperature algorithm by at least an exponential factor, disproving the conventional belief that, in the worst case, tempering can slow down the fixed-temperature algorithm by at most a polynomial factor.
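The tempering dynamics described above can be sketched as follows. The chain lives on pairs (state, temperature level); each step either makes a Metropolis move at the current inverse temperature or proposes moving to an adjacent level. The energy function, proposal, level weights (taken uniform here), and parameters below are simplifying assumptions for illustration, not the Potts-model setting of the thesis.

```python
import math
import random

def tempering_step(state, level, betas, energy, propose, rng):
    """One step of a simulated-tempering chain (illustrative sketch).
    `betas` lists inverse temperatures, smoothest (smallest beta) first."""
    if rng.random() < 0.5:
        # Metropolis move on the state at the current inverse temperature
        cand = propose(state, rng)
        delta = energy(cand) - energy(state)
        if delta <= 0 or rng.random() < math.exp(-betas[level] * delta):
            state = cand
    else:
        # Propose moving to an adjacent temperature level
        new = level + rng.choice([-1, 1])
        if 0 <= new < len(betas):
            # Metropolis acceptance for the level change, assuming uniform
            # level weights; in practice these weights must be tuned.
            accept = math.exp(-(betas[new] - betas[level]) * energy(state))
            if rng.random() < min(1.0, accept):
                level = new
    return state, level

# Toy run: a random walk on the integers with energy x^2 (hypothetical choices)
rng = random.Random(0)
betas = [0.1, 0.5, 1.0, 2.0]
state, level = 0, 0
visited = set()
for _ in range(500):
    state, level = tempering_step(state, level, betas,
                                  energy=lambda x: x * x,
                                  propose=lambda x, r: x + r.choice([-1, 1]),
                                  rng=rng)
    visited.add(level)
```

The hope behind tempering is that excursions to smooth, high-temperature levels let the chain cross barriers that trap the fixed-temperature chain; the lower bound in this thesis shows that this hope can fail badly.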