Randomized Algorithms

Rajeev Motwani, Prabhakar Raghavan

Language: English

Pages: 496

ISBN: 0521474655

Format: PDF / Kindle (mobi) / ePub


For many applications, a randomized algorithm is either the simplest or the fastest algorithm available, and sometimes both. This book introduces the basic concepts in the design and analysis of randomized algorithms. The first part of the text presents basic tools such as probability theory and probabilistic analysis that are frequently used in algorithmic applications. Algorithmic examples are also given to illustrate the use of each tool in a concrete setting. In the second part of the book, each chapter focuses on an important area to which randomized algorithms can be applied, providing a comprehensive and representative selection of the algorithms that might be used in each of these areas. Although written primarily as a text for advanced undergraduates and graduate students, this book should also prove invaluable as a reference for professionals and researchers.

THE CHERNOFF BOUND

Definition 4.2: F⁻(μ, δ) = exp(−μδ²/2). It is immediate that F⁻(μ, δ) is always less than 1 for positive μ and δ. Note two differences between the proofs of Theorems 4.1 and 4.2. First, we directly apply the basic Chernoff technique to the random variable −X rather than apply Theorem 4.1 to Y = n − X (a plausible option, which leads, however, to a slightly weaker bound than the one derived below). Second, the form of the Maclaurin expansion for ln(1 − δ) allows us to obtain a ...
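The following Python sketch (not from the book) compares the bound of Definition 4.2 with an empirical estimate of the lower tail for a sum of independent Bernoulli trials; the parameters n, p, and delta are arbitrary choices made only for illustration.

    import math
    import random

    def chernoff_lower_bound(mu, delta):
        # F^-(mu, delta) = exp(-mu * delta**2 / 2), the lower-tail bound of Definition 4.2
        return math.exp(-mu * delta ** 2 / 2)

    def empirical_lower_tail(n, p, delta, trials=10000):
        # Estimate Pr[X < (1 - delta) * mu] for X a sum of n independent Bernoulli(p) trials
        mu = n * p
        threshold = (1 - delta) * mu
        hits = 0
        for _ in range(trials):
            x = sum(1 for _ in range(n) if random.random() < p)
            if x < threshold:
                hits += 1
        return hits / trials

    if __name__ == "__main__":
        n, p, delta = 200, 0.5, 0.2            # illustrative parameters only
        print("empirical tail:", empirical_lower_tail(n, p, delta))
        print("Chernoff bound:", chernoff_lower_bound(n * p, delta))

On such parameters the empirical tail probability falls well below the bound, as expected: the bound is an upper estimate, not an exact value.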

randomized polynomial-time algorithm that works with high probability. In yet others, we have a deterministic or randomized algorithm, but one that is non-uniform. And finally, we have instances where we know of no efficient algorithm for finding the object in question.

5.2. Maximum Satisfiability

We turn to the satisfiability problem defined in Section 1.5.2: given a set of m clauses in conjunctive normal form over n variables, decide whether there is a truth assignment for the n variables that satisfies all the clauses.
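As a concrete illustration of the randomized approach behind this section (a minimal sketch, not the book's presentation), the following Python code assigns each variable true or false independently with probability 1/2 and counts the satisfied clauses; the clause encoding as lists of signed integers is an assumption made for the example.

    import random

    def random_assignment_maxsat(clauses, n):
        # Assign each of the n variables True/False uniformly at random and
        # return (assignment, number of satisfied clauses).
        # A clause is a list of nonzero ints: +i means variable i,
        # -i means the negation of variable i (variables are 1-indexed).
        assignment = [random.random() < 0.5 for _ in range(n + 1)]  # index 0 unused
        satisfied = 0
        for clause in clauses:
            if any((lit > 0) == assignment[abs(lit)] for lit in clause):
                satisfied += 1
        return assignment, satisfied

    if __name__ == "__main__":
        # (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
        clauses = [[1, 2], [-1, 3], [-2, -3]]
        _, sat = random_assignment_maxsat(clauses, n=3)
        print(f"satisfied {sat} of {len(clauses)} clauses")

A clause containing at least one literal is satisfied by such a random assignment with probability at least 1/2, so in expectation at least half of the clauses are satisfied.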

this theme for the remainder of this section. Given an instance I, let m*(I) be the maximum number of clauses that can be satisfied, and let mA(I) be the number of clauses satisfied by an algorithm A. The performance ratio of an algorithm A is defined to be the infimum (over all instances I) of mA(I)/m*(I). If A achieves a performance ratio of α, we call it an α-approximation algorithm. For a randomized algorithm A, the quantity mA(I) may be a random variable, in which case we replace mA(I) by its expectation E[mA(I)] in the definition of the performance ratio.
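To connect this definition with the random-assignment sketch above (a standard calculation, not quoted from the book, assuming every clause contains at least one literal and no repeated variables): a clause with k_j distinct literals is unsatisfied only if all k_j literals are set the wrong way, which happens with probability 2^(−k_j). By linearity of expectation,

    E[mA(I)] = Σ_{j=1..m} (1 − 2^(−k_j)) ≥ m/2 ≥ m*(I)/2,

so random assignment is a 1/2-approximation algorithm, and a (1 − 2^(−k))-approximation when every clause contains at least k literals.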

where Pr[E1 | E2] denotes the conditional probability of E1 given E2. Sometimes, when a collection of events is not independent, a convenient method for computing the probability of their intersection is to use the following generalization of (1.5):

    Pr[E1 ∩ E2 ∩ ··· ∩ Ek] = Pr[E1] · Pr[E2 | E1] · Pr[E3 | E1 ∩ E2] ··· Pr[Ek | E1 ∩ E2 ∩ ··· ∩ E(k−1)].

Consider a graph-theoretic example. Let G be a connected, undirected multigraph with n vertices. A multigraph may contain multiple edges between any pair of vertices. A cut in G is a set of edges whose removal breaks G into two or more components.
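This graph-theoretic example leads, in the book, to the randomized min-cut (contraction) algorithm, whose analysis rests on exactly this multiplication rule. A minimal Python sketch of one contraction run is given below; the edge-list representation and the helper contract_min_cut are illustrative assumptions, not the book's code.

    import random

    def contract_min_cut(edges, n):
        # One run of the randomized contraction algorithm on a connected
        # multigraph with vertices 1..n, given as a list of undirected edges (u, v).
        # Returns the size of the cut found; repeating the run many times yields
        # the minimum cut with good probability.
        parent = list(range(n + 1))

        def find(x):                           # union-find root with path halving
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        vertices = n
        edge_list = list(edges)
        while vertices > 2:
            u, v = random.choice(edge_list)    # pick a uniformly random remaining edge
            parent[find(u)] = find(v)          # contract its two endpoints
            vertices -= 1
            # discard self-loops created by the contraction
            edge_list = [(a, b) for (a, b) in edge_list if find(a) != find(b)]
        return len(edge_list)                  # edges crossing the two super-vertices

    if __name__ == "__main__":
        edges = [(1, 2), (1, 3), (2, 3), (3, 4), (4, 5), (3, 5)]
        print("smallest cut found:", min(contract_min_cut(edges, 5) for _ in range(50)))

The multiplication rule above is what bounds the probability that no min-cut edge is ever contracted across the successive random choices.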

||xBW|| = ||xW|| ≤ ||x||/10. A similar inequality can be obtained for y as follows. Observe that yB = Σ_{i=2..N} c_i e_i B = Σ_{i=2..N} c_i λ_i e_i. It is also clear that ||yBW|| ≤ ||yB||, since W only zeroes out some entries of the vector yB. Since λ_2 ≤ 1/10 corresponds to the second largest eigenvalue,

    ||yBW|| ≤ (Σ_{i=2..N} λ_i² c_i²)^(1/2) ≤ (1/10) (Σ_{i=2..N} c_i²)^(1/2) = ||y||/10.

Using the triangle inequality, we obtain that ||pBW|| ≤ ||xBW|| + ||yBW|| ≤ (1/10)(||x|| + ||y||). Finally, ...
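As a small numerical illustration of the eigenvalue step above (not from the book), the following Python sketch takes the symmetric transition matrix of the random walk on the complete graph K8, builds a vector y orthogonal to its top (uniform) eigenvector, and checks that multiplying by the matrix shrinks ||y|| by at least the second-largest eigenvalue modulus.

    import numpy as np

    # Random walk on the complete graph K8: from each vertex, move to one of the
    # other 7 vertices uniformly. B is symmetric; the uniform vector is its top
    # eigenvector (eigenvalue 1) and every other eigenvalue equals -1/7.
    n = 8
    B = (np.ones((n, n)) - np.eye(n)) / (n - 1)

    eigvals = np.sort(np.abs(np.linalg.eigvalsh(B)))[::-1]
    lam2 = eigvals[1]                       # largest eigenvalue modulus below 1

    rng = np.random.default_rng(0)
    y = rng.standard_normal(n)
    y -= y.mean()                           # make y orthogonal to the uniform vector

    ratio = np.linalg.norm(y @ B) / np.linalg.norm(y)
    print("||yB|| / ||y|| =", ratio)        # never exceeds lam2 (here exactly 1/7)
    print("second eigenvalue modulus:", lam2)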
