RANDOMIZED ROUNDING
RAJEEV MOTWANI AND PRABHAKAR RAGHAVAN
PRESENTER : PRASANNA GANESAN
Extracted from Randomized Algorithms,
by Rajeev Motwani and Prabhakar Raghavan
Randomized Rounding
- Randomized rounding is a probabilistic method for converting the solution of a relaxed linear program into an approximate solution of the original integer program.
- The basic idea is to interpret the fractional values in the linear-programming solution as probabilities, and to round each variable to 1 with that probability.
- It is used when the original integer linear program is NP-hard.
A language L is NP-hard if, for every L' ∈ NP, there is a polynomial-time reduction from L' to L. In other words, every problem in NP can be transformed to L in polynomial time, so a polynomial-time algorithm for L would yield polynomial-time algorithms for all of NP.
For such problems we relax the integrality constraints, solve the resulting linear program in polynomial time, and then apply randomized rounding to its solution.
We shall see a simple example of this in detail.
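As a first illustration of the idea above, here is a minimal Python sketch (not tied to any particular linear program; the function name is ours): a fractional value x ∈ [0, 1] is rounded to 1 with probability x, so the rounded variable has expectation exactly x.

```python
import random

def round_fractionally(x):
    """Round a fractional value x in [0, 1] to 1 with probability x
    and to 0 otherwise, so that E[result] = x."""
    return 1 if random.random() < x else 0

# Rounding 0.7 repeatedly: the empirical mean approaches 0.7.
random.seed(0)
samples = [round_fractionally(0.7) for _ in range(100_000)]
print(sum(samples) / len(samples))
```

The point is that rounding preserves each variable in expectation; the rest of the section is about showing that, with high probability, the rounded solution also does not violate the constraints by much.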
WIRING PROBLEM
Assume a √n × √n grid of logic gates, numbered from 1 to n, connected by wires (four nets in the figure). The wires are assumed to run along the boundaries between gates.
Our problem is to route every net so that the gates in each net are connected together.
A net may connect 2 or more gates.
As a simple example, we consider a net which connects just 2 gates.
x_i0 = 1 when net i runs first horizontally and then vertically, and
x_i1 = 1 when net i runs first vertically and then horizontally,
where i denotes the net (the wire in our example).
For example,
x_10 = 0 and x_11 = 1 in the figure above.
Tb0 = { i | net i passes through b if x_i0 = 1 }
Tb1 = { i | net i passes through b if x_i1 = 1 }
That is, Tb0 and Tb1 are the sets of nets that would cross boundary b under the horizontal-first and the vertical-first routing, respectively.
wS(b) = number of wires that pass through b in a solution S
wS = max_b wS(b)  (the maximum number of wires through any boundary)
Our integer program can be expressed as:
Minimize w, the maximum number of wires through any boundary,
subject to
x_i0 + x_i1 = 1  (∀ nets i)
Σ_{i∈Tb0} x_i0 + Σ_{i∈Tb1} x_i1 ≤ w  (∀ boundaries b)
x_i0, x_i1 ∈ {0,1}  (∀ nets i)
Let w0 be the optimal value of w in the above program.
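To make the boundary constraint concrete, here is a small Python sketch on a hypothetical instance (the nets, boundaries, and the sets T0/T1 below are invented for illustration): given a 0/1 routing, it evaluates the left-hand side of the constraint at every boundary and returns the maximum, i.e. the congestion that the objective w bounds.

```python
# Hypothetical toy instance: 3 nets (0, 1, 2) and 4 boundaries. T0[b] / T1[b]
# list the nets that would cross boundary b when routed horizontal-first /
# vertical-first, respectively.
T0 = {"b1": [0, 1], "b2": [2], "b3": [], "b4": [1]}
T1 = {"b1": [], "b2": [0], "b3": [1, 2], "b4": [0]}

def congestion(x0, x1):
    """Maximum number of wires through any boundary for a 0/1 routing;
    a feasible routing has x0[i] + x1[i] == 1 for every net i."""
    return max(sum(x0[i] for i in T0[b]) + sum(x1[i] for i in T1[b])
               for b in T0)

# Routing every net horizontal-first puts two wires through boundary b1:
print(congestion([1, 1, 1], [0, 0, 0]))  # -> 2
```

The integer program asks for the 0/1 assignment minimizing this maximum.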
To solve this, we perform linear-program relaxation: we replace the integrality constraints by
x_i0, x_i1 ∈ [0,1]  (∀ nets i),
i.e., the variables assume real values instead of integer values.
The relaxed linear program can be solved in polynomial time, whereas the original integer program is NP-hard.
Let x̂_i0, x̂_i1 ∈ [0,1] (∀ nets i) be the solution obtained, and let ŵ be the value of the objective function.
We note that ŵ ≤ w0, because every feasible solution of the integer program is also a feasible solution of the relaxation.
Now we round the solution.
For each i:
set x̄_i0 to 1 and x̄_i1 to 0 with probability x̂_i0;
otherwise set x̄_i0 to 0 and x̄_i1 to 1.
Thus for each i,
Pr[x̄_i0 = 1] = x̂_i0
Pr[x̄_i1 = 1] = x̂_i1
Note: a fractional value close to 0 (or 1) is likely to be rounded to 0 (or 1).
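The rounding step can be sketched in Python as follows (an illustrative sketch; the function name and input format are our own):

```python
import random

def round_nets(x_hat0):
    """Randomized rounding of the fractional solution. x_hat0[i] holds the
    LP value of x_hat_{i0}; by the equality constraint, x_hat_{i1} is
    1 - x_hat_{i0}. Returns 0/1 lists (xbar0, xbar1) with
    xbar0[i] + xbar1[i] == 1 for every net i."""
    xbar0, xbar1 = [], []
    for p in x_hat0:
        if random.random() < p:   # with probability x_hat_{i0}
            xbar0.append(1)
            xbar1.append(0)
        else:                     # with probability x_hat_{i1} = 1 - x_hat_{i0}
            xbar0.append(0)
            xbar1.append(1)
    return xbar0, xbar1

random.seed(0)
xbar0, xbar1 = round_nets([0.9, 0.1, 0.5])
print(xbar0, xbar1)
```

Note that every rounded solution automatically satisfies the equality constraint x_i0 + x_i1 = 1; only the boundary (congestion) constraint is at risk, which is what the theorem below controls.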
Theorem:
Let ε be a real number such that 0 < ε < 1. Then with probability at least 1 − ε, the global wiring S produced by randomized rounding satisfies
wS ≤ ŵ(1 + Δ(ŵ, ε/2n)) ≤ w0(1 + Δ(w0, ε/2n))
Proof:
To prove this, let us first briefly introduce Poisson trials and Chernoff bounds.
Poisson Trials
Unlike Bernoulli trials, which share one fixed probability, we allow every Xi to have a different probability: Pr[Xi = 1] = pi and Pr[Xi = 0] = 1 − pi. Such events are called Poisson trials. A single Poisson trial by itself is just a Bernoulli trial; the name applies to a collection of trials with different probabilities. It is essential, however, that the Xi remain independent.
Chernoff bounds are a kind of tail bound. Like the Markov and Chebyshev inequalities, they bound the probability mass of a random variable that lies in the “tail”, i.e. far from the mean.
Let X1, X2, …, Xn be independent Poisson trials with Pr[Xi = 1] = pi. Let X = Σ Xi and μ = E[X]. Then for any δ > 0:
Pr[X > (1 + δ)μ] < [e^δ / (1 + δ)^(1+δ)]^μ    ——(1)
Here Δ(μ, ε) denotes the deviation such that choosing δ = Δ(μ, ε) suffices to keep Pr[X > (1 + δ)μ] below ε, irrespective of the values of n and the pi.
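As a sanity check on (1), the snippet below (with arbitrary, made-up trial probabilities) compares the empirical tail probability of a sum of independent Poisson trials against the right-hand side of the bound:

```python
import math
import random

def chernoff_bound(mu, delta):
    """Right-hand side of (1): [e^delta / (1+delta)^(1+delta)]^mu."""
    return (math.exp(delta) / (1 + delta) ** (1 + delta)) ** mu

# Arbitrary trial probabilities; X is the sum of 30 independent Poisson trials.
ps = [0.2, 0.5, 0.3, 0.7, 0.1, 0.6] * 5
mu = sum(ps)        # E[X] = 12.0
delta = 0.5

random.seed(1)
trials = 20_000
hits = sum(
    sum(1 for p in ps if random.random() < p) > (1 + delta) * mu
    for _ in range(trials)
)
print(f"empirical tail {hits / trials:.4f} <= bound {chernoff_bound(mu, delta):.4f}")
```

The bound is loose for small deviations, but it decays exponentially in μ, which is exactly what the proof below exploits.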
Now let us prove the theorem.
Consider a boundary b. Since the LP solution satisfies the constraints,
Σ_{i∈Tb0} x̂_i0 + Σ_{i∈Tb1} x̂_i1 ≤ ŵ
The number of wires passing through b after rounding is
wS(b) = Σ_{i∈Tb0} x̄_i0 + Σ_{i∈Tb1} x̄_i1
Since the x̄ values are independent Poisson trials,
E[wS(b)] = Σ_{i∈Tb0} E[x̄_i0] + Σ_{i∈Tb1} E[x̄_i1] = Σ_{i∈Tb0} x̂_i0 + Σ_{i∈Tb1} x̂_i1 ≤ ŵ
By the Chernoff bound given in equation (1),
Pr[wS(b) > ŵ(1 + Δ(ŵ, ε/2n))] ≤ ε/2n
Thus for any particular boundary the failure probability is at most ε/2n.
In a √n × √n array there are fewer than 2n boundaries. Summing the failure probabilities over all boundaries therefore bounds the total failure probability by (2n)(ε/2n) = ε. Since ŵ ≤ w0, with probability at least 1 − ε,
wS ≤ ŵ(1 + Δ(ŵ, ε/2n)) ≤ w0(1 + Δ(w0, ε/2n))
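Putting the pieces together, this small simulation (the fractional solution and the boundary sets are invented for illustration) rounds a hypothetical LP solution many times and compares the rounded congestion wS with the fractional congestion ŵ; on this toy instance the rounded wiring stays within a small constant factor of ŵ:

```python
import random

# Hypothetical fractional LP solution for 3 nets (values of x_hat_{i0});
# T0[b] / T1[b] list the nets crossing boundary b under the horizontal-first /
# vertical-first routing. All numbers here are invented for illustration.
x_hat0 = [0.5, 0.8, 0.3]
T0 = {"b1": [0, 1], "b2": [2], "b3": [], "b4": [1]}
T1 = {"b1": [], "b2": [0], "b3": [1, 2], "b4": [0]}

def lp_congestion():
    """Fractional congestion w-hat of the relaxed solution."""
    return max(sum(x_hat0[i] for i in T0[b]) +
               sum(1 - x_hat0[i] for i in T1[b]) for b in T0)

def rounded_congestion():
    """Congestion wS of one randomized rounding of x_hat0."""
    x0 = [1 if random.random() < p else 0 for p in x_hat0]
    return max(sum(x0[i] for i in T0[b]) +
               sum(1 - x0[i] for i in T1[b]) for b in T0)

random.seed(2)
w_hat = lp_congestion()
results = [rounded_congestion() for _ in range(10_000)]
print(w_hat, sum(results) / len(results), max(results))
```

The average rounded congestion sits somewhat above ŵ, as the theorem predicts; the Δ(ŵ, ε/2n) term quantifies exactly how much slack is needed to make the overshoot unlikely on all boundaries at once.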
Some applications of randomized rounding:
- VLSI wiring problems
- MAX-SAT problems
- Primality testing
- Routing
[Figure: a grid of gates with four nets, labeled 1–4, routed along the gate boundaries.]