4.6 Steady State Probabilities.

In the previous section we saw how to compute the transition matrix P(t) = e^{tQ}, where Q is the generator matrix. We saw that each entry of P(t) is a constant plus a sum of multiples of e^{λ_j t}, where the λ_j are the eigenvalues of the generator matrix Q. The non-zero eigenvalues of Q are either negative or have negative real part, so the non-constant terms e^{λ_j t} go to zero as t → ∞. So P(t) approaches a limiting matrix, which we shall denote by P(∞), as t tends to ∞. This is similar to what happened in the case of a Markov chain. The rows of this limiting matrix contain the probabilities of being in the various states as time gets large. These probabilities are called steady state probabilities. Let's look at the examples of the previous section.

Example 1. (a continuation of Example 1 in section 4.5) We have a machine which at any time can be in either of two states, working = 1 or broken = 2. Let X(t) be the random variable corresponding to the state of the machine at time t. Suppose the times between transitions from working to broken are exponential random variables with mean 1/2 week, so q12 = 2, and the times between transitions from broken to working are exponential random variables with mean 1/9 week, so q21 = 9. Suppose all these random variables along with X(0) are independent, so that X(t) is a Markov process. The generator matrix is

Q = \begin{pmatrix} -2 & 2 \\ 9 & -9 \end{pmatrix}

In the previous section we saw that

P(t) = e^{tQ} = \begin{pmatrix} 9/11 + (2/11)e^{-11t} & 2/11 - (2/11)e^{-11t} \\ 9/11 - (9/11)e^{-11t} & 2/11 + (9/11)e^{-11t} \end{pmatrix}

As t → ∞ one has

P(t) → P(∞) = \begin{pmatrix} 9/11 & 2/11 \\ 9/11 & 2/11 \end{pmatrix}

since e^{-11t} → 0. In fact within a week one has P(t) quite close to P(∞), since e^{-11} ≈ 0.00002. So for any time a week or more in the future one has

Pr{working} ≈ 9/11

Pr{broken} ≈ 2/11

no matter whether it is working or broken now. In the long run it is working 9/11 of the time and broken 2/11 of the time. These probabilities

π1 = lim_{t→∞} Pr{ X(t) = 1 | X(0) = 1} = lim_{t→∞} Pr{ X(t) = 1 | X(0) = 2} = 9/11

π2 = lim_{t→∞} Pr{ X(t) = 2 | X(0) = 1} = lim_{t→∞} Pr{ X(t) = 2 | X(0) = 2} = 2/11

are called the steady state probabilities of being in states 1 and 2. In many applications of Markov processes the steady state probabilities are the main items of interest since one is interested in the long run behavior of the system. For the time being we shall assume the following about the generator matrix Q.

(1) The eigenvalue zero is not repeated.

(2) The other eigenvalues are negative or have negative real part.
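As a quick numerical check of (1) and (2) for the generator of Example 1, one can simply ask NumPy for the eigenvalues of Q. This is an illustrative sketch, not part of the original computation:

    import numpy as np

    # Generator matrix from Example 1 (states: 1 = working, 2 = broken).
    Q = np.array([[-2.0,  2.0],
                  [ 9.0, -9.0]])

    # (1): zero should be a simple (non-repeated) eigenvalue.
    # (2): every other eigenvalue should be negative or have negative real part.
    print(np.linalg.eigvals(Q))   # approximately [0, -11], so (1) and (2) hold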

Then as t → ∞ the transition matrix

P(t) = e^{tQ} = Te^{tD}T^{-1} = T \begin{pmatrix} 1 & 0 & \cdots & 0 \\ 0 & e^{\lambda_2 t} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & e^{\lambda_n t} \end{pmatrix} T^{-1}

approaches

P(∞) = T \begin{pmatrix} 1 & 0 & \cdots & 0 \\ 0 & 0 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 0 \end{pmatrix} T^{-1}

Here D = diag(0, λ2, …, λn) and the columns of T are right eigenvectors of Q. Since the rows of Q sum to zero, the column vector of all ones is a right eigenvector of Q with eigenvalue 0, so we may take the first column of T to be all ones. If we let

(π1, π2, …, πn) = first row of T^{-1}

then

P(∞) = \begin{pmatrix} 1 \\ 1 \\ \vdots \\ 1 \end{pmatrix} (\pi_1, \pi_2, \ldots, \pi_n) = \begin{pmatrix} \pi_1 & \pi_2 & \cdots & \pi_n \\ \pi_1 & \pi_2 & \cdots & \pi_n \\ \vdots & \vdots & & \vdots \\ \pi_1 & \pi_2 & \cdots & \pi_n \end{pmatrix}

So the rows of P(∞) are all equal to the first row of T^{-1}, which is a left eigenvector of Q with eigenvalue 0. In other words

(3) πQ = 0

where

π = (π1, π2, …, πn)

So the rows of P(∞) are vectors π which are solutions of the equation πQ = 0 and which are probability vectors, i.e. the sum of the entries of π equals 1. The entries of π are called steady state probabilities.
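Equation (3) together with the condition that the entries of π sum to 1 determines π, and it is easy to solve numerically. Here is a minimal Python sketch (the helper name steady_state is made up for this sketch): it stacks the equations πQ = 0 with the normalization row and solves by least squares.

    import numpy as np

    def steady_state(Q):
        """Solve pi Q = 0 together with sum(pi) = 1 for a generator matrix Q."""
        n = Q.shape[0]
        # pi Q = 0 is Q^T pi^T = 0; append a row of ones for the normalization.
        A = np.vstack([Q.T, np.ones(n)])
        b = np.zeros(n + 1)
        b[-1] = 1.0
        pi, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pi

    # Example 1: expect (9/11, 2/11) = (0.8182, 0.1818).
    Q = np.array([[-2.0,  2.0],
                  [ 9.0, -9.0]])
    print(steady_state(Q))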

Another way to see that the rows of P(∞) satisfy (3) is to start with the Kolmogorov equation P′(t) = P(t)Q and let t → ∞. Since P(t) → P(∞), one has P′(t) → P(∞)Q. However, since P(t) converges, the only way this can be consistent is for P′(t) → 0. So P(∞)Q = 0. Since the ith row of P(∞)Q is the ith row of P(∞) times Q, we see the rows of P(∞) satisfy (3).
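One can watch this happen numerically. The following sketch (Python with SciPy, using the generator from Example 1) computes P(t) = e^{tQ} for a couple of values of t and checks that the rows have essentially converged and that P(t)Q is nearly the zero matrix:

    import numpy as np
    from scipy.linalg import expm

    Q = np.array([[-2.0,  2.0],
                  [ 9.0, -9.0]])

    for t in (1.0, 5.0):
        P = expm(t * Q)               # P(t) = e^{tQ}
        print(P)                      # each row approaches (9/11, 2/11)
        print(np.abs(P @ Q).max())    # P'(t) = P(t)Q is nearly the zero matrix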

Example 1 (continued). Find the steady state vector π = (π1, π2) in Example 1.

Since πQ = 0 we get

(\pi_1, \pi_2) \begin{pmatrix} -2 & 2 \\ 9 & -9 \end{pmatrix} = (0, 0)

Therefore

-2π1 + 9π2 = 0

2π1 - 9π2 = 0

Therefore π2 = 2π1/9

In order to find π1 and π2 we need to use the fact that π1 + π2 = 1. Combining this with π2 = 2π1/9 gives π1 + 2π1/9 = 1, or 11π1/9 = 1, so π1 = 9/11 and π2 = 2/11.

Example 2. (see Example 1 of section 4.1) The condition of an office copier at any time t is good = 1, poor = 2, or broken = 3. In Example 1 of section 4.1 the generator matrix was

Q = \begin{pmatrix} -1/5 & 1/20 & 3/20 \\ 1/50 & -1/2 & 12/25 \\ 12/25 & 3/25 & -3/5 \end{pmatrix} = \begin{pmatrix} -0.2 & 0.05 & 0.15 \\ 0.02 & -0.5 & 0.48 \\ 0.48 & 0.12 & -0.6 \end{pmatrix}

The equation πQ = 0 is

-0.2π1 + 0.02π2 + 0.48π3 = 0

0.05π1 - 0.5π2 + 0.12π3 = 0

0.15π1 + 0.48π2 - 0.6π3 = 0

These equations are dependent since they sum to 0 = 0. So we will only use the first two equations. Multiplying the first equation by 50 and the second by 100 we get

-10π1 + π2 + 24π3 = 0

5π1 - 50π2 + 12π3 = 0

Multiply the second equation by 2 and add to the first equation.

-10π1 + π2 + 24π3 = 0

10π1 - 100π2 + 24π3 = 0
_____________________________

-99π2 + 48π3 = 0

Dividing by 3 gives

-33π2 + 16π3 = 0

So

π2 = (16/33)π3

Substituting into -10π1 + π2 + 24π3 = 0 we get

π1 = (π2 + 24π3)/10 = ((16/33)π3 + 24π3)/10 = (808/330)π3 = (404/165)π3

Substituting into π1 + π2 + π3 = 1 gives (404/165)π3 + (16/33)π3 + π3 = 1, or (649/165)π3 = 1. Therefore

π3 = 165/649 ≈ 0.25    π2 = 80/649 ≈ 0.12    π1 = 404/649 ≈ 0.62

So the copier is in good condition about 62% of the time, in poor condition about 12% of the time and broken about 25% of the time.
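As a numerical check, the sketch below (Python with SciPy; Q is the decimal form read off from the scaled equations above) computes π as the normalized left null vector of Q:

    import numpy as np
    from scipy.linalg import null_space

    # Copier generator (states: 1 = good, 2 = poor, 3 = broken), decimal form.
    Q = np.array([[-0.20,  0.05,  0.15],
                  [ 0.02, -0.50,  0.48],
                  [ 0.48,  0.12, -0.60]])

    # pi Q = 0 means pi spans the null space of Q^T (as a column vector).
    v = null_space(Q.T)[:, 0]
    pi = v / v.sum()                  # normalize so the entries sum to 1
    print(pi)                         # approximately (0.62, 0.12, 0.25)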

Example 3. (see Example 2 of section 4.1) The time between customer arrivals at Sam's Barber Shop is an exponential random variable with mean 1/2 hour. However, customers will only wait if there is no one already waiting. The time it takes Sam to give a haircut is an exponential random variable with mean 1/3 hour. Assume all times are independent of all the other times. Let N(t) be the number of customers in the shop, including the one currently getting a haircut. The generator matrix is

Q = \begin{pmatrix} -2 & 2 & 0 \\ 3 & -5 & 2 \\ 0 & 3 & -3 \end{pmatrix}

The equation πQ = 0 is

-2π0 + 3π1 = 0

2π0 - 5π1 + 3π2 = 0

2π1 - 3π2 = 0

Again, these equations are dependent since they sum to 0 = 0. So we will only use the first and third equations. The first equation says π0 = (3/2)π1 and the third equation says π2 = (2/3)π1. Substituting into π0 + π1 + π2 = 1 gives (3/2)π1 + π1 + (2/3)π1 = 1, or (19/6)π1 = 1. Therefore

π1 = 6/19 ≈ 0.32    π0 = 9/19 ≈ 0.47    π2 = 4/19 ≈ 0.21

So the barber has no customers about 47% of the time, one customer about 32% of the time and two customers about 21% of the time.
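A quick numerical check of this example, using the same least-squares idea as before (Python with NumPy):

    import numpy as np

    # Barber shop generator (states: 0, 1, 2 customers in the shop).
    # Arrivals at rate 2 per hour, haircuts at rate 3 per hour, at most one waiting.
    Q = np.array([[-2.0,  2.0,  0.0],
                  [ 3.0, -5.0,  2.0],
                  [ 0.0,  3.0, -3.0]])

    # Solve pi Q = 0 together with sum(pi) = 1 by least squares.
    A = np.vstack([Q.T, np.ones(3)])
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(pi)                         # (9/19, 6/19, 4/19) ≈ (0.47, 0.32, 0.21)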

Example 4. (this is problem 4.1 in section 4.8 of Durrett) A salesman flies around between Atlanta (state 1), Boston (state 2) and Chicago (state 3). The time spent in Atlanta is an exponential random variable with mean 1/4 week. Upon leaving Atlanta, he is equally likely to go to Boston or Chicago. The time spent in Boston is an exponential random variable, also with mean 1/4 week. After leaving Boston he has a probability of 3/4 of going to Atlanta and 1/4 of going to Chicago. The time spent in Chicago is an exponential random variable with mean 1/5 week. After leaving Chicago he always goes to Atlanta. Assume all times and where he goes after leaving a city are independent of all the other times and where he goes. Let X(t) be his location at time t. The generator matrix is

Q = \begin{pmatrix} -4 & 2 & 2 \\ 3 & -4 & 1 \\ 5 & 0 & -5 \end{pmatrix}

The equation πQ = 0 is

-4π1 + 3π2 + 5π3 = 0

2π1 - 4π2 = 0

2π1 + π2 - 5π3 = 0

Again, these equations are dependent since they sum to 0 = 0. So we will only use the second and third equations. The second equation says π1 = 2π2. Putting this into the third equation gives 4π2 + π2 - 5π3 = 0, so π3 = π2. Substituting into π1 + π2 + π3 = 1 gives 2π2 + π2 + π2 = 1, or 4π2 = 1. Therefore

π2 = 1/4 = 0.25    π1 = 1/2 = 0.5    π3 = 1/4 = 0.25

So the salesman spends half his time in Atlanta and 1/4 of his time in each of Boston and Chicago.
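A numerical check, which also shows how Q can be assembled from the verbal description: each off-diagonal rate qij is the rate of leaving city i times the probability of then going to city j. A minimal Python sketch (variable names are made up for this sketch):

    import numpy as np

    leave_rate = np.array([4.0, 4.0, 5.0])     # rates of leaving Atlanta, Boston, Chicago
    routing = np.array([[0.0,  0.5,  0.5 ],    # given that he leaves city i,
                        [0.75, 0.0,  0.25],    # row i gives the probabilities
                        [1.0,  0.0,  0.0 ]])   # of the next city

    # q_ij = (rate of leaving i) * (probability of going to j), with the
    # diagonal chosen so each row of Q sums to zero.
    Q = leave_rate[:, None] * routing
    np.fill_diagonal(Q, -leave_rate)

    A = np.vstack([Q.T, np.ones(3)])
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(pi)                         # (0.5, 0.25, 0.25)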
