
Random Number Generators

Professor Karl Sigman
Columbia University, Department of IEOR
New York City, USA


Introduction

Your computer "generates" numbers U1, U2, U3, . . . that are treated as independent and uniformly distributed on the continuous interval (0, 1).

Recall that the probability distribution (cumulative distribution function) of such a uniformly distributed random variable U is given by

F(x) = P(U ≤ x) = x,  x ∈ (0, 1),

and more generally, for 0 ≤ y < x < 1,

P(y < U ≤ x) = x - y.

In other words, the probability that U falls in an interval (y, x] is just the length of the interval, x - y.
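
As a quick illustration (a minimal sketch, not from the slides; the values y = 0.2, x = 0.7 and the sample size are arbitrary choices), we can estimate P(y < U ≤ x) by sampling and compare it to x - y:

import random

y, x = 0.2, 0.7                  # any 0 <= y < x < 1
n = 100_000                      # illustrative sample size
count = sum(1 for _ in range(n) if y < random.random() <= x)
print(count / n)                 # should be close to x - y = 0.5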


Introduction

Independence means that regardless of the values of the first n random numbers, U1, . . . , Un, the value of the next one, Un+1, still has the same uniform distribution over (0, 1); it is not in any way affected by those previous values.

This is analogous to sequentially flipping a (fair) coin: regardless of the first n flips, the next one will still land heads (H) or tails (T) with probability 1/2.

The sequence of random variables (rvs) U1, U2, . . . is an example of an independent and identically distributed (iid) sequence. Here, the identical distribution is the uniform over (0, 1), which is a continuous analog of "equally likely" probabilities.
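
To mirror the coin analogy in code (an assumed illustration, not from the slides), we can turn iid uniforms into iid fair coin flips:

import random

# Each uniform U gives a fair flip: H if U < 1/2, T otherwise.
flips = ['H' if random.random() < 0.5 else 'T' for _ in range(10)]
print(flips)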


Introduction

In Python, for example, you can obtain such a U as follows:

import random
U = random.random()

Once random is imported, each time you use the command U = random.random() you receive a new uniform number in [0, 1).
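
For instance (an illustrative sketch), repeated calls produce the sequence U1, U2, U3, . . .:

import random

for i in range(3):
    U = random.random()          # a fresh uniform number in [0, 1) on each call
    print(i + 1, U)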


Introduction

It turns out that once we have access to such uniform numbers U, we can use them to construct ("simulate"/"generate") random variables of any desired distribution, and to construct stochastic processes such as random walks, Markov chains, Poisson processes, renewal processes, Brownian motion, and many other processes. Suppose, for example, that we want a random variable X that has an exponential distribution at rate λ: the cumulative distribution function (CDF) is given by

F(x) = P(X ≤ x) = 1 - e^(-λx),  x ≥ 0.


Introduction

Then simply define

X = -(1/λ) ln(U),

where ln(y) denotes the natural logarithm of y > 0.

Proof:

P(X ≤ x) = P(-(1/λ) ln(U) ≤ x)
         = P(ln(U) ≥ -λx)
         = P(U ≥ e^(-λx))
         = 1 - e^(-λx).

(Recall that P(U ≥ y) = 1 - y, y ∈ (0, 1).)
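
The proof above translates directly into code. Below is a minimal sketch (the function name and the rate λ = 2 are illustrative choices); Python's standard library also provides random.expovariate(lam), which generates the same distribution:

import math
import random

def exponential(lam):
    # Inverse-transform method: X = -(1/lam) * ln(U).
    U = random.random()
    while U == 0.0:              # random.random() can return 0.0, where ln is undefined
        U = random.random()
    return -math.log(U) / lam

lam = 2.0
samples = [exponential(lam) for _ in range(100_000)]
print(sum(samples) / len(samples))   # sample mean should be close to 1/lam = 0.5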


Pseudorandom numbers

It turns out that the numbers generated by a computer are not really random or independent as we have described, but are what are called pseudorandom numbers.

This means that they appear, for all practical purposes, to be random (and independent) in the sense that they would pass various statistical tests for checking the random/independent property. We thus can use them in our simulations as if they were truly random--and we do.

Next we will discuss how the computer generates these numbers. This is deeply related to "cryptography" in which one wants to hide important information (from an opponent/enemy) in data that "appears" to be random.
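
A small sketch of both points (an assumed illustration; the seed value is arbitrary): re-seeding Python's generator reproduces the exact same sequence, showing that the numbers are deterministic, yet simple summary statistics still look like those of uniforms on (0, 1):

import random

random.seed(2024)                    # illustrative seed; fixes the pseudorandom sequence
first = [random.random() for _ in range(5)]
random.seed(2024)                    # same seed again
second = [random.random() for _ in range(5)]
print(first == second)               # True: the "random" sequence is fully reproducible

# Crude check of uniformity: mean should be near 1/2, variance near 1/12.
n = 100_000
xs = [random.random() for _ in range(n)]
mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / n
print(mean, var)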


Pseudorandom numbers

As an example to get you to think: Suppose I hand you a sequence of zeros and ones:

(0, 1, 1, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 0, 1, 0, 0, 1).

I tell you that I flipped a fair coin 25 times, where 1 = Heads (H) and 0 = Tails (T). How can you check with certainty that I am telling the truth?

ANSWER: You can't.
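
The best you can do (an illustrative sketch, not from the slides) is run statistical checks; for example, the fraction of heads in the sequence is close to 1/2, which is consistent with a fair coin but proves nothing:

# The sequence above, with 1 = Heads and 0 = Tails.
seq = [0, 1, 1, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0,
       1, 1, 0, 1, 1, 0, 1, 0, 1, 0, 0, 1]
print(sum(seq) / len(seq))           # 0.52, close to 1/2, but not a proof of fairness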

