CHAPTER 1 PROBABILITY



1. COMBINATORIAL ANALYSIS

1.1 Counting Principles

1. Theorem (The basic principle of counting): If the set E contains n elements and the set F contains m elements, there are nm ways in which we can choose, first, an element of E and then an element of F.

2. Theorem (The generalized basic principle of counting): If r experiments that are to be performed are such that the first one may result in any of $n_1$ possible outcomes, and if for each of these $n_1$ possible outcomes there are $n_2$ possible outcomes of the second experiment, and if for each of the possible outcomes of the first two experiments there are $n_3$ possible outcomes of the third experiment, and if …, then there is a total of $n_1 n_2 \cdots n_r$ possible outcomes of the r experiments.

3. Theorem: A set with n elements has $2^n$ subsets.

4. Tree diagrams


1.2 Permutations

1. Permutation: an ordered arrangement of objects; the number of permutations of n distinct objects is $n! = n(n-1)(n-2)\cdots 2 \cdot 1$.

The number of permutations of n things taken r at a time: ${}_{n}P_{r} = \dfrac{n!}{(n-r)!}$

2. Theorem: The number of distinguishable permutations of n objects of k different types, where $n_1$ are alike, $n_2$ are alike, …, $n_k$ are alike and $n_1 + n_2 + \cdots + n_k = n$, is $\dfrac{n!}{n_1!\,n_2!\cdots n_k!}$

1.3 Combinations

1. Combination: The number of combinations of n things taken r at a time: $\binom{n}{r} = \dfrac{n!}{r!\,(n-r)!}$ (combinatorial coefficient; binomial coefficient)

2. Binomial theorem: $(x+y)^n = \sum_{r=0}^{n} \binom{n}{r} x^{n-r} y^{r}$

3. Multinomial expansion: In the expansion of $(x_1 + x_2 + \cdots + x_k)^n$, the coefficient of the term $x_1^{n_1} x_2^{n_2} \cdots x_k^{n_k}$, $n_1 + n_2 + \cdots + n_k = n$, is $\dfrac{n!}{n_1!\,n_2!\cdots n_k!}$. Therefore, $(x_1 + x_2 + \cdots + x_k)^n = \sum \dfrac{n!}{n_1!\,n_2!\cdots n_k!}\, x_1^{n_1} x_2^{n_2} \cdots x_k^{n_k}$. Note that the sum is taken over all nonnegative integers $n_1$, $n_2$, …, $n_k$ such that $n_1 + n_2 + \cdots + n_k = n$.
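A quick numerical check of the binomial theorem, sketched in Python (the values x = 2, y = 3, n = 5 are arbitrary illustration choices):

```python
import math

# (x + y)^n should equal the sum of C(n, r) * x^(n-r) * y^r over r.
x, y, n = 2.0, 3.0, 5
lhs = (x + y) ** n
rhs = sum(math.comb(n, r) * x ** (n - r) * y ** r for r in range(n + 1))
print(lhs, rhs)  # both print 3125.0
```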

1.4 The Number of Integer Solutions of Equations

1. There are $\binom{n-1}{k-1}$ distinct positive integer-valued vectors $(x_1, x_2, \ldots, x_k)$ satisfying $x_1 + x_2 + \cdots + x_k = n$, $x_i > 0$, $i = 1, 2, \ldots, k$.

2. There are $\binom{n+k-1}{k-1}$ distinct nonnegative integer-valued vectors $(x_1, x_2, \ldots, x_k)$ satisfying $x_1 + x_2 + \cdots + x_k = n$.

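Both counts can be checked by brute force for small cases; a minimal sketch (n = 6 and k = 3 are arbitrary small values chosen so the enumeration stays fast):

```python
import math
from itertools import product

# Count solutions of x1 + x2 + x3 = n directly and compare with the
# "stars and bars" formulas above.
n, k = 6, 3

positive = sum(1 for v in product(range(1, n + 1), repeat=k) if sum(v) == n)
nonnegative = sum(1 for v in product(range(n + 1), repeat=k) if sum(v) == n)

print(positive, math.comb(n - 1, k - 1))         # 10 10
print(nonnegative, math.comb(n + k - 1, k - 1))  # 28 28
```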

2. AXIOMS OF PROBABILITY

2.1 Sample Space and Events

1. Set theory concepts: set, element, roster method, rule method, subset, null set (empty set).

2. Complement: The complement of an event A with respect to S is the subset of all elements of S that are not in A. We denote the complement of A by the symbol $A'$ (or $A^c$).

3. Intersection: The intersection of two events A and B, denoted by the symbol $A \cap B$, is the event containing all elements that are common to A and B.

-- Two events A and B are mutually exclusive, or disjoint, if $A \cap B = \emptyset$; that is, if A and B have no elements in common.

4. Union: The union of the two events A and B, denoted by the symbol $A \cup B$, is the event containing all the elements that belong to A or B or both.

5. Venn diagram: a pictorial representation of events as regions inside a rectangle that stands for the sample space S.

6. Sample space of an experiment: the set of all possible outcomes (points)

7. Events: subsets of the sample space

impossible event (impossibility): $\emptyset$; sure event (certainty): S.


8. De Morgan's laws: $(A \cup B)^c = A^c \cap B^c$ and $(A \cap B)^c = A^c \cup B^c$; more generally, $\left(\bigcup_i A_i\right)^c = \bigcap_i A_i^c$ and $\left(\bigcap_i A_i\right)^c = \bigcup_i A_i^c$.

2.2 Axioms of Probability

1. Probability axioms: (1) $0 \le P(A) \le 1$ for every event A;

(2) $P(S) = 1$;

(3) $P\left(\bigcup_{i=1}^{\infty} A_i\right) = \sum_{i=1}^{\infty} P(A_i)$ if $A_1, A_2, \ldots$ is a sequence of mutually exclusive events.

2. Equally likely outcomes: the probabilities of the single-element events are all equal

2.3 Basic Theorems

1. (1) $P(\emptyset) = 0$;

(2) if $A \subseteq B$, then $P(A) \le P(B)$;

(3) complementary events: $P(A^c) = 1 - P(A)$;

(4) inclusion-exclusion principle: $P(A \cup B) = P(A) + P(B) - P(A \cap B)$;

(5) if $A_1, A_2, \ldots, A_n$ is a partition of sample space S, then $P(A_1) + P(A_2) + \cdots + P(A_n) = 1$;

(6) if A and $A'$ are complementary events, then $P(A) + P(A') = 1$.

3. CONDITIONAL PROBABILITY AND INDEPENDENCE

3.1 Conditional Probability

1. Conditional probability: $P(A \mid B) = \dfrac{P(A \cap B)}{P(B)}$, provided that $P(B) > 0$.

2. If in an experiment the events A and B can both occur, then $P(A \cap B) = P(A)\,P(B \mid A)$.

Equivalently, $P(A \cap B) = P(B)\,P(A \mid B)$.

The multiplication rule: $P(A_1 \cap A_2 \cap \cdots \cap A_n) = P(A_1)\,P(A_2 \mid A_1)\,P(A_3 \mid A_1 \cap A_2)\cdots P(A_n \mid A_1 \cap A_2 \cap \cdots \cap A_{n-1})$.

3. Partition: Let $\{B_1, B_2, \ldots, B_n\}$ be a set of nonempty subsets of the sample space S of an experiment. If the events $B_1, B_2, \ldots, B_n$ are mutually exclusive and $\bigcup_{i=1}^{n} B_i = S$, the set $\{B_1, B_2, \ldots, B_n\}$ is called a partition of S.

4. Theorem of total probability: If $\{B_1, B_2, \ldots, B_n\}$ is a partition of S, and A is any event, then $P(A) = \sum_{i=1}^{n} P(A \mid B_i)\,P(B_i)$.

5. Bayes' Theorem: If $\{B_1, B_2, \ldots, B_n\}$ is a partition of S, and A is any event, then $P(B_k \mid A) = \dfrac{P(A \mid B_k)\,P(B_k)}{\sum_{i=1}^{n} P(A \mid B_i)\,P(B_i)}$.
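A minimal sketch of the total-probability and Bayes computations; the two-urn setup and its numbers are hypothetical, chosen only for illustration:

```python
# Urn 1 holds 3 red / 7 blue balls, urn 2 holds 8 red / 2 blue; an urn is
# picked at random and one ball is drawn from it.
prior = {"urn1": 0.5, "urn2": 0.5}        # P(B_i)
likelihood = {"urn1": 0.3, "urn2": 0.8}   # P(red | B_i)

# Theorem of total probability: P(red) = sum of P(red | B_i) P(B_i).
p_red = sum(likelihood[u] * prior[u] for u in prior)

# Bayes' theorem: P(B_i | red) = P(red | B_i) P(B_i) / P(red).
posterior = {u: likelihood[u] * prior[u] / p_red for u in prior}
print(p_red)      # 0.55
print(posterior)  # urn1: about 0.273, urn2: about 0.727
```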

3.2 Independence

1. Independent events: A and B are independent events if $P(A \cap B) = P(A)\,P(B)$.

2. Theorem: If A and B are independent, then A and $B^c$ are independent; likewise, $A^c$ and $B^c$ are independent.

3. The events A, B, and C are called independent if $P(A \cap B) = P(A)P(B)$, $P(A \cap C) = P(A)P(C)$, $P(B \cap C) = P(B)P(C)$, and $P(A \cap B \cap C) = P(A)P(B)P(C)$. If A, B, and C are independent events, we say that $\{A, B, C\}$ is an independent set of events.

4. The set of events $\{A_1, A_2, \ldots, A_n\}$ is called independent if for every subset $\{A_{i_1}, A_{i_2}, \ldots, A_{i_k}\}$, $k \ge 2$, of $\{A_1, A_2, \ldots, A_n\}$, $P(A_{i_1} \cap A_{i_2} \cap \cdots \cap A_{i_k}) = P(A_{i_1})\,P(A_{i_2})\cdots P(A_{i_k})$.

4. DISTRIBUTION FUNCTIONS AND DISCRETE RANDOM VARIABLES

4.1 Random Variable

1. Random variable X: Let S be the sample space of an experiment. A real-valued function $X: S \to \mathbb{R}$ is called a random variable of the experiment if, for each interval $I \subseteq \mathbb{R}$, $\{s \in S : X(s) \in I\}$ is an event.

2. Probability function p of a discrete random variable X with possible values $\{x_1, x_2, x_3, \ldots\}$: (a) $p(x) = P(X = x)$ if $x \in \{x_1, x_2, x_3, \ldots\}$; (b) $p(x) = 0$ if $x \notin \{x_1, x_2, x_3, \ldots\}$; (c) $\sum_{i} p(x_i) = 1$.

4.2 Distribution Functions

1. Cumulative distribution function (cdf): $F(t) = P(X \le t)$, $-\infty < t < \infty$.

2. $F$ is non-decreasing; $\lim_{t \to \infty} F(t) = 1$; $\lim_{t \to -\infty} F(t) = 0$.

3. If $a < b$, then $P(a < X \le b) = F(b) - F(a)$; $P(X > a) = 1 - F(a)$; $P(X < a) = F(a^-) = \lim_{t \to a^-} F(t)$.

4. The cdf of a discrete random variable: a step function.

4.3 Expectations of Discrete Random Variables

1. Expected value (mean value or average value or expectation) of a discrete random variable X with probability function p: $E(X) = \sum_{x} x\,p(x)$.

2. Let g be a real-valued function. Then g(X) is a random variable with $E[g(X)] = \sum_{x} g(x)\,p(x)$.

3. Let $g_1, g_2, \ldots, g_n$ be real-valued functions, and let $\alpha_1, \alpha_2, \ldots, \alpha_n$ be real numbers. Then $E[\alpha_1 g_1(X) + \cdots + \alpha_n g_n(X)] = \alpha_1 E[g_1(X)] + \cdots + \alpha_n E[g_n(X)]$.

4.4 Variances of Discrete Random Variables

1. The variance of a random variable X: the average squared distance between X and its mean $\mu = E(X)$:

$\text{Var}(X) = E\big[(X - \mu)^2\big] = E(X^2) - [E(X)]^2$.

2. Standard deviation: $\sigma_X = \sqrt{\text{Var}(X)}$.

3. Let X be a discrete random variable; then $\text{Var}(X) = 0$ if and only if X is a constant with probability 1.

4. Let X be a discrete random variable; then for constants a and b: $\text{Var}(aX + b) = a^2\,\text{Var}(X)$, $\sigma_{aX+b} = |a|\,\sigma_X$.
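These formulas can be checked on a small hypothetical probability function, including the identity $\text{Var}(aX + b) = a^2\,\text{Var}(X)$:

```python
# A hypothetical pmf: values x mapped to p(x); the probabilities sum to 1.
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

mean = sum(x * p for x, p in pmf.items())
var = sum((x - mean) ** 2 * p for x, p in pmf.items())

a, b = 3, 7
pmf_y = {a * x + b: p for x, p in pmf.items()}  # pmf of Y = aX + b
mean_y = sum(y * p for y, p in pmf_y.items())
var_y = sum((y - mean_y) ** 2 * p for y, p in pmf_y.items())

print(mean, var)          # about 1.1 and 0.49
print(var_y, a**2 * var)  # both about 4.41
```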

5. SPECIAL DISCRETE DISTRIBUTIONS

5.1 Bernoulli and Binomial Random Variables

1. Bernoulli trial: an experiment with exactly two possible outcomes, conventionally called success and failure

Bernoulli random variable X with parameter p, where p is the probability of a success:

$P(X = 1) = p$, $P(X = 0) = 1 - p$; equivalently, $p(x) = p^x (1-p)^{1-x}$, $x = 0, 1$

expected value: $E(X) = p$; variance: $\text{Var}(X) = p(1-p)$

Example: If in a throw of a fair die the event of obtaining 4 or 6 is called a success, and the event of obtaining 1, 2, 3, or 5 is called a failure, then the indicator of a success is a Bernoulli random variable with parameter $p = 1/3$.

2. Binomial distribution: the number of successes in n repeated, independent Bernoulli trials

binomial random variable Y with parameters n and p

$P(Y = y) = \binom{n}{y} p^y (1-p)^{n-y}$, $y = 0, 1, 2, \ldots, n$

Expected value: $E(Y) = np$; Variance: $\text{Var}(Y) = np(1-p)$

Example: A restaurant serves 8 entrees of fish, 12 of beef, and 10 of poultry. If customers select from these entrees randomly, what is the probability that two of the next four customers order fish entrees?
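A sketch of the computation for this example: 8 of the 30 entrees are fish, so p = 8/30, and we want $P(Y = 2)$ with n = 4:

```python
import math

# Binomial pmf evaluated at y = 2 successes out of n = 4 trials.
n, p, y = 4, 8 / 30, 2
prob = math.comb(n, y) * p**y * (1 - p) ** (n - y)
print(prob)  # about 0.2295
```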

5.2 Multinomial Random Variables

1. Multinomial trial: an experiment with $k$ different possible outcomes, $k \ge 2$

2. Multinomial distribution: n independent multinomial trials

multinomial random variable $(X_1, X_2, \ldots, X_k)$ with parameters $n, p_1, p_2, \ldots, p_k$; $X_i$: the number of trials resulting in the ith outcome; $p_i$: the probability of the ith outcome on any one trial; $p_1 + p_2 + \cdots + p_k = 1$; $X_1 + X_2 + \cdots + X_k = n$

$P(X_1 = n_1, X_2 = n_2, \ldots, X_k = n_k) = \dfrac{n!}{n_1!\,n_2!\cdots n_k!}\; p_1^{n_1} p_2^{n_2} \cdots p_k^{n_k}$

Example: Draw 15 balls with replacement from a box containing 20 red, 10 white, 30 black, and 50 green. What is the probability of 7R, 2W, 4B, 2G?
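A sketch of the computation: the box holds 110 balls, so drawing with replacement gives fixed category probabilities, and the answer is a single multinomial term:

```python
import math

# P(7 red, 2 white, 4 black, 2 green) in 15 draws with replacement.
counts = {"R": 7, "W": 2, "B": 4, "G": 2}
probs = {"R": 20 / 110, "W": 10 / 110, "B": 30 / 110, "G": 50 / 110}

n = sum(counts.values())  # 15
coef = math.factorial(n) // math.prod(math.factorial(c) for c in counts.values())
prob = coef * math.prod(probs[k] ** counts[k] for k in counts)
print(prob)  # about 0.00017
```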

5.3 Geometric distribution: trial number of the first success to occur in a sequence of independent Bernoulli trials

geometric random variable N with parameter p

$P(N = n) = (1-p)^{n-1}\,p$, $n = 1, 2, 3, \ldots$

geometric series: $\sum_{n=0}^{\infty} r^n = \dfrac{1}{1-r}$ for $|r| < 1$; hence $\sum_{n=1}^{\infty} (1-p)^{n-1} p = 1$

expected value: $E(N) = \dfrac{1}{p}$; variance: $\text{Var}(N) = \dfrac{1-p}{p^2}$

Example: From an ordinary deck of 52 cards we draw cards at random, with replacement, and successively until an ace is drawn. What is the probability that at least 10 draws are needed?

Memoryless property of geometric random variables: In successive independent Bernoulli trials, the probability that the next n outcomes are all failures does not change if we are given that the previous m successive outcomes were all failures: $P(N > n + m \mid N > m) = P(N > n) = (1-p)^n$.
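For the ace example, "at least 10 draws" means the first 9 draws all fail, so the answer is $(48/52)^9$; the sketch below also checks the memoryless identity numerically:

```python
# Geometric tail: P(N > n) = (1 - p)^n with p = 4/52 per draw.
p = 4 / 52
print((1 - p) ** 9)  # P(at least 10 draws), about 0.4871

# Memoryless check: P(N > m + n | N > m) equals P(N > n).
m, n = 5, 3
lhs = (1 - p) ** (m + n) / (1 - p) ** m
print(lhs, (1 - p) ** n)  # both about 0.7865
```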

5.4 Negative binomial distribution: trial number of the rth success to occur

negative binomial random variable X with parameters r, p

$P(X = n) = \binom{n-1}{r-1}\, p^r (1-p)^{n-r}$, $n = r, r+1, r+2, \ldots$

X is the sum of r independent geometric random variables (the waiting times between successes); for $r = 1$ the distribution reduces to the geometric.

expected value: $E(X) = \dfrac{r}{p}$; variance: $\text{Var}(X) = \dfrac{r(1-p)}{p^2}$

Example: Sharon and Ann play a series of backgammon games until one of them wins five games. Suppose that the games are independent and the probability that Sharon wins a game is 0.58. Find the probability that the series ends in seven games.
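The series ends in exactly seven games if either player gets her fifth win on game seven; a sketch of that computation:

```python
import math

def neg_binom_pmf(n, r, p):
    """P(the r-th success occurs on trial n)."""
    return math.comb(n - 1, r - 1) * p**r * (1 - p) ** (n - r)

p_sharon = 0.58
prob = neg_binom_pmf(7, 5, p_sharon) + neg_binom_pmf(7, 5, 1 - p_sharon)
print(prob)  # about 0.2396
```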

5.5 Poisson Distribution

1. The Poisson probability function: $p(k) = P(K = k) = \dfrac{e^{-\lambda} \lambda^k}{k!}$, $k = 0, 1, 2, \ldots$

Poisson random variable K with parameter $\lambda > 0$

expected value: $E(K) = \lambda$; variance: $\text{Var}(K) = \lambda$

2. The Poisson approximation to the binomial: If X is a binomial random variable with parameters n and $p = \lambda/n$, then for large n, $P(X = k) \approx \dfrac{e^{-\lambda} \lambda^k}{k!}$

Example (Application of the Poisson to the number of successes in Bernoulli Trials and the number of Arrivals in a time period): your record as a typist shows that you make an average of 3 mistakes per page. What is the probability that you make 10 mistakes on page 437?

3. Poisson processes

Example: Suppose that children are born at a Poisson rate of five per day in a certain hospital. What is the probability that at least two babies are born during the next six hours?
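A sketch of both Poisson computations (the typist example with $\lambda = 3$, and the birth example where the six-hour rate is $\lambda = 5 \times 6/24 = 1.25$):

```python
import math

def poisson_pmf(k, lam):
    """P(K = k) for a Poisson random variable with parameter lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Typist: lambda = 3 mistakes per page; P(exactly 10 mistakes on a page).
print(poisson_pmf(10, 3))  # about 0.00081

# Births: P(at least 2 in six hours) = 1 - P(0) - P(1).
lam = 5 * 6 / 24
print(1 - poisson_pmf(0, lam) - poisson_pmf(1, lam))  # about 0.3554
```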

6. CONTINUOUS RANDOM VARIABLES

6.1 Probability Density Functions

1. Density: the continuous analogue of the probability function; probabilities come from integrating the density rather than summing

2. Probability density function (pdf) for a continuous random variable X: a function f such that $P(a \le X \le b) = \int_a^b f(x)\,dx$

Properties: $f(x) \ge 0$ for all x; $\int_{-\infty}^{\infty} f(x)\,dx = 1$; $P(X = a) = 0$; $P(a \le X \le b) = P(a < X \le b) = P(a \le X < b) = P(a < X < b) = \int_a^b f(x)\,dx$; $F(x) = \int_{-\infty}^{x} f(t)\,dt$; $f(x) = F'(x)$ wherever f is continuous

Example: Experience has shown that while walking in a certain park, the time X, in minutes, between seeing two people smoking has a density function of the form $f(x) = \lambda x e^{-x}$ for $x > 0$ (and 0 otherwise). (a) Calculate the value of $\lambda$. (b) Find the probability distribution function of X. (c) What is the probability that Jeff, who has just seen a person smoking, will see another person smoking in 2 to 5 minutes?
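Assuming the density $f(x) = \lambda x e^{-x}$ above with $\lambda = 1$ (the value forced by part (a)), part (c) can be checked numerically:

```python
import math

# F(x) = 1 - (1 + x) * exp(-x) is the cdf obtained by integrating
# f(t) = t * exp(-t) from 0 to x; part (c) asks for F(5) - F(2).
def F(x):
    return 1 - (1 + x) * math.exp(-x)

print(F(5) - F(2))  # about 0.3656
```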

6.2 Cumulative Distribution Functions (cdf)

1. $F(t) = P(X \le t) = \int_{-\infty}^{t} f(x)\,dx$; $F'(t) = f(t)$

2. The cdf of a discrete random variable: a step function; the cdf of a continuous random variable: a continuous function

3. The probability function of a discrete random variable: the size of the jump in $F$ at each possible value; the pdf of a continuous random variable: $f(t) = F'(t)$

6.3 Expectations and Variances

1. Definition: If X is a continuous random variable with pdf f, the expected value of X is defined by $E(X) = \int_{-\infty}^{\infty} x f(x)\,dx$.

Example: In a group of adult males, the difference between the uric acid value and 6, the standard value, is a random variable X with a given pdf f. Calculate the mean of these differences for the group, $E(X) = \int_{-\infty}^{\infty} x f(x)\,dx$.

2. Theorem: Let X be a continuous random variable with pdf f(x); then for any function h, $E[h(X)] = \int_{-\infty}^{\infty} h(x) f(x)\,dx$.

3. Corollary: Let X be a continuous random variable with pdf f(x). Let $h_1, h_2, \ldots, h_n$ be real-valued functions and $\alpha_1, \alpha_2, \ldots, \alpha_n$ be real numbers. Then $E[\alpha_1 h_1(X) + \cdots + \alpha_n h_n(X)] = \alpha_1 E[h_1(X)] + \cdots + \alpha_n E[h_n(X)]$.

4. Definition: If X is a continuous random variable with $E(X) = \mu$, then $\text{Var}(X)$ and $\sigma_X$, called the variance and standard deviation of X, respectively, are defined by $\text{Var}(X) = E\big[(X - \mu)^2\big]$, $\sigma_X = \sqrt{E\big[(X - \mu)^2\big]}$.

7. SPECIAL CONTINUOUS DISTRIBUTIONS

7.1 Uniform Random Variable

1. Density of a uniformly distributed random variable on (a, b): $f(x) = \dfrac{1}{b-a}$ if $a < x < b$; $f(x) = 0$ otherwise.

2. The cdf of a uniformly distributed random variable:

$F(x) = 0$ for $x \le a$; $F(x) = \dfrac{x-a}{b-a}$ for $a < x < b$; $F(x) = 1$ for $x \ge b$

3. The expected value and variance of the uniform random variable: $E(X) = \dfrac{a+b}{2}$; $\text{Var}(X) = \dfrac{(b-a)^2}{12}$.

Example: Starting at 5:00 A.M., every half hour there is a flight from San Francisco airport to Los Angeles International airport. Suppose that none of these planes is completely sold out and that they always have room for passengers. A person who wants to fly to L.A. arrives at the airport at a random time between 8:45 A.M. and 9:45 A.M. Find the probability that she waits (a) at most 10 minutes; (b) at least 15 minutes.
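A Monte Carlo check of this example; arrival time is measured in minutes after 8:45, so the relevant departures are at minutes 15, 45, and 75:

```python
import random

def wait_time(arrival):
    # Wait until the next departure at minute 15, 45, or 75.
    return min(t - arrival for t in (15, 45, 75) if t >= arrival)

random.seed(0)
waits = [wait_time(random.uniform(0, 60)) for _ in range(100_000)]
print(sum(w <= 10 for w in waits) / len(waits))  # about 1/3
print(sum(w >= 15 for w in waits) / len(waits))  # about 1/2
```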

7.2 The Exponential Distribution

1. The exponential probability law with parameter $\lambda > 0$: $f(t) = \lambda e^{-\lambda t}$, $F(t) = 1 - e^{-\lambda t}$ for $t \ge 0$

$T_1$ is the time of occurrence of the first event in a Poisson process with parameter $\lambda$, starting at an arbitrary origin $t = 0$; $P(T_1 > t) = P(N(t) = 0) = e^{-\lambda t}$, where the Poisson process $N(t)$ is the number of events to occur in $(0, t]$, with parameter $\lambda t$.

2. The expected value and variance of the exponential random variable: $E(T) = \dfrac{1}{\lambda}$; $\text{Var}(T) = \dfrac{1}{\lambda^2}$.

Example: Suppose that every three months, on average, an earthquake occurs in California. What is the probability that the next earthquake occurs after three but before seven months?
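A sketch of the computation, taking the rate to be $\lambda = 1/3$ per month:

```python
import math

# P(3 < X < 7) = F(7) - F(3) = exp(-3*lam) - exp(-7*lam).
lam = 1 / 3
print(math.exp(-3 * lam) - math.exp(-7 * lam))  # about 0.2709
```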

3. Memoryless feature of the exponential: $P(X > a + b \mid X > a) = P(X > b)$, where a, b are any two positive constants.

7.3 The Erlang Distribution

1. The cdf and pdf for $T_2$: $F_{T_2}(t) = 1 - e^{-\lambda t} - \lambda t\,e^{-\lambda t}$, $f_{T_2}(t) = \lambda^2 t\,e^{-\lambda t}$ for $t \ge 0$

$T_2$ is the time of occurrence of the second event in a Poisson process with parameter $\lambda$, starting at an arbitrary origin $t = 0$; $P(T_2 > t) = P(N(t) \le 1) = e^{-\lambda t} + \lambda t\,e^{-\lambda t}$, where the Poisson process $N(t)$ is the number of events to occur in $(0, t]$, with parameter $\lambda t$.

2. The Erlang probability law with parameters r and $\lambda$: $f(t) = \dfrac{\lambda^r t^{\,r-1} e^{-\lambda t}}{(r-1)!}$ for $t \ge 0$

$T_r$ is the time of occurrence of the rth event in a Poisson process with parameter $\lambda$, starting at an arbitrary origin $t = 0$; $P(T_r > t) = P(N(t) \le r - 1) = \sum_{k=0}^{r-1} \dfrac{e^{-\lambda t} (\lambda t)^k}{k!}$, where $N(t)$ is the number of events to occur in $(0, t]$.

3. The expected value and variance of the Erlang random variable: $E(T_r) = \dfrac{r}{\lambda}$; $\text{Var}(T_r) = \dfrac{r}{\lambda^2}$.

Example: Suppose that, on average, the number of alpha-particles emitted from a radioactive substance is four every second. What is the probability that it takes at least 2 seconds before the next two alpha-particles are emitted?
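A sketch of the computation: with $\lambda = 4$ per second, $T_2 > 2$ exactly when at most one particle is emitted in the first 2 seconds:

```python
import math

# P(T_r > t) = sum over k < r of exp(-lam*t) * (lam*t)^k / k!
lam, t, r = 4, 2, 2
lt = lam * t
prob = sum(math.exp(-lt) * lt**k / math.factorial(k) for k in range(r))
print(prob)  # about 0.0030
```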

7.4 The Gamma Distribution

1. The gamma probability law with parameters n and $\lambda$: $f(t) = \dfrac{\lambda^n t^{\,n-1} e^{-\lambda t}}{\Gamma(n)}$ for $t \ge 0$; $n > 0$, $\lambda > 0$.

Gamma function: $\Gamma(n) = \int_0^{\infty} x^{n-1} e^{-x}\,dx$; $\Gamma(n) = (n-1)\,\Gamma(n-1)$; $\Gamma(n) = (n-1)!$, if n is a positive integer.

The Erlang random variable is a particular case of a gamma random variable, where n is restricted to the integer values $1, 2, 3, \ldots$.

2. The expected value and variance of the gamma random variable: $E(X) = \dfrac{n}{\lambda}$; $\text{Var}(X) = \dfrac{n}{\lambda^2}$.

7.5 The Normal (Gaussian) Distribution

1. The normal probability law with parameters $\mu$ and $\sigma$: $f(x) = \dfrac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2 / (2\sigma^2)}$, for $-\infty < x < \infty$, $\sigma > 0$.

2. The expected value and variance of the normal random variable: $E(X) = \mu$; $\text{Var}(X) = \sigma^2$.

3. Standard normal random variable (the unit normal): Z with $\mu = 0$ and $\sigma = 1$; pdf $\phi(z) = \dfrac{1}{\sqrt{2\pi}}\, e^{-z^2/2}$; cdf $\Phi(z) = P(Z \le z) = \int_{-\infty}^{z} \phi(t)\,dt$.

4. New random variables from old: If X is normal with mean $\mu$ and variance $\sigma^2$, then $X + b$ is normal with mean $\mu + b$ and variance $\sigma^2$; $aX$ is normal with mean $a\mu$ and variance $a^2\sigma^2$; $aX + b$ is normal with mean $a\mu + b$ and variance $a^2\sigma^2$.

5. Switching from a non-unit normal to the unit normal: $Z = \dfrac{X - \mu}{\sigma}$ is standard normal; $P(X \le x) = \Phi\!\left(\dfrac{x - \mu}{\sigma}\right)$; $P(a \le X \le b) = \Phi\!\left(\dfrac{b - \mu}{\sigma}\right) - \Phi\!\left(\dfrac{a - \mu}{\sigma}\right)$.

Example: Suppose that height X is normal with mean $\mu$ inches and standard deviation $\sigma$. You can find the probability that a person picked at random is under 6 feet (72 inches) using only the table for the standard normal: $P(X < 72) = \Phi\!\left(\dfrac{72 - \mu}{\sigma}\right)$.
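A standardizing sketch for this example; the example's numerical mean and standard deviation are not given above, so $\mu = 69$ and $\sigma = 3$ inches are assumed values, for illustration only:

```python
import math

def std_normal_cdf(z):
    # Phi(z) written through the complementary error function.
    return 0.5 * math.erfc(-z / math.sqrt(2))

mu, sigma = 69.0, 3.0  # assumed illustration values
z = (72 - mu) / sigma
print(std_normal_cdf(z))  # P(X < 72) = Phi(1.0), about 0.8413
```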

6. The probability that a normal random variable is within k standard deviations of the mean: $P(\mu - k\sigma \le X \le \mu + k\sigma) = P(-k \le Z \le k) = 2\Phi(k) - 1$, where Z is the unit normal.

7. The normal approximation to the binomial: If X is a binomial random variable with parameters n and p, where n is large, then X is approximately normal with $\mu = np$, $\sigma^2 = np(1-p)$.

Example: A coin has $P(\text{heads}) = 0.3$. Toss the coin 1000 times so that the expected number of heads is 300. Find the probability that the number of heads is 400 or more.
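A sketch of the normal approximation for this example, with the usual continuity correction:

```python
import math

# n = 1000, p = 0.3, so mu = 300 and sigma = sqrt(210); with the
# continuity correction, P(X >= 400) is about P(Z >= (399.5 - mu)/sigma).
n, p = 1000, 0.3
mu = n * p
sigma = math.sqrt(n * p * (1 - p))

z = (399.5 - mu) / sigma
print(0.5 * math.erfc(z / math.sqrt(2)))  # upper tail, about 3e-12
```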

7.6 The Election Problem

7.7 Functions of a Random Variable

1. The distribution function (cdf) method: compute $F_Y(y) = P(g(X) \le y)$ directly; if X is a random variable with cdf $F_X$ and g is a monotonic increasing function on the range of X, then for $Y = g(X)$, $F_Y(y) = P\big(X \le g^{-1}(y)\big) = F_X\big(g^{-1}(y)\big)$

2. The density function (pdf) method: $f_Y(y) = f_X\big(g^{-1}(y)\big)\left|\dfrac{d}{dy}\,g^{-1}(y)\right|$
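Both methods can be sanity-checked by simulation; for $Y = X^2$ with X uniform on (0, 1), the cdf method gives $F_Y(y) = \sqrt{y}$ and hence $f_Y(y) = \dfrac{1}{2\sqrt{y}}$:

```python
import random

# Empirical check that P(Y <= y) is near sqrt(y) for Y = X^2.
random.seed(1)
samples = [random.random() ** 2 for _ in range(100_000)]

y = 0.25
print(sum(s <= y for s in samples) / len(samples))  # near sqrt(0.25) = 0.5
```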

7.8 Simulating a Random Variable

1. Simulating a continuous random variable

2. Simulating a discrete random variable

3. Simulating a mixed random variable
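The standard tool for the continuous case is the inverse-transform method: if U is uniform on (0, 1), then $X = F^{-1}(U)$ has cdf F. A minimal sketch for the exponential case ($\lambda = 2$ is an arbitrary illustration value):

```python
import math
import random

# F(x) = 1 - exp(-lam*x), so F^{-1}(u) = -ln(1 - u) / lam.
lam = 2.0
random.seed(2)
samples = [-math.log(1 - random.random()) / lam for _ in range(100_000)]

print(sum(samples) / len(samples))  # sample mean, near 1/lam = 0.5
```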

8. JOINTLY DISTRIBUTED RANDOM VARIABLES

8.1 Joint Densities

1. Jointly distributed random variables: two or more random variables whose observed values are simultaneously determined by the same random mechanism.

2. For discrete random variables X, Y: the joint probability function is $p(x, y) = P(X = x, Y = y)$; $p(x, y) \ge 0$ for all x, y; $\sum_x \sum_y p(x, y) = 1$. For continuous random variables X, Y: the joint pdf $f(x, y)$ satisfies $f(x, y) \ge 0$ for all x, y; $\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(x, y)\,dx\,dy = 1$; the probability of a region A is given by integrating f over it: $P\big((X, Y) \in A\big) = \iint_A f(x, y)\,dx\,dy$.

3. The joint density of independent random variables: The random variables X, Y are independent if and only if $p(x, y) = p_X(x)\,p_Y(y)$ when X, Y are discrete; $f(x, y) = f_X(x)\,f_Y(y)$ when X, Y are continuous.

4. Uniform joint densities: If X and Y are jointly uniform on a region R, the joint pdf is $f(x, y) = \dfrac{1}{\text{area}(R)}$, for (x, y) in the region.

The joint density of independent uniform random variables: if X is uniform on (a, b), Y is uniform on (c, d), and X and Y are independent, then $f(x, y) = \dfrac{1}{(b-a)(d-c)}$ for $a < x < b$, $c < y < d$.

8.2 Marginal Densities

1. For discrete random variables X, Y with joint probability function $p(x, y)$ and range R, the marginal probability function for X is $p_X(x) = \sum_{y} p(x, y)$ for $x \in R_X$. $R_X$ is the marginal range for X, the set of first elements of R.

2. For continuous random variables X, Y with joint pdf $f(x, y)$ and range R, the marginal pdf for X is $f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,dy$ for $x \in R_X$. $R_X$ is the marginal range for X, the set of first elements of R.
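A minimal sketch of computing marginals from a small hypothetical joint probability table:

```python
# Joint probability function p(x, y) as a dictionary; entries sum to 1.
joint = {
    (0, 0): 0.125, (0, 1): 0.375,
    (1, 0): 0.250, (1, 1): 0.250,
}

p_x, p_y = {}, {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0) + p  # sum over y
    p_y[y] = p_y.get(y, 0) + p  # sum over x

print(p_x)  # {0: 0.5, 1: 0.5}
print(p_y)  # {0: 0.375, 1: 0.625}
```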

8.3 Functions of Several Random Variables

8.4 Sums of Independent Random Variables

1. The sum of independent binomials with a common p: if X is binomial with parameters $(n_1, p)$, Y is binomial with parameters $(n_2, p)$, and X and Y are independent, then $X + Y$ is binomial with parameters $(n_1 + n_2, p)$

2. The sum of independent Poissons: if X is Poisson with parameter $\lambda_1$, Y is Poisson with parameter $\lambda_2$, and X and Y are independent, then $X + Y$ is Poisson with parameter $\lambda_1 + \lambda_2$

3. The sum of independent exponentials with a common $\lambda$: if $X_1, X_2, \ldots, X_r$ are independent exponentials with parameter $\lambda$, then $X_1 + X_2 + \cdots + X_r$ has a gamma (Erlang) distribution with parameters $r$ and $\lambda$

4. The density of the sum of two arbitrary independent random variables: the convolution of the individual pdfs, $f_{X+Y}(t) = \int_{-\infty}^{\infty} f_X(x)\,f_Y(t - x)\,dx$ (see the sketch after this list)

5. The sum of independent normals: if X is normal with mean $\mu_1$ and variance $\sigma_1^2$, Y is normal with mean $\mu_2$ and variance $\sigma_2^2$, and X and Y are independent, then $X + Y$ is normal with mean $\mu_1 + \mu_2$ and variance $\sigma_1^2 + \sigma_2^2$
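A sketch of the convolution in the discrete case (referenced in item 4 above), using the probability functions of two fair dice:

```python
# p_{X+Y}(t) = sum over x of p_X(x) * p_Y(t - x) for independent X, Y.
die = {v: 1 / 6 for v in range(1, 7)}

def convolve(p_x, p_y):
    p_sum = {}
    for x, px in p_x.items():
        for y, py in p_y.items():
            p_sum[x + y] = p_sum.get(x + y, 0) + px * py
    return p_sum

two_dice = convolve(die, die)
print(two_dice[7])  # 6/36, about 0.1667
```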

9. EXPECTATION

9.1 Expectation of a Random Variable

1. Definition of expected value (continuous): $E(X) = \int_{-\infty}^{\infty} x f(x)\,dx$

2. Mean of a uniformly distributed random variable: $E(X) = \dfrac{a+b}{2}$, if X is uniform on $(a, b)$

3. The expected value of $g(X)$, some function of X: $E[g(X)] = \sum_x g(x)\,p(x)$, if X is discrete; $E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x)\,dx$, if X is continuous. The expected value of $g(X, Y)$, some function of X and Y: $E[g(X, Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x, y) f(x, y)\,dx\,dy$

4. Properties of expectation: i. $E(k) = k$ (constant k); ii. $E(X + Y) = E(X) + E(Y)$; iii. $E(kX) = k\,E(X)$ (constant k); iv. If X and Y are independent, then $E(XY) = E(X)\,E(Y)$.

5. The expected value of the normal random variable: $E(X) = \mu$

9.2 Variance

1. Definition of variance (continuous): $\text{Var}(X) = \int_{-\infty}^{\infty} (x - \mu)^2 f(x)\,dx$, where $\mu = E(X)$

2. The variance of a uniformly distributed random variable: $\text{Var}(X) = \dfrac{(b-a)^2}{12}$, if X is uniform on $(a, b)$

3. The variance of the Erlang random variable: $\text{Var}(T_r) = \dfrac{r}{\lambda^2}$

4. The variance of the gamma random variable: $\text{Var}(X) = \dfrac{n}{\lambda^2}$

5. The variance of the normal random variable: $\text{Var}(X) = \sigma^2$

9.3 Conditional Distributions and Independence

1. Conditional probability function: $p_{X \mid Y}(x \mid y) = \dfrac{p(x, y)}{p_Y(y)}$, for $p_Y(y) > 0$

2. Conditional pdf: $f_{X \mid Y}(x \mid y) = \dfrac{f(x, y)}{f_Y(y)}$, for $f_Y(y) > 0$

3. Independence: X and Y are independent if and only if the conditional distribution of X given $Y = y$ coincides with the marginal distribution of X for every y

9.4 Multinomial and Bivariate Normal Probability Laws

1. Multinomial trial

2. Theorem: Multinomial probability function (Section 5.2 above)

3. Bivariate normal probability law: $f(x, y) = \dfrac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}}\,\exp\left\{-\dfrac{1}{2(1-\rho^2)}\left[\left(\dfrac{x-\mu_X}{\sigma_X}\right)^2 - 2\rho\,\dfrac{(x-\mu_X)(y-\mu_Y)}{\sigma_X\sigma_Y} + \left(\dfrac{y-\mu_Y}{\sigma_Y}\right)^2\right]\right\}$, where $\rho$ is the correlation coefficient of X and Y
