Recursion Information
Packet #4: Recursion
Applied Discrete Mathematics
Table of Contents
Recursion Information
Linear Homogeneous Recursion
Recursion Examples
Solving Recursions
Towers of Hanoi
Binary Gray Codes
Big O Information
Recursion Information
I. A recursive form defines a set, an equation, or a process by specifying a starting set or value and giving a rule for continuing to build the set, equation, or process from previously defined items.
II. A closed form is a way of writing a recursive expression or process that does not depend on any previously defined items.
Recursive Form of Recursive Sequences
A recursive definition has two parts:
Basis: must define values for some finite number of elements
Recursion: remaining elements are defined by the recursion relation
Example 4.1:
Define ⌊x⌋ (the "floor" of x) as the integer part after a division on positive integers:
⌊11/2⌋ = 5, since 11/2 = 5 + 1/2
Let's define a sequence by recursion.
Basis: Q(1) = 1
Recursion: Q(n) = Q(⌊n/2⌋) + 1
We could develop the sequence immediately:
Q(1) = 1 Q(2) = 2 Q(3) = 2 Q(4) = 3 Q(5) = 3
Q(6) = 3 Q(7) = 3 Q(8) = 4 … Q(45) = ?
We may not need all of the previous Q's to calculate Q(45):
Q(45) = Q(22) + 1 = Q(11) + 1 + 1 = Q(5) + 1 + 1 + 1 = Q(2) + 4 = Q(1) + 5 = 1 + 5 = 6
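The same shortcut can be run directly; a minimal Python sketch of the recursion, using integer division // for ⌊n/2⌋:

```python
def Q(n):
    """Q(1) = 1; Q(n) = Q(n // 2) + 1 -- counts halvings of n down to 1."""
    if n == 1:
        return 1
    return Q(n // 2) + 1

print(Q(45))  # -> 6, matching the hand calculation above
```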
Closed Form of Linear Homogeneous Recursive Sequences
Some linear recursion forms can be put into closed form by derivation:
Given: S(0) and S(1) (known initial values)
S(n) = A⋅S(n-1) + B⋅S(n-2)
This form lends itself to closed form representation.
Assume: S(n) = a⋅x^n (This is an educated guess.)
Then S(n-1) = a⋅x^(n-1)
S(n-2) = a⋅x^(n-2)
Substitute these into the recursive definition:
S(n) = a⋅x^n = A⋅a⋅x^(n-1) + B⋅a⋅x^(n-2)
a⋅x^(n-2)⋅[x^2 - Ax - B] = 0
For a⋅x^(n-2) ≠ 0:
x^2 - Ax - B = 0 (the characteristic equation)
Using the quadratic formula, there are two roots:
x = r₁ and x = r₂
If r₁ ≠ r₂ then we fix our guess for S:
S(n) = C₁⋅r₁^n + C₂⋅r₂^n
Otherwise, if r₁ = r₂, we must separate the solutions by multiplying one by n:
S(n) = C₁⋅r^n + C₂⋅n⋅r^n
We can use S(0) and S(1) to write two equations and solve for C₁ and C₂.
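For the distinct-root case, C₁ and C₂ come from solving the 2×2 system S(0) = C₁ + C₂ and S(1) = C₁r₁ + C₂r₂. A Python sketch of the whole procedure (the name solve_lhrr is mine; distinct real roots assumed):

```python
import math

def solve_lhrr(A, B, s0, s1):
    """Closed form for S(n) = A*S(n-1) + B*S(n-2); distinct real roots only."""
    disc = A * A + 4 * B               # discriminant of x^2 - A*x - B = 0
    r1 = (A + math.sqrt(disc)) / 2
    r2 = (A - math.sqrt(disc)) / 2
    # Solve S(0) = C1 + C2 and S(1) = C1*r1 + C2*r2 for C1, C2.
    c2 = (s1 - s0 * r1) / (r2 - r1)
    c1 = s0 - c2
    return lambda n: c1 * r1 ** n + c2 * r2 ** n

S = solve_lhrr(5, -6, 2, 5)            # Example 4.1 below: S(n) = 2^n + 3^n
print(round(S(4)))                      # -> 97  (2^4 + 3^4)
```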
Recursion Examples
Example 4.1: S(0) = 2
S(1) = 5
S(n) = 5S(n-1) - 6S(n-2)
x^2 - 5x + 6 = 0   (x-3)(x-2) = 0   Roots: x = 3 and x = 2
S(n) = C₁⋅2^n + C₂⋅3^n
S(0) = 2 = C₁ + C₂
S(1) = 5 = 2C₁ + 3C₂
C₁ = 1 & C₂ = 1
S(n) = 2^n + 3^n
Prove using Second Principle of Induction
Basis: S(0) = 2: 2^0 + 3^0 = 1 + 1 = 2 ✓
S(1) = 5: 2^1 + 3^1 = 2 + 3 = 5 ✓
Assume true for (n-1) and (n-2): S(n-1) = 2^(n-1) + 3^(n-1)
S(n-2) = 2^(n-2) + 3^(n-2)
Prove true for n: S(n) = 2^n + 3^n
S(n) = 5S(n-1) - 6S(n-2) (Recursive definition)
S(n) = 5[2^(n-1) + 3^(n-1)] - 6[2^(n-2) + 3^(n-2)] (Substitute in from Assumption)
= 5[2^(n-1)] - 6[2^(n-2)] + 5[3^(n-1)] - 6[3^(n-2)] (Group like terms)
= 2[2^(n-1)] + 3[3^(n-1)] (since 6⋅2^(n-2) = 3⋅2^(n-1) and 6⋅3^(n-2) = 2⋅3^(n-1))
= 2^n + 3^n
QED
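The induction can also be spot-checked numerically; a quick Python sketch comparing the recurrence with the closed form:

```python
def S_rec(n, memo={0: 2, 1: 5}):
    """Example 4.1 by its recursive definition, with memoization."""
    if n not in memo:
        memo[n] = 5 * S_rec(n - 1) - 6 * S_rec(n - 2)
    return memo[n]

# The closed form 2^n + 3^n matches the recurrence for every n checked.
for n in range(10):
    assert S_rec(n) == 2 ** n + 3 ** n
print(S_rec(5))  # -> 275  (2^5 + 3^5 = 32 + 243)
```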
Example 4.2: S(0) = 2
S(n) = 3S(n-1)
x - 3 = 0   Root: x = 3
S(n) = C₁⋅3^n
S(0) = 2 = C₁
S(n) = 2⋅3^n
Example 4.3: S(0) = C₀
S(n) = aS(n-1) + b
S(1) = aC₀ + b
S(2) = a[aC₀ + b] + b = a^2⋅C₀ + ab + b
S(3) = a[a^2⋅C₀ + ab + b] + b = a^3⋅C₀ + a^2⋅b + ab + b
S(n) = a^n⋅C₀ + b[a^(n-1) + a^(n-2) + … + a + 1]
S(n) = a^n⋅C₀ + b[(a^n - 1)/(a - 1)], a ≠ 1
Let a = 2, b = 1, C₀ = 3
S(0) = 3
S(1) = 7
S(2) = 15
S(3) = 31
…
Using the equation:
S(3) = 2^3⋅3 + 1⋅[(2^3 - 1)/(2 - 1)]
= 24 + 7 = 31
One can use induction to prove that S(n) = a^n⋅C₀ + b[(a^n - 1)/(a - 1)], a ≠ 1, for n ∈ P.
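A quick numeric check of the a = 2, b = 1, C₀ = 3 case in Python:

```python
def closed(n, a=2, b=1, c0=3):
    """S(n) = a^n*C0 + b*(a^n - 1)/(a - 1), valid for a != 1."""
    return a ** n * c0 + b * (a ** n - 1) // (a - 1)

s = 3                      # S(0) = C0
for n in range(1, 6):
    s = 2 * s + 1          # recursive definition S(n) = a*S(n-1) + b
    assert s == closed(n)  # closed form agrees at every step
print(closed(3))  # -> 31
```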
Example 4.4: S(0) = 2
S(1) = 9
S(n) = 6S(n-1) - 9S(n-2)
x^2 - 6x + 9 = 0   (x - 3)^2 = 0   x = 3 and x = 3 - the roots are the same
S(n) = C₁⋅3^n + C₂⋅n⋅3^n (Note the factor of n since the roots are the same!)
S(0) = 2 = C₁
S(1) = 9 = 3C₁ + 3C₂
C₂ = 1
S(n) = 2⋅3^n + n⋅3^n
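The repeated-root closed form can be checked the same way; a short Python sketch:

```python
def S_rec(n):
    """Example 4.4 by its recursive definition (repeated root r = 3)."""
    vals = [2, 9]
    for i in range(2, n + 1):
        vals.append(6 * vals[i - 1] - 9 * vals[i - 2])
    return vals[n]

# Closed form from the notes: S(n) = 2*3^n + n*3^n
for n in range(8):
    assert S_rec(n) == 2 * 3 ** n + n * 3 ** n
print(S_rec(3))  # -> 135  (2*27 + 3*27)
```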
Solving Recursive Sequences
Sometimes we consider recurrences with n/2 as an argument. In order to consider just integer values of the function, we'll restrict n to powers of 2.
For example:
S(1) = 3   Assume n = 2^k
S(n) = 2S(n/2) + 5n   Then k = log₂(n) and S(2^k) = 2⋅S(2^(k-1)) + 5⋅2^k
Write out the first terms with no S's, leaving them factored:
k = 0   S(2^0) = 3 = 2^0⋅3 + 5⋅0⋅2^0
k = 1   S(2^1) = 2⋅3 + 5⋅2^1 = 2^1⋅3 + 5⋅1⋅2^1
k = 2   S(2^2) = 2[2⋅3 + 5⋅2^1] + 5⋅2^2 = 2^2⋅3 + 5⋅2^2 + 5⋅2^2 = 2^2⋅3 + 5⋅2⋅2^2
k = 3   S(2^3) = 2^3⋅3 + 5⋅2^3 + 5⋅2^3 + 5⋅2^3 = 2^3⋅3 + 5⋅3⋅2^3
Make a guess at S(2^k) by finding a pattern:
S(2^k) = 3⋅2^k + 5k⋅2^k (Alternatively, S(n) = 3n + 5n⋅log₂(n))
Prove by induction:
Basis: k = 0, S(2^0) = 3(2^0) + 0 = 3 = S(1) ✓
Assume true for 2^k: S(2^k) = 3⋅2^k + 5k⋅2^k
Prove true for 2^(k+1): S(2^(k+1)) = 3⋅2^(k+1) + 5(k+1)⋅2^(k+1)
S(2^(k+1)) = 2⋅S(2^k) + 5⋅2^(k+1) (recursive definition)
= 2(3⋅2^k + 5k⋅2^k) + 5⋅2^(k+1) = 3⋅2^(k+1) + 5k⋅2^(k+1) + 5⋅2^(k+1)
= 3⋅2^(k+1) + 5(k+1)⋅2^(k+1)
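A Python sketch comparing the recurrence with the guessed closed form (valid when n is a power of 2):

```python
def S(n):
    """S(1) = 3; S(n) = 2*S(n//2) + 5n, for n a power of 2."""
    if n == 1:
        return 3
    return 2 * S(n // 2) + 5 * n

# Closed form from the notes: S(2^k) = 3*2^k + 5k*2^k
for k in range(7):
    n = 2 ** k
    assert S(n) == 3 * n + 5 * k * n
print(S(8))  # -> 144  (3*8 + 5*3*8)
```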
Towers of Hanoi
(We will fill this in during class.)
Object: Move a stack of n disks (stacked in order of decreasing size) from one column (e.g. A) to another designated column (e.g. C). Just one disk can be moved at a time. There are only three columns allowed and at no time can a larger disk cover a smaller disk. How many moves does it take to move all n disks?
[diagram: three poles labeled A, B, C]
Moving just one disk takes one move, so S(1) = 1. Moving 2 disks takes 3 moves, so S(2) = 3. How many moves does it take to move 3?
In general, if you know how to move n disks then to move (n+1) disks:
1. Move the top n as before to intermediate pole. (S(n) moves)
2. Move the bottom largest disk to final pole. (1 move)
3. Move the n from intermediate pole to final pole. (S(n) moves)
[diagram illustrating steps 1-3]
Then the recursion can be written:
S(1)=1 basis
S(n+1) = S(n) + 1 + S(n) = 2S(n) + 1
Towers of Hanoi continued
Use the recursion to fill out the table:
|n |S(n) |n |S(n) |
|1 |1 |5 |31 |
|2 |3 |6 |? |
|3 |7 |7 |? |
|4 |15 |8 |? |
Guess: S(n) = 2^n - 1
Prove by induction:
Basis: S(1) = 2^1 - 1 = 1 ✓
Assume true for n: S(n) = 2^n - 1
Prove true for (n+1): S(n+1) = 2^(n+1) - 1
S(n+1) = 2S(n) + 1 (from recursion)
= 2(2^n - 1) + 1 (substitute in from Assumption)
= 2^(n+1) - 2 + 1 = 2^(n+1) - 1
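The three-step procedure translates directly into code; a Python sketch that returns the actual move list (the pole names are arbitrary):

```python
def hanoi(n, src="A", dst="C", via="B"):
    """Return the list of moves that transfers n disks from src to dst."""
    if n == 0:
        return []
    # move the top n-1 to the spare pole, the largest disk over, then the n-1 back
    return hanoi(n - 1, src, via, dst) + [(src, dst)] + hanoi(n - 1, via, dst, src)

moves = hanoi(4)
print(len(moves))  # -> 15, i.e. 2^4 - 1
```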
A Recursion for Gray Codes
A binary gray code is a list of binary strings such that the neighbors of any string differ by only 1 bit. We say such a list is "gray".
Basis: S(1) = 0, 1 (these two strings differ in only one bit)
Recursion: S(n) = 0S(n-1), 1S'(n-1) (prefix 0 to each string of S(n-1), then prefix 1 to each string of S'(n-1), where S'(n-1) is S(n-1) in reverse order)
S(2) = 00, 01, 11, 10
Let's prove that S(n) is always a gray code.
Basis: S(1) = 0,1 is gray (by inspection)
Assume: S(n) is gray
Prove: S(n+1) is gray
S(n+1) = 0S(n),1S'(n)
Prove 0S(n) and 1S'(n) are gray:
Since S(n) is gray, 0S(n) is gray.
Since S(n) is gray, S'(n) is gray, and so is 1S'(n).
Prove the last string of 0S(n) differs in only one bit from the first string of 1S'(n):
Since S'(n) is the reverse of S(n), the last string (say x) in S(n) is equal to the first string in S'(n). Therefore, the last string of 0S(n) is 0x and the first string of 1S'(n) is 1x, and these two strings differ only by 1 bit.
Prove the last string of 1S'(n) differs in only one bit from the first string of 0S(n):
Since S'(n) is the reverse of S(n), the last string (say y) in S'(n) is equal to the first string in S(n). Therefore, the last string of 1S'(n) is 1y and the first string of 0S(n) is 0y, and these two strings differ only by 1 bit.
Since 0S(n) and 1S'(n) are gray and their first and last strings differ by just one bit each, S(n+1) is gray.
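The recursion translates directly into a generator for gray lists; a minimal Python sketch:

```python
def gray(n):
    """S(1) = ['0','1']; S(n) = 0S(n-1) followed by 1 prefixed to reversed S(n-1)."""
    if n == 1:
        return ["0", "1"]
    prev = gray(n - 1)
    return ["0" + s for s in prev] + ["1" + s for s in reversed(prev)]

print(gray(3))  # -> ['000', '001', '011', '010', '110', '111', '101', '100']
# neighbors differ in exactly one bit, as the proof above guarantees
```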
Big-O Information
I. Definition:
If f and g are functions then f is Big O of g, or f(x) is in O(g(x)), if there exist constants C and k so that
∀x [(x > k) → |f(x)| ≤ C⋅|g(x)|].
In other words, f(x) is smaller than some constant times g(x) whenever x is large enough. O(g(x)) actually defines a set of functions that grow at the rate of a constant times g, or at a slower rate.
II. Generating a Big-O for a function
The general rules:
I. O(c*g(x)), where c is constant (independent of x), is O(g(x)).
II. If f(x) ≤ g(x) then f(x) + g(x) ≤ 2⋅g(x), so O(f(x) + g(x)) = O(g(x)).
III. If a sum has more than 2 terms, count the number of terms the sum has. Then if the sum has m terms and g(x) is the largest term, the sum is ≤ m⋅g(x), so O(sum) = O(m⋅g(x)). Note that m might depend on x!
IV. O(f(x) * g(x)) = O(f(x)) * O(g(x))
V. O(f(x)/g(x)) = O(f(x))/O(g(x)) ** Be careful with division!
Big-O
Formal Definition: f(n) grows no faster than g(n)
f(n) is O(g(n))
if there is a constant c such that f(n) ≤ cg(n) for all n ≥ some k
Purpose: to describe growth rate of a function
e. g. does it grow like log(n)?
like n?
like n^2?
e.g. 2n^2 + 3n - 1 "grows like" n^2
n⋅log(n) - 3n + 7 "grows like" n⋅log(n)
Big-O is used to denote an upper bound on growth rate.
Examples
1. n^2 - n ≤ n^2 for all n ≥ 0
Therefore, n^2 - n is O(n^2)
2. n + 3 ≤ n + n = 2n for n ≥ 3
Therefore, n + 3 is O(n)
3. 5n + log(n) + 1 ≤ 5n + n + n = 7n for n ≥ 1
Therefore, 5n + log(n) + 1 is O(n)
4. 2n^3 - 5n^2 + 6n - 7 is O(n^3)
since 2n^3 - 5n^2 + 6n - 7 ≤ 2n^3 + 6n ≤ 2n^3 + 6n^3 = 8n^3 for n ≥ 1
5. n^2 + 1 is O(n^2)
since n^2 + 1 ≤ 2n^2 ≤ 2n^3 for n ≥ 1
(However, n^2 + 1 is also O(n^3).)
6. log(n^3) is O(log(n))
since log(n^3) = 3⋅log(n)
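The constants and thresholds claimed in these examples can be spot-checked numerically; a small Python sketch for examples 1 and 4:

```python
# Example 1: n^2 - n <= 1 * n^2 holds with C = 1, k = 0
for n in range(1000):
    assert n ** 2 - n <= n ** 2

# Example 4: 2n^3 - 5n^2 + 6n - 7 <= 8n^3 holds with C = 8, k = 1
for n in range(1, 1000):
    assert 2 * n ** 3 - 5 * n ** 2 + 6 * n - 7 <= 8 * n ** 3

print("inequalities hold on the tested range")
```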
Example of Definition of Big-O
Function f(x) is O(g(x)) ⇔ ∃c∃n∀n[(n ≥ n) ⇒ [f(n) ≤ cg(n)]]
(f(x) is O(g(x)) ⇔ there is a constant c and a value nsuch that f(n) ≤ cg(n) whenever n ≥ n)
Negate: (definition of Big-O)
∃c∃n∀n[(n ≥ n) → f(n) ≤ cg(n)] Big-O
¬[∃c∃n∀n[(n ≥ n) → f(n) ≤ cg(n)]] negation of Big-O
∀c∀n∃n¬[(n ≥ n) → f(n) ≤ cg(n)]
∀c∀n∃n¬[¬(n ≥ n) ∨ (f(n) ≤ cg(n))]
∀c∀n∃n[(n ≥ n) ∧ ¬(f(n) ≤ cg(n))]
∀c∀n∃n[(n ≥ n) ∧ (f(n) > cg(n))] negation of Big-O
Merge Sort Recurrence
C(2) = 1
C(n) = 2C(n/2) + n for n ≥ 4
Solving for n = 2^k:
C(2^1) = 1
C(2^2) = (2⋅1) + 2^2
C(2^3) = (2^2⋅1 + 2^3) + 2^3
C(2^4) = (2^3⋅1 + 2^4 + 2^4) + 2^4
C(2^5) = (2^4⋅1 + 2^5 + 2^5 + 2^5) + 2^5
…
C(2^k) = 2^(k-1) + (k-1)⋅2^k
= 2^(k-1) + k⋅2^k - 2^k = k⋅2^k - 2^(k-1)
C(n) = n⋅log₂(n) - n/2 (Can prove by induction)
Therefore, C(n) is O(n⋅log(n))
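A Python check of the closed form against the recurrence, for n a power of 2:

```python
def C(n):
    """C(2) = 1; C(n) = 2*C(n//2) + n, for n a power of 2, n >= 4."""
    if n == 2:
        return 1
    return 2 * C(n // 2) + n

# Closed form from the notes: C(n) = n*log2(n) - n/2
for k in range(1, 10):
    n = 2 ** k
    assert C(n) == n * k - n // 2
print(C(8))  # -> 20  (8*3 - 4)
```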
Binary Search
Search for x in sorted array A of n elements.
BINSRCH(x, A, n)
  if n ≠ 1 then
    split A into 2 sorted arrays B and C, each with n/2 elements
    [Note: last(B) ≤ first(C)]
    if x is smaller than first(C) then          (cost: 1)
      BINSRCH(x, B, n/2)                        (cost: C(n/2))
    else
      BINSRCH(x, C, n/2)                        (cost: C(n/2))
  else {n = 1} check if the only element in A is x    (cost: 1)
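A runnable Python version of the pseudocode, returning whether x appears in A (the helper name binsrch is mine):

```python
def binsrch(x, A):
    """Recursive binary search following the pseudocode above; A is sorted."""
    n = len(A)
    if n == 1:
        return A[0] == x
    B, C = A[:n // 2], A[n // 2:]   # split; last(B) <= first(C)
    if x < C[0]:
        return binsrch(x, B)
    return binsrch(x, C)

print(binsrch(7, [1, 3, 5, 7, 9, 11, 13, 15]))  # -> True
```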
Solving Binary Search Recurrence
C(1) = 1   Assume n = 2^k
C(n) = C(n/2) + 1   Then k = log₂(n)
C(1) = 1
C(2^1) = (1) + 1
C(2^2) = (1 + 1) + 1
C(2^3) = (1 + 1 + 1) + 1
C(2^4) = (1 + 1 + 1 + 1) + 1
…
C(2^k) = k + 1 (guess)
C(n) = log₂(n) + 1 (can prove by induction)
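A quick check that the guess matches the recurrence:

```python
def C(n):
    """C(1) = 1; C(n) = C(n//2) + 1, for n a power of 2."""
    return 1 if n == 1 else C(n // 2) + 1

for k in range(11):
    assert C(2 ** k) == k + 1   # matches the guess C(2^k) = k + 1
print(C(1024))  # -> 11
```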