Chad
I was looking at some old posts in comp.lang.c and found the following
http://groups.google.com/group/comp...nk=gst&q=recurrence+relation#b3b5046326994e18
I have some questions regarding the post.
First, and I quote
"Given an algorithm with loops or recursive calls, the way you find
a big-O equivalence class for that algorithm is to write down a
"recurrence relation" for the time taken to execute the code. For
instance, for merge sort, one gets (fixed width font required here;
$x$ denotes a variable x, etc.):
{
T(n) = { $a$, $n = 1$, $a$ a constant
{ $2T(n/2) + cn$, $n > 1$, $c$ a constant
You then apply induction-type reasoning to show that, e.g., when
$n$ is a power of two, $T(n) = an + cn \log_2 n$. In big-O notation,
all the constant factors vanish, so this shows that mergesort is
O(n log n). "
First, how do you get this recurrence relation for mergesort? Second,
how did he get O(n log n)?
And later on in the same post:
"(Personally, I always found the worst part of dealing with recurrence
relations to be going from an open form to a closed one -- you just
have to have memorized all those formulae like "1 + 2 + ... + n =
n(n+1)/2", etc., and recognize them in the recurrences. Once you
can see this summation and recognize the closed form, it instantly
becomes obvious that, e.g., bubble sort is O(n*n).) "
I don't see how O(n*n) can be derived from 1 + 2 + ... + n = n(n+1)/2.