# [OT] stable algorithm with complexity O(n)

Discussion in 'Python' started by David Hláčik, Dec 13, 2008.

1. ### David HláčikGuest

Hi guys,

i am really sorry for making offtopic, hope you will not kill me, but
this is for me life important problem which needs to be solved within
next 12 hours..

I have to create stable algorithm for sorting n numbers from interval
[1,n^2] with time complexity O(n) .

Can someone please give me a hint. Would be very very thankful!

D.

David Hláčik, Dec 13, 2008

2. ### Diez B. RoggischGuest

David Hláčik schrieb:
> Hi guys,
>
> i am really sorry for making offtopic, hope you will not kill me, but
> this is for me life important problem which needs to be solved within
> next 12 hours..
>
> I have to create stable algorithm for sorting n numbers from interval
> [1,n^2] with time complexity O(n) .
>
> Can someone please give me a hint. Would be very very thankful!

Unless I grossly miss out on something in computer science 101, the
lower bound for sorting is O(n * log_2 n). Which makes your task
impossible, unless there is something to be assumed about the
distribution of numbers in your sequence.

Who has given you that assignment - a professor? Or some friend who's
playing tricks on you?

Diez

Diez B. Roggisch, Dec 13, 2008

3. ### David HláčikGuest

> Unless I grossly miss out on something in computer science 101, the lower
> bound for sorting is O(n * log_2 n). Which makes your task impossible,
> unless there is something to be assumed about the distribution of numbers in
> your sequence.

There are n numbers from the interval [1, n^2].
I should do measurements for:
n = 500, 1000, 1500, 2000, ..., 4500, 5000

O(n) means linear complexity, so the complexity of the algorithm must be
linear, which I have to prove.

>
> Who has given you that assignment - a professor? Or some friend who's
> playing tricks on you?

It is an official assignment, by a professor from the Data Structures &
Algorithms course.

>
> Diez

David Hláčik, Dec 13, 2008
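
David's measurement plan (n = 500 through 5000) could be sketched as a small
timing harness like the one below. This is a hypothetical sketch, not part of
the assignment: the function name `measure` is made up here, and the built-in
`sorted` is only a stand-in for whatever O(n) sort actually gets written.

```python
import random
import time

def measure(sort, sizes=(500, 1000, 1500, 2000, 2500, 3000)):
    """Time sort() on random inputs drawn from [1, n^2] for each size n."""
    results = []
    for n in sizes:
        data = [random.randint(1, n * n) for _ in range(n)]
        start = time.perf_counter()
        sort(data)
        # Record (n, elapsed seconds) so the growth rate can be inspected.
        results.append((n, time.perf_counter() - start))
    return results

for n, t in measure(sorted):  # sorted is a placeholder for the O(n) sort
    print("n=%5d  %.6f s" % (n, t))
```

If the algorithm really is linear, elapsed time divided by n should stay
roughly constant across the sizes; `sorted` itself is O(n log n), so it is
only a placeholder for the harness.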
4. ### MRABGuest

Diez B. Roggisch wrote:
> David Hláčik schrieb:
>> Hi guys,
>>
>> i am really sorry for making offtopic, hope you will not kill me, but
>> this is for me life important problem which needs to be solved within
>> next 12 hours..
>>
>> I have to create stable algorithm for sorting n numbers from interval
>> [1,n^2] with time complexity O(n) .
>>
>> Can someone please give me a hint. Would be very very thankful!

>
> Unless I grossly miss out on something in computer science 101, the
> lower bound for sorting is O(n * log_2 n). Which makes your task
> impossible, unless there is something to be assumed about the
> distribution of numbers in your sequence.
>
> Who has given you that assignment - a professor? Or some friend who's
> playing tricks on you?
>

I'm assuming that the numbers are integers. I can think of an O(n)
algorithm for this particular problem provided that n isn't huge, but if
it's your assignment then it's up to you to discover it.

MRAB, Dec 13, 2008
5. ### Diez B. RoggischGuest

Duncan Booth schrieb:
> "Diez B. Roggisch" <> wrote:
>
>> David Hláčik schrieb:
>>> Hi guys,
>>>
>>> i am really sorry for making offtopic, hope you will not kill me, but
>>> this is for me life important problem which needs to be solved within
>>> next 12 hours..
>>>
>>> I have to create stable algorithm for sorting n numbers from interval
>>> [1,n^2] with time complexity O(n) .
>>>
>>> Can someone please give me a hint. Would be very very thankful!

>> Unless I grossly miss out on something in computer science 101, the
>> lower bound for sorting is O(n * log_2 n). Which makes your task
>> impossible, unless there is something to be assumed about the
>> distribution of numbers in your sequence.
>>
>> Who has given you that assignment - a professor? Or some friend who's
>> playing tricks on you?
>>

> I think you must have fallen asleep during CS101. The lower bound for
> sorting where you make a two way branch at each step is O(n * log_2 n), but
> if you can choose between k possible orderings in a single comparison you
> can get O(n * log_k n).
>
> To beat n * log_2 n just use a bucket sort: for numbers with a known
> maximum you can sort them digit by digit for O(n log_k n), and if you don't
> restrict yourself to decimal then k can be as large as you want, so for the
> problem in question I think you can set k=n and (log_n n)==1 so you get
> O(n)

Thanks. I *totally* forgot about that.

Diez

Diez B. Roggisch, Dec 13, 2008
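
Duncan's suggestion can be sketched directly: treat each value in [1, n^2] as
two digits in base N and do two stable bucket-distribution passes, least
significant digit first. This is an illustrative sketch in modern Python (the
thread's own listings are Python 2); the function name is made up here.

```python
def radix_sort_base_n(numbers):
    """Stable O(n) sort for n integers from [1, n^2], via two base-N passes."""
    N = len(numbers)
    if N == 0:
        return []
    # Pass 1: distribute by the low base-N digit; appending preserves
    # the input order of equal digits, which is what makes this stable.
    low = [[] for _ in range(N)]
    for x in numbers:
        low[x % N].append(x)
    # Pass 2: distribute by the high digit. N+1 buckets are needed
    # because x // N == N exactly when x == N*N.
    high = [[] for _ in range(N + 1)]
    for bucket in low:
        for x in bucket:
            high[x // N].append(x)
    return [x for bucket in high for x in bucket]

print(radix_sort_base_n([7, 3, 9, 1, 16, 2, 25, 4, 11]))
# → [1, 2, 3, 4, 7, 9, 11, 16, 25]
```

Each pass touches every element once and bucket setup is O(n), so the whole
thing is O(n) — exactly the k = n, log_n(n^2) = 2 case discussed above.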

Re: stable algorithm with complexity O(n)

On Dec 13, 1:17 pm, Duncan Booth <> wrote:
> "Diez B. Roggisch" <> wrote:
>
>
>
> > David Hláčik schrieb:
> >> Hi guys,

>
> >> i am really sorry for making offtopic, hope you will not kill me, but
> >> this is for me life important problem which needs to be solved within
> >> next 12 hours..

>
> >> I have to create stable algorithm for sorting n numbers from interval
> >> [1,n^2] with time complexity O(n) .

>
> >> Can someone please give me a hint. Would be very very thankful!

>
> > Unless I grossly miss out on something in computer science 101, the
> > lower bound for sorting is O(n * log_2 n). Which makes your task
> > impossible, unless there is something to be assumed about the
> > distribution of numbers in your sequence.

>
> > Who has given you that assignment - a professor? Or some friend who's
> > playing tricks on you?

>
> I think you must have fallen asleep during CS101. The lower bound for
> sorting where you make a two way branch at each step is O(n * log_2 n), but
> if you can choose between k possible orderings in a single comparison you
> can get O(n * log_k n).
>
> To beat n * log_2 n just use a bucket sort: for numbers with a known
> maximum you can sort them digit by digit for O(n log_k n), and if you don't
> restrict yourself to decimal then k can be as large as you want, so for the
> problem in question I think you can set k=n and (log_n n)==1 so you get
> O(n)

Minor detail: with k = n, you have log_n (n^2) = 2, so you get O(2n) = O(n).
The generic sort theorems also assume you can compare arbitrarily
large numbers in constant time, which isn't true. That is, for any
given machine, there exist numbers that you can't compare on it in
constant time. But with a known upper bound k, you can just use a
k-bit machine.

<rhetorical>So, what's the group policy on helping with homework?</rhetorical>

7. ### Arnaud DelobelleGuest

"David Hláčik" <> writes:

> Hi guys,
>
> i am really sorry for making offtopic, hope you will not kill me, but
> this is for me life important problem which needs to be solved within
> next 12 hours..
>
> I have to create stable algorithm for sorting n numbers from interval
> [1,n^2] with time complexity O(n) .
>
> Can someone please give me a hint. Would be very very thankful!
>
> D.

I don't have an algorithm but I have an implementation

def sort2(numbers):
    "sort n positive integers in O(n) provided that they are all < n^2"
    N = len(numbers)
    units = [[] for i in xrange(N)]
    tens = [[] for i in xrange(N)]
    for n in numbers:
        units[n % N].append(n)
    for l in units:
        for n in l:
            tens[n / N].append(n)
    return [n for l in tens for n in l]

--
Arnaud

Arnaud Delobelle, Dec 13, 2008
8. ### Steven D'ApranoGuest

On Sat, 13 Dec 2008 19:17:41 +0000, Duncan Booth wrote:

> I think you must have fallen asleep during CS101. The lower bound for
> sorting where you make a two way branch at each step is O(n * log_2 n),
> but if you can choose between k possible orderings in a single
> comparison you can get O(n * log_k n).

I think you might have been sleeping through Maths 101

The difference between log_2 N and log_k N is a constant factor (log_2 k)
and so doesn't affect the big-oh complexity.

--
Steven

Steven D'Aprano, Dec 14, 2008
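
For reference, the standard decision-tree argument behind both positions (a
sketch, with Stirling's approximation supplying the asymptotics): a sort that
must distinguish all n! input orderings using k-way comparisons is a decision
tree with at least n! leaves, so its depth d satisfies

```latex
k^d \ge n!
\quad\Longrightarrow\quad
d \;\ge\; \log_k n!
   \;=\; \frac{\log_2 n!}{\log_2 k}
   \;=\; \frac{\Theta(n \log_2 n)}{\log_2 k},
\qquad
\log_2 n! \;=\; n \log_2 n - n \log_2 e + O(\log n) \ \text{(Stirling)}.
```

For any fixed k the divisor log_2 k is the constant factor Steven describes,
so the bound stays Θ(n log n); it only drops when k grows with n, which is
the regime Duncan's base-n bucket sort exploits.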
9. ### Lev ElbertGuest

Re: stable algorithm with complexity O(n)

1. Comparison sorts have n*ln(n) complexity - does not do.
2. Counting sort has complexity O(n + d), where d is the domain size (in
our case n^2) - does not do.
3. Radix sorts have complexity O(n*k), where k is the number of bits in
the integers (32?). There are 2 variants:
a. most significant digit (MSD),
b. least significant digit (LSD).

The LSD radix sort is stable.

Good luck.

Lev Elbert, Dec 14, 2008
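
Lev's stability point can be demonstrated directly. The sketch below (a
hypothetical minimal implementation, not Lev's; the name and the `key_bits`/
`digit_bits` parameters are illustrative) sorts (key, payload) pairs by
integer key, least significant digit first; because each pass appends to its
buckets in input order, pairs with equal keys come out in their original order.

```python
def lsd_radix_sort(records, key_bits=8, digit_bits=2):
    """Stable LSD radix sort of (key, payload) pairs by integer key.

    Processes digit_bits of the key per pass, least significant first.
    Appending to buckets in scan order is what makes the sort stable.
    """
    mask = (1 << digit_bits) - 1
    for shift in range(0, key_bits, digit_bits):
        buckets = [[] for _ in range(1 << digit_bits)]
        for rec in records:
            buckets[(rec[0] >> shift) & mask].append(rec)
        # Concatenating buckets in order completes one stable pass.
        records = [rec for b in buckets for rec in b]
    return records

data = [(3, 'a'), (1, 'b'), (3, 'c'), (2, 'd'), (1, 'e')]
print(lsd_radix_sort(data))
# → [(1, 'b'), (1, 'e'), (2, 'd'), (3, 'a'), (3, 'c')]
```

Note that (1, 'b') stays ahead of (1, 'e') and (3, 'a') ahead of (3, 'c'),
which is exactly the property an MSD variant does not give you for free.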
10. ### Arnaud DelobelleGuest

Steven D'Aprano <> writes:

> On Sat, 13 Dec 2008 19:17:41 +0000, Duncan Booth wrote:
>
>> I think you must have fallen asleep during CS101. The lower bound for
>> sorting where you make a two way branch at each step is O(n * log_2 n),
>> but if you can choose between k possible orderings in a single
>> comparison you can get O(n * log_k n).

>
> I think you might have been sleeping through Maths 101
>
> The difference between log_2 N and log_k N is a constant factor (log_2 k)
> and so doesn't affect the big-oh complexity.

It affects it if k is a function of n. In this particular example, we
can set k=n so we get O(n).

--
Arnaud

Arnaud Delobelle, Dec 14, 2008
11. ### David HláčikGuest

Thank you guys for help and support! My homework is done and waiting

Here it comes - bucket sort with time complexity O(n) == linear complexity

#! /usr/bin/python
def sort(numbers):
    "sort n positive integers in O(n) provided that they are all from interval [1, n^2]"
    N = len(numbers)  # get size of test numbers

    buckets_mod = [[] for i in xrange(N)]
    buckets_sorted = [[] for i in xrange(N+1)]

    # group numbers to buckets (list of numbers) with common modulus
    for n in numbers:
        buckets_mod[n % N].append(n)
    print "buckets_mod: %s" % buckets_mod

    # check numbers in buckets
    for l in buckets_mod:
        for n in l:
            # place number into bucket number grouped by result of division
            buckets_sorted[n / N].append(n)
    print "buckets_sorted: %s" % buckets_sorted

    # search through sorted buckets and return list of sorted numbers
    return [n for l in buckets_sorted for n in l]

Regards,

David

On Sun, Dec 14, 2008 at 11:24 AM, Arnaud Delobelle
<> wrote:
> Steven D'Aprano <> writes:
>
>> On Sat, 13 Dec 2008 19:17:41 +0000, Duncan Booth wrote:
>>
>>> I think you must have fallen asleep during CS101. The lower bound for
>>> sorting where you make a two way branch at each step is O(n * log_2 n),
>>> but if you can choose between k possible orderings in a single
>>> comparison you can get O(n * log_k n).

>>
>> I think you might have been sleeping through Maths 101
>>
>> The difference between log_2 N and log_k N is a constant factor (log_2 k)
>> and so doesn't affect the big-oh complexity.

>
> It affects it if k is a function of n. In this particular example, we
> can set k=n so we get O(n).
>
> --
> Arnaud

David Hláčik, Dec 14, 2008
12. ### Lie RyanGuest

On Sat, 13 Dec 2008 19:17:41 +0000, Duncan Booth wrote:

> "Diez B. Roggisch" <> wrote:
>
>> David Hláčik schrieb:
>>> Hi guys,
>>>
>>> i am really sorry for making offtopic, hope you will not kill me, but
>>> this is for me life important problem which needs to be solved within
>>> next 12 hours..
>>>
>>> I have to create stable algorithm for sorting n numbers from interval
>>> [1,n^2] with time complexity O(n) .
>>>
>>> Can someone please give me a hint. Would be very very thankful!

>>
>> Unless I grossly miss out on something in computer science 101, the
>> lower bound for sorting is O(n * log_2 n). Which makes your task
>> impossible, unless there is something to be assumed about the
>> distribution of numbers in your sequence.
>>
>> Who has given you that assignment - a professor? Or some friend who's
>> playing tricks on you?
>>

> I think you must have fallen asleep during CS101. The lower bound for
> sorting where you make a two way branch at each step is O(n * log_2 n),
> but if you can choose between k possible orderings in a single
> comparison you can get O(n * log_k n).
>
> To beat n * log_2 n just use a bucket sort: for numbers with a known
> maximum you can sort them digit by digit for O(n log_k n), and if you
> don't restrict yourself to decimal then k can be as large as you want,
> so for the problem in question I think you can set k=n and (log_n n)==1
> so you get O(n)
>

I'm sure someday, there will be a student who comes to class late and
sees this on the board: "Design a comparison sorting algorithm that has
better than O(n * log n) lower bound complexity." The unsuspecting
student copied it, thinking it was homework. He crunched away at the
problem, going to various meditation and yoga classes before finding a way
that worked just before the deadline, handing in the work a bit late. Six
weeks later, his professor called and said: "You know what you just did?
You've just solved a problem that was supposed to be an example of an
unsolvable problem."

It has happened before, why not again?
http://www.snopes.com/college/homework/unsolvable.asp

Lie Ryan, Dec 14, 2008
13. ### gregGuest

Lie Ryan wrote:
> "You know what you just did? You've
> just found a problem that was supposed to be an example of unsolvable
> problem."
>
> It has happened before, why not again?

There's a big difference between an unsolvable problem and an
unsolved problem. In the cases you're talking about, nobody had
proved that there was no solution.

In the case at hand, there is a proof that such an algorithm
is impossible. Overturning that would require finding a
flaw in the proof, which for such a simple proof seems very
unlikely.

That's not to say nobody should try, but I won't be holding
my breath.

--
Greg

greg, Dec 15, 2008

Re: stable algorithm with complexity O(n)

On Dec 14, 6:04 pm, greg <> wrote:
> Lie Ryan wrote:
> > "You know what you just did? You've
> > just found a problem that was supposed to be an example of unsolvable
> > problem."

>
> > It has happened before, why not again?

>
> There's a big difference between an unsolvable problem and an
> unsolved problem. In the cases you're talking about, nobody had
> proved that there was no solution.
>
> In the case at hand, there is a proof that such an algorithm
> is impossible. Overturning that would require finding a
> flaw in the proof, which for such a simple proof seems very
> unlikely.

It's more likely you'd circumvent the assumptions. That is, find an
'outside-the-box' solution. For instance, a deeply parallel
architecture could escape the assumption that you can only compare two
numbers at a time (per step).

The proof's conclusion is still true if its assumption is.

15. ### Steven D'ApranoGuest

On Sun, 14 Dec 2008 21:42:33 +0000, Lie Ryan wrote:

> I'm sure someday, there will be a student who comes to class late and
> sees this on the board: "Design a comparison sorting algorithm that has
> better than O(n * log n) lower bound complexity." The unsuspecting
> student copied it, thinking it was homework. He crunched away at the
> problem, going to various meditation and yoga classes before finding a
> way that worked just before the deadline, handing in the work a bit
> late. Six weeks later, his professor called and said: "You know what you
> just did? You've just solved a problem that was supposed to be an
> example of an unsolvable problem."
>
> It has happened before, why not again?
> http://www.snopes.com/college/homework/unsolvable.asp

Because as you described it, it *hasn't* happened before. There is the
world of difference between an unsolvABLE problem and one that is
unsolvED.

* make a four-sided triangle;

* write down a completely accurate rational expansion for pi or the
square-root of 2;

* split a magnet into two individual poles;

* find an integer solution to 3*n = 7;

* solve the Halting Problem;

* fit n items into n-1 pigeonholes without putting two items in the one
pigeonhole;

* create a compression algorithm that will compress random data;

* or design a comparison sort which does fewer than O(n*log n) two-way
comparisons in the worst case, or fewer than O(n) comparisons in the best
case.

Some things really don't have a solution, no matter how much power of
positive thinking you apply to it.

--
Steven

Steven D'Aprano, Dec 15, 2008
16. ### Roy SmithGuest

Steven D'Aprano <> wrote:

>
> * make a four-sided triangle;
>
> * split a magnet into two individual poles;

These two are fundamentally different problems.

The first is impossible by definition. The definition of triangle is, "a
three-sided polygon". Asking for a "four-sided triangle" is akin to asking
for "a value of three which is equal to four".

The second is only "impossible" because it contradicts our understanding
(based on observation) of how the physical universe works. Our
understanding could simply be wrong. We've certainly been wrong before,
and we will undoubtedly be proven wrong again in the future. When it comes
to things like electromagnetic theory, it doesn't take too many steps to
get us to the fuzzy edge of quantum physics where we know there are huge
questions yet to be answered.

Roy Smith, Dec 15, 2008

17. ### Aaron BradyGuest

Re: stable algorithm with complexity O(n)

On Dec 14, 8:18 pm, Roy Smith <> wrote:
> Steven D'Aprano <> wrote:
> > All the positive thinking in the world won't help you:

>
> > * make a four-sided triangle;

>
> > * split a magnet into two individual poles;

>
> These two are fundamentally different problems.
>
> The first is impossible by definition.  The definition of triangle is, "a
> three-sided polygon".  Asking for a "four-sided triangle" is akin to asking
> for "a value of three which is equal to four".
>
> The second is only "impossible" because it contradicts our understanding
> (based on observation) of how the physical universe works.  Our
> understanding could simply be wrong.  We've certainly been wrong before,
> and we will undoubtedly be proven wrong again in the future.  When it comes
> to things like electromagnetic theory, it doesn't take too many steps to
> get us to the fuzzy edge of quantum physics where we know there are huge
> questions yet to be answered.

I agree. Most of his examples were tautologies. The magnet one was
the exception.

Then, to beat the O( n lg n ) limit, just break an assumption.

> > * or design a comparison sort which does fewer than O(n*log n) two-way
> > comparisons in the worst case, or fewer than O(n) comparisons in the best
> > case.

Make a three-way comparison, make a non-comparison sort, or make
non-random inputs.

18. ### David CournapeauGuest

Re: stable algorithm with complexity O(n)

On Mon, Dec 15, 2008 at 2:12 PM, Aaron Brady <> wrote:

>
> I agree. Most of his examples were tautologies. The magnet one was
> the exception.

A proof is nothing more than a tautology. The fact that pi is not
rational is not trivial (but it certainly has been proved for some time
now).

cheers,

David

David Cournapeau, Dec 15, 2008
19. ### Steven D'ApranoGuest

On Sun, 14 Dec 2008 21:18:03 -0500, Roy Smith wrote:

> Steven D'Aprano <> wrote:
>
>>
>> * make a four-sided triangle;
>>
>> * split a magnet into two individual poles;

>
> These two are fundamentally different problems.
>
> The first is impossible by definition. The definition of triangle is,
> "a three-sided polygon". Asking for a "four-sided triangle" is akin to
> asking for "a value of three which is equal to four".

That's right. But see below.

> The second is only "impossible" because it contradicts our understanding
> (based on observation) of how the physical universe works. Our
> understanding could simply be wrong.

And arithmetic could be inconsistent, in which case it might be possible
to prove that 3 equals 4. We don't know for sure that arithmetic is
consistent, and according to Gödel, there is no way of proving that it is
consistent. There's no evidence that it isn't, but then, unless the
inconsistency was obvious, how would we know?

http://www.mathpages.com/home/kmath347/kmath347.htm

> We've certainly been wrong before,
> and we will undoubtedly be proven wrong again in the future. When it
> comes to things like electromagnetic theory, it doesn't take too many
> steps to get us to the fuzzy edge of quantum physics where we know there
> are huge questions yet to be answered.

No. I worded my question very carefully. The discovery of magnetic
monopoles, as predicted by the fuzzy end of quantum physics, would not
invalidate my claim. Magnets don't generate magnetic fields by the use of
monopoles, and the discovery of such wouldn't make it possible to cut an
ordinary magnet in two to get an individual north and south pole. That
would be like taking a rope with two ends (an ordinary rope, in other
words), cutting it in half, and finding that each piece has only a single end.

Now, you could counter with a clever solution involving splicing the rope
to itself in such a way that it had one end and a loop at the other, er,
end. And such a solution might be very valuable, if we needed a way to
get a rope with a loop at one end. But it isn't solving the problem of
cutting a rope in two and getting only two ends instead of four. It's
solving a different problem.

--
Steven

Steven D'Aprano, Dec 15, 2008
20. ### Lie RyanGuest

On Mon, 15 Dec 2008 01:48:43 +0000, Steven D'Aprano wrote:

> Some things really don't have a solution, no matter how much power of
> positive thinking you apply to it.

Some may, only not with the current understanding of the universe. Well,
I agree that there are a few things that are straight-out nonsense, such
as a 4-angled triangle.

And actually you've got the wrong message from the story; I hadn't intended
it to be about positive thinking (as most motivators would exploit), it's
more about thinking outside the current popular frame of mind. When
you're not bound by everyone around you saying a task is impossible,
you might have a better chance of solving the task (well, as long as the
task is actually possible).
What follows is a not-so-funny joke.

Q: Construct a 4-angled triangle.
A: Hmmm.... let's see...

  /\
 /  \
/____\

Q: That's just a regular triangle

A:

 _____
|     |
|     |
|_____|

Q: That's just a regular rectangle.
A: This?

  /|\
 / | \
/__|__\

Q: Well, still no luck; it's a triangle, but it has 5 sides
A: Well... (after thinking for a moment)

                 /----\
                /      \
               /        \
              /          \
-------------/            \-------\
----------------------------------------

Q: Perfect, just like what I wanted, but you shouldn't let the Python eat
it.

-- Le Petit Prince & aptitude & me

Lie Ryan, Dec 15, 2008