# That interesting notation used to describe how long a loop will take.

Discussion in 'Python' started by Tobiah, Oct 4, 2010.

1. ### Tobiah (Guest)

It gets used here frequently, but not
having majored in programming, I'm not
familiar with it. One might say:

Don't do it that way, it will result in O(n**2)!

Or something like that. I read this to mean
that the execution time varies with the square
of the number of iterations, items being sorted,
etc.

I want to google this, but I'm not sure what
subject to search for. I imagine that it has a concise name.

Thanks,

Tobiah

Tobiah, Oct 4, 2010

2. ### Ian (Guest)

On Oct 4, 12:38 pm, Tobiah wrote:
> Don't do it that way, it will result in O(n**2)!
> [...]
> I want to google this, but I'm not sure what
> subject?  I imagine that it has a concise name.

"Big O notation"

Cheers,
Ian

Ian, Oct 4, 2010

3. ### Matteo Landi (Guest)

Here you are:

http://en.wikipedia.org/wiki/Big_O_notation

Best regards,
Matteo

On Mon, Oct 4, 2010 at 8:38 PM, Tobiah wrote:
> Don't do it that way, it will result in O(n**2)!
> [...]

--
Matteo Landi
http://www.matteolandi.net/

Matteo Landi, Oct 4, 2010
4. ### MRAB (Guest)

On 04/10/2010 19:38, Tobiah wrote:
> Don't do it that way, it will result in O(n**2)!
> [...]
> I want to google this, but I'm not sure what
> subject? I imagine that it has a concise name.

It's called the "Big O notation".

MRAB, Oct 4, 2010
5. ### Neil Cerutti (Guest)

On 2010-10-04, MRAB wrote:
> On 04/10/2010 19:38, Tobiah wrote:
>> Don't do it that way, it will result in O(n**2)!
>> [...]

> It's called the "Big O notation".

The web version of the book "Data Structures and Algorithms with
Object-Oriented Design Patterns in Python" contains an
explanation in the chapters Algorithm Analysis and Asymptotic
Notation.

http://www.brpreiss.com/books/opus7/

--
Neil Cerutti

Neil Cerutti, Oct 4, 2010
6. ### Chris Rebert (Guest)

> On Tue, Oct 5, 2010 at 12:08 AM, Tobiah wrote:
>> Don't do it that way, it will result in O(n**2)!
>> [...]

On Mon, Oct 4, 2010 at 11:58 AM, Shashank Singh wrote:
> this might help:
>
> http://en.wikipedia.org/wiki/Analysis_of_algorithms

http://en.wikipedia.org/wiki/Time_complexity

Cheers,
Chris

Chris Rebert, Oct 4, 2010
7. ### Roy Smith (Guest)

In article <Kapqo.16572$>, Tobiah wrote:

> Don't do it that way, it will result in O(n**2)!
> [...]
> I want to google this, but I'm not sure what
> subject? I imagine that it has a concise name.

Expect to spend anywhere from an hour to the next four years reading
what pops out, depending on your level of interest.

Roy Smith, Oct 5, 2010
8. ### Edward A. Falk (Guest)

Roy Smith wrote:
>expect to spend anywhere from an hour to the next four years reading
>what pops out

Yeah, that's my problem with Wikipedia too. Plus, they like to just
roll up their sleeves and dive right into the math. It's like a bucket
of ice water to the face if you're a mathematical layman.

Tobiah, for the purposes of 99% of the work you'll be doing in computing,
you don't need all that math. Just think of O(foo) as meaning "On the
order of foo". This means basically that you evaluate foo, and the time
your algorithm takes to execute is proportional to that.

So for example, O(n^2) means that the time the algorithm takes to run
is roughly on the order of n^2 where n is the size of your data set.

A good example is a simple sort algorithm which runs in O(n^2), meaning
that if you double the number of points, you quadruple the time it takes
to sort them. A better sorting algorithm runs in O(n*log(n)).
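One quick way to see that quadrupling (a rough sketch of my own, counting comparisons rather than wall-clock time): insertion sort given a reversed list hits its worst case and performs exactly n*(n-1)/2 comparisons, so doubling n roughly quadruples the count.

```python
# Count comparisons made by insertion sort; on a reversed list every
# element is compared against all of its predecessors, for n*(n-1)/2 total.

def insertion_sort(items):
    """Sort a list in place, returning the number of comparisons made."""
    comparisons = 0
    for i in range(1, len(items)):
        key = items[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if items[j] <= key:
                break
            items[j + 1] = items[j]   # shift larger element right
            j -= 1
        items[j + 1] = key
    return comparisons

small = insertion_sort(list(range(1000, 0, -1)))   # n = 1000 -> 499500
large = insertion_sort(list(range(2000, 0, -1)))   # n = 2000 -> 1999000
print(large / small)  # just over 4: double the input, quadruple the work
```

For arbitrary (not worst-case) inputs the counts vary, but the ratio still trends toward 4 as n grows, which is exactly what O(n^2) predicts.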

The best example is the quadratic sieve* for factoring large numbers,
which runs in O(exp( n^(1/2) (log n)^(1/2) )). I was at a party
celebrating the expiration of the RSA patent and someone (I think it
was Lucky Green) went to the white board, wrote down this expression
and explained that this term meant that the program ran in worse than
polynomial time, but this other term meant that at it least ran in better
than exponential time. Meaning that the algorithm ran in "superpolynomial
subexponential runtime".
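That "superpolynomial subexponential" claim is easy to sanity-check numerically (a sketch of mine, taking the expression as written above, f(n) = exp(n^(1/2) * (log n)^(1/2))): for a polynomial n**k, log f / log n is the constant k, while for an exponential c**n, log f / n is constant. Here the first ratio grows without bound and the second shrinks toward zero.

```python
# Evaluate log f(n) = sqrt(n) * sqrt(log n) at increasing n and compare
# it against the polynomial and exponential yardsticks.

import math

results = []
for n in (10**3, 10**6, 10**9):
    log_f = math.sqrt(n) * math.sqrt(math.log(n))  # log of f(n)
    results.append((log_f / math.log(n),   # grows  -> worse than any polynomial
                    log_f / n))            # shrinks -> better than exponential

for poly_ratio, expo_ratio in results:
    print(round(poly_ratio, 1), expo_ratio)
```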

This led to the really silly song documented here:
http://www.xent.com/FoRK-archive/oct00/0429.html

(*Yes, yes, I know they were talking about the polynomial sieve, but
I couldn't find the runtime for that.)

--
-Ed Falk,
http://thespamdiaries.blogspot.com/

Edward A. Falk, Oct 5, 2010
9. ### Lawrence D'Oliveiro (Guest)

In message <i8dsu7$nmg$>, Edward A. Falk wrote:

> Roy Smith wrote:
>>expect to spend anywhere from an hour to the next four years reading
>>what pops out

>
> Yeah, that's my problem with Wikipedia too.

Which part of “level of interest” didn’t you understand?

Lawrence D'Oliveiro, Oct 5, 2010
10. ### Niklasro (Guest)

On 4 Oct, 20:38, Tobiah wrote:
> Don't do it that way, it will result in O(n**2)!
> [...]

Big O relates input size to computation time. For example, if the
mission is "make the program 1000 times faster", you can either
a) buy much more expensive hardware, or
b) rewrite exponential-time algorithms to run in polynomial time, and
the like.
A reference sheet of common algorithms will have their running times
noted; where applicable, for example, sorting a set that is already
somewhat sorted calls for a different method than sorting a very
unsorted set.
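To illustrate option (b) with a sketch of my own (not something from the thread): the classic naive Fibonacci recursion takes exponential time because it recomputes the same subproblems over and over, and caching each result once brings it down to linear time.

```python
# Same recursion, two running times: the naive version makes O(phi**n)
# calls, while the cached version computes each n exactly once.

from functools import lru_cache

def fib_slow(n):
    """Exponential time: recomputes subproblems repeatedly."""
    if n < 2:
        return n
    return fib_slow(n - 1) + fib_slow(n - 2)

@lru_cache(maxsize=None)
def fib_fast(n):
    """Linear time: each value is computed once and cached."""
    if n < 2:
        return n
    return fib_fast(n - 1) + fib_fast(n - 2)

print(fib_fast(100))  # returns instantly; fib_slow(100) would take ages
```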
Regards,
Niklas

Niklasro, Oct 5, 2010