
# That interesting notation used to describe how long a loop will take.

Tobiah
Guest
Posts: n/a

 10-04-2010
It gets used here frequently, but not
having majored in programming, I'm not
familiar with it. One might say:

Don't do it that way, it will result in O(n**2)!

Or something like that. I read this to mean
that the execution time varies with the square
of the number of iterations, or items being sorted
etc..

I want to google this, but I'm not sure what
keywords to use. Is there a wikipedia article about this
subject? I imagine that it has a concise name.

Thanks,

Tobiah

Ian
Guest
Posts: n/a

 10-04-2010
On Oct 4, 12:38 pm, Tobiah <(E-Mail Removed)> wrote:
> It gets used here frequently, but not
> having majored in programming, I'm not
> familiar with it. One might say:
>
> Don't do it that way, it will result in O(n**2)!
>
> Or something like that. I read this to mean
> that the execution time varies with the square
> of the number of iterations, or items being sorted
> etc..
>
> I want to google this, but I'm not sure what
> keywords to use. Is there a wikipedia article about this
> subject? I imagine that it has a concise name.

"Big O notation"

Cheers,
Ian

Matteo Landi
Guest
Posts: n/a

 10-04-2010
Here you are:

http://en.wikipedia.org/wiki/Big_O_notation

Best regards,
Matteo

On Mon, Oct 4, 2010 at 8:38 PM, Tobiah <(E-Mail Removed)> wrote:
> It gets used here frequently, but not
> having majored in programming, I'm not
> familiar with it. One might say:
>
> Don't do it that way, it will result in O(n**2)!
>
> Or something like that. I read this to mean
> that the execution time varies with the square
> of the number of iterations, or items being sorted
> etc..
>
> I want to google this, but I'm not sure what
> keywords to use. Is there a wikipedia article about this
> subject? I imagine that it has a concise name.
>
> Thanks,
>
> Tobiah
> --
> http://mail.python.org/mailman/listinfo/python-list
>

--
Matteo Landi
http://www.matteolandi.net/

MRAB
Guest
Posts: n/a

 10-04-2010
On 04/10/2010 19:38, Tobiah wrote:
> It gets used here frequently, but not
> having majored in programming, I'm not
> familiar with it. One might say:
>
> Don't do it that way, it will result in O(n**2)!
>
> Or something like that. I read this to mean
> that the execution time varies with the square
> of the number of iterations, or items being sorted
> etc..
>
> I want to google this, but I'm not sure what
> keywords to use. Is there a wikipedia article about this
> subject? I imagine that it has a concise name.
>

It's called the "Big O notation".

Neil Cerutti
Guest
Posts: n/a

 10-04-2010
On 2010-10-04, MRAB <(E-Mail Removed)> wrote:
> On 04/10/2010 19:38, Tobiah wrote:
>> It gets used here frequently, but not
>> having majored in programming, I'm not
>> familiar with it. One might say:
>>
>> Don't do it that way, it will result in O(n**2)!
>>
>> Or something like that. I read this to mean
>> that the execution time varies with the square
>> of the number of iterations, or items being sorted
>> etc..
>>
>> I want to google this, but I'm not sure what
>> keywords to use. Is there a wikipedia article about this
>> subject? I imagine that it has a concise name.
>>

> It's called the "Big O notation".

The web version of the book "Data Structures and Algorithms with
Object-Oriented Design Patterns in Python" contains an
explanation in the chapters Algorithm Analysis and Asymptotic
Notation.

http://www.brpreiss.com/books/opus7/

--
Neil Cerutti

Chris Rebert
Guest
Posts: n/a

 10-04-2010
> On Tue, Oct 5, 2010 at 12:08 AM, Tobiah <(E-Mail Removed)> wrote:
>> It gets used here frequently, but not
>> having majored in programming, I'm not
>> familiar with it. One might say:
>>
>> Don't do it that way, it will result in O(n**2)!
>>
>> Or something like that. I read this to mean
>> that the execution time varies with the square
>> of the number of iterations, or items being sorted
>> etc..
>>
>> I want to google this, but I'm not sure what
>> keywords to use. Is there a wikipedia article about this
>> subject? I imagine that it has a concise name.

On Mon, Oct 4, 2010 at 11:58 AM, Shashank Singh
<(E-Mail Removed)> wrote:
> this might help:
>
> http://en.wikipedia.org/wiki/Analysis_of_algorithms

http://en.wikipedia.org/wiki/Time_complexity

Cheers,
Chris

Roy Smith
Guest
Posts: n/a

 10-05-2010
In article <Kapqo.16572$(E-Mail Removed)>,
Tobiah <(E-Mail Removed)> wrote:

> Don't do it that way, it will result in O(n**2)!
> [...]
> I want to google this, but I'm not sure what
> keywords to use. Is there a wikipedia article about this
> subject? I imagine that it has a concise name.

Google for "Big-O notation". Depending on your level of interest,
expect to spend anywhere from an hour to the next four years reading
what pops out.

Edward A. Falk
Guest
Posts: n/a

 10-05-2010
In article <(E-Mail Removed)>,
Roy Smith <(E-Mail Removed)> wrote:
>In article <Kapqo.16572$(E-Mail Removed)>,
> Tobiah <(E-Mail Removed)> wrote:
>
>Google for "Big-O notation". Depending on your level of interest,
>expect to spend anywhere from an hour to the next four years reading
>what pops out.

Yeah, that's my problem with Wikipedia too. Plus, they like to just
roll up their sleeves and dive right into the math. It's like a bucket
of ice water to the face if you're a mathematical layman.

Tobiah, for the purposes of 99% of the work you'll be doing in computing,
you don't need all that math. Just think of O(foo) as meaning "On the
order of foo". This means basically that you evaluate foo, and the time
your algorithm takes to execute is proportional to that.

So for example, O(n^2) means that the time the algorithm takes to run
is roughly on the order of n^2 where n is the size of your data set.

A good example is a simple sort algorithm which runs in O(n^2), meaning
that if you double the number of points, you quadruple the time it takes
to sort them. A better sorting algorithm runs in O(n*log(n)).
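[Editor's note: to make the doubling-quadruples-the-time claim concrete, here is a small sketch, my own illustration rather than anything from the thread, that counts the comparisons an O(n**2) insertion sort performs on random data of size n and 2n.]

```python
import random

def insertion_sort_comparisons(items):
    """Simple O(n**2) insertion sort; returns the number of comparisons made."""
    data = list(items)
    comparisons = 0
    for i in range(1, len(data)):
        j = i
        while j > 0:
            comparisons += 1
            if data[j - 1] > data[j]:
                # Element still out of place: swap it one step left and continue.
                data[j - 1], data[j] = data[j], data[j - 1]
                j -= 1
            else:
                break
    return comparisons

random.seed(42)
small = [random.random() for _ in range(500)]
large = [random.random() for _ in range(1000)]

c_small = insertion_sort_comparisons(small)
c_large = insertion_sort_comparisons(large)
print(c_large / c_small)  # roughly 4: doubling n quadruples the work
```

The exact ratio wobbles with the random data, but it hovers around 4, which is exactly what O(n^2) predicts.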

The best example is the quadratic sieve* for factoring large numbers,
which runs in O(exp( (log n)^(1/2) (log log n)^(1/2) )). I was at a party
celebrating the expiration of the RSA patent and someone (I think it
was Lucky Green) went to the white board, wrote down this expression
and explained that this term meant that the program ran in worse than
polynomial time, but this other term meant that at it least ran in better
than exponential time. Meaning that the algorithm ran in "superpolynomial
subexponential runtime".

This led to the really silly song documented here:
http://www.xent.com/FoRK-archive/oct00/0429.html

(*Yes, yes, I know they were talking about the polynomial sieve, but
I couldn't find the runtime for that.)

--
-Ed Falk, http://www.velocityreviews.com/forums/(E-Mail Removed)
http://thespamdiaries.blogspot.com/

Lawrence D'Oliveiro
Guest
Posts: n/a

 10-05-2010
In message <i8dsu7$nmg$(E-Mail Removed)>, Edward A. Falk wrote:

> In article <(E-Mail Removed)>,
> Roy Smith <(E-Mail Removed)> wrote:
>
>>Google for "Big-O notation". Depending on your level of interest,
>>expect to spend anywhere from an hour to the next four years reading
>>what pops out

>
> Yeah, that's my problem with Wikipedia too.

Which part of "level of interest" didn't you understand?

Niklasro
Guest
Posts: n/a

 10-05-2010
On 4 Okt, 20:38, Tobiah <(E-Mail Removed)> wrote:
> It gets used here frequently, but not
> having majored in programming, I'm not
> familiar with it. One might say:
>
> Don't do it that way, it will result in O(n**2)!
>
> Or something like that. I read this to mean
> that the execution time varies with the square
> of the number of iterations, or items being sorted
> etc..
>
> I want to google this, but I'm not sure what
> keywords to use. Is there a wikipedia article about this
> subject? I imagine that it has a concise name.
>
> Thanks,
>
> Tobiah

Big O relates input size to computation time. For example, if the
mission is "make the program 1000 times faster", you can
a) buy much more expensive hardware, or
b) rewrite an exponential-time algorithm to run in polynomial time,
and so on.
Also look at a reference sheet of common algorithms: their running
times are noted there, and which one applies depends on the input;
for example, sorting a set that is already somewhat sorted calls for
a different method than sorting a thoroughly unsorted one.
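[Editor's note: point (b), rewriting an exponential-time algorithm into a polynomial-time one, is where the large wins come from. As an illustrative sketch, not from the original post, the classic Fibonacci example computes the same answer in exponential versus linear time.]

```python
from functools import lru_cache

def fib_naive(n):
    # Exponential time: the call tree roughly doubles at each level,
    # so fib_naive(200) would take astronomically long.
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    # Linear time: each value of n is computed once and then cached.
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

print(fib_naive(20))  # fine for small n
print(fib_memo(200))  # instant, thanks to memoization
```

Both functions implement the same recurrence; only the caching turns an exponential algorithm into a polynomial (here linear) one.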
Regards,
Niklas