
# Complexity of data structure

C learner
Guest
Posts: n/a

 04-24-2011
Why is the calculation of complexity for various algorithms (linear
search, bubble sort) confined to the number of comparisons, while
arithmetic and other operations are not considered, even though these
may also consume significant processing power?

Ian Collins
Guest
Posts: n/a

 04-24-2011
On 04/24/11 07:13 PM, C learner wrote:
> Why is the calculation of complexity for various algorithms (linear
> search, bubble sort) confined to the number of comparisons, while
> arithmetic and other operations are not considered, even though these
> may also consume significant processing power?

Because the arithmetic operations are common to all algorithms.
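To make this concrete, here is a minimal C sketch (illustrative, not from the thread) that counts both the comparisons and the other per-element bookkeeping in a linear search. Both counters grow in direct proportion to n, so counting comparisons alone already captures the growth rate.

```c
#include <assert.h>
#include <stddef.h>

/* Counters updated by linear_search: comparisons against the key,
 * and the loop-index increments that accompany them. */
long lsearch_comparisons;
long lsearch_increments;

/* Linear search over a[0..n-1]; returns the index of key or -1.
 * Each iteration does exactly one key comparison and (if it continues)
 * one index increment, so both counts are proportional to the number
 * of elements examined. */
int linear_search(const int *a, size_t n, int key)
{
    lsearch_comparisons = 0;
    lsearch_increments = 0;
    for (size_t i = 0; i < n; i++) {
        lsearch_comparisons++;          /* the a[i] == key test */
        if (a[i] == key)
            return (int)i;
        lsearch_increments++;           /* the i++ step */
    }
    return -1;
}
```

An unsuccessful search of n elements performs n comparisons and n increments; counting only the comparisons loses a constant factor, not the O(n) growth rate.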

--
Ian Collins

Eric Sosman
Guest
Posts: n/a

 04-24-2011
On 4/24/2011 3:13 AM, C learner wrote:
> Why is the calculation of complexity for various algorithms (linear
> search, bubble sort) confined to the number of comparisons, while
> arithmetic and other operations are not considered, even though these
> may also consume significant processing power?

This question isn't about C or any particular programming language,
and probably belongs in a forum like comp.programming.

... but for what it's worth, try the analysis yourself. Take
some simple algorithm, implement it, study the machine instructions
that it generates, and try to predict how much time it will take.
Don't forget to take account of pipeline parallelism, cache hits
and misses, translation look-aside buffer hits and misses, ... It
will be a difficult job, but perhaps you can get an answer after a
great deal of labor. And then you'll have an answer -- which will
go straight out the window as soon as you install a new compiler
version or change the compilation flags, or even add RAM. In other
words, all that enormous effort will produce, at best, an answer
that you can use only once and cannot transfer to the next machine.
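The point about machine-specific answers can be illustrated with a small timing sketch (my example, not from the thread): the measured seconds for the same O(n) loop change with the compiler, flags, cache, and CPU, whereas the operation count does not.

```c
#include <assert.h>
#include <stdio.h>
#include <time.h>

/* Time one linear pass over n items with clock(). The result is only
 * meaningful for this particular build on this particular machine --
 * unlike the comparison count, which is a property of the algorithm. */
double time_linear_pass(size_t n)
{
    volatile long sum = 0;   /* volatile: keep the loop from being optimized away */
    clock_t start = clock();
    for (size_t i = 0; i < n; i++)
        sum += (long)i;
    clock_t end = clock();
    return (double)(end - start) / CLOCKS_PER_SEC;
}
```

Run it twice with different optimization flags and the numbers differ; the abstract cost, n additions, stays the same.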

--
Eric Sosman

luser- -droog
Guest
Posts: n/a

 04-25-2011
On Apr 24, 2:13 am, C learner <(E-Mail Removed)> wrote:
> Why is the calculation of complexity for various algorithms (linear
> search, bubble sort) confined to the number of comparisons, while
> arithmetic and other operations are not considered, even though these
> may also consume significant processing power?

Vol. 1 of The Art of Computer Programming by Donald Knuth would be a
good resource for learning how to do this analysis on an idealized
machine architecture. Used copies (2nd edition) are super cheap.

Gene
Guest
Posts: n/a

 04-25-2011
On Apr 24, 3:13 am, C learner <(E-Mail Removed)> wrote:
> Why is the calculation of complexity for various algorithms (linear
> search, bubble sort) confined to the number of comparisons, while
> arithmetic and other operations are not considered, even though these
> may also consume significant processing power?

You probably aren't reading the right sources. Analyzing comparisons
only is a convenience for the analyst. It lets him/her avoid defining
a model of computation. The embedded assumption is that the rest of
the computation will have run time proportional to that number, but
this may not be true as some have already pointed out.
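One case where the proportionality assumption breaks down is searching among strings: each "comparison" is itself a loop over characters, so the comparison count can understate the real work by a factor of the key length. A hedged C sketch (my illustration; the character-cost tally approximates what strcmp may examine, it is not an exact instruction count):

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

long cmp_ops;    /* comparisons as the analyst would count them */
long char_ops;   /* approximate character-level work behind those comparisons */

/* Linear search over an array of strings; returns the index of key or -1.
 * Each strcmp counts as one "comparison", but we also tally the characters
 * it may have to examine (approximated by the stored string's length + 1). */
int str_linear_search(const char **a, size_t n, const char *key)
{
    cmp_ops = 0;
    char_ops = 0;
    for (size_t i = 0; i < n; i++) {
        cmp_ops++;
        char_ops += (long)strlen(a[i]) + 1;  /* upper estimate of strcmp's work */
        if (strcmp(a[i], key) == 0)
            return (int)i;
    }
    return -1;
}
```

With long keys, char_ops dwarfs cmp_ops, which is exactly why a careful analysis states its model of computation before counting anything.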

A good algorithms text will begin by defining a model of computation
and then go on to define run time as the number of execution steps in
that model. Usually it's a RAM (random-access machine) model. IIRC
this is defined early in Aho, Hopcroft and Ullman, for example.