Velocity Reviews > Which is faster?

# Which is faster?

cnb
Guest
Posts: n/a

 08-30-2008
For a big number of reviews it might matter?
Is av_grade O(2n) and the first version O(n) when it comes to the adding, or is
"sum(x for x in y)" just traversing the list once, accumulating the
values, rather than first building the list and then traversing it for the sum?

tot = 0
for review in self.reviews:
    tot += review.grade
return tot / len(self.reviews)

return sum(review.grade for review in self.reviews) / \
       len(self.reviews)

Fredrik Lundh
Guest
Posts: n/a

 08-30-2008
cnb wrote:

> For a big number of reviews it might matter?
> Is av_grade O(2n) and the first version O(n) when it comes to the adding, or is
> "sum(x for x in y)" just traversing the list once, accumulating the
> values, rather than first building the list and then traversing it for the sum?

sum() doesn't build a list, but even if it did, it'd still be
O(n) -- looping over an array twice doesn't change the time complexity.

to find out which one's actually faster, benchmark them.

</F>

Gabriel Genellina
Guest
Posts: n/a

 08-30-2008
On Sat, 30 Aug 2008 01:26:35 -0300, cnb <(E-Mail Removed)> wrote:

> For a big number of reviews it might matter?
> Is av_grade O(2n) and the first version O(n) when it comes to the adding, or is
> "sum(x for x in y)" just traversing the list once, accumulating the
> values, rather than first building the list and then traversing it for the sum?

AFAIK both versions are O(n).
sum(x for x in y) contains a "generator expression" [1], it does not
create an intermediate list.
sum([x for x in y]) contains a "list comprehension" [2] and it does create
an intermediate list.
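A minimal sketch of that difference (the data here is illustrative, not from the thread): both spellings compute the same sum, but only the generator expression is lazy, which is observable when the source is unbounded.

```python
import itertools

# Both spellings compute the same sum; the difference is whether an
# intermediate list ever exists in memory.
y = [5, 3, 4, 2]

sum_gen = sum(x for x in y)    # generator expression: values fed lazily
sum_lst = sum([x for x in y])  # list comprehension: full list built first

assert sum_gen == sum_lst == 14

# Laziness is observable: a generator expression over an infinite source
# still works as long as the consumer stops early. The equivalent list
# comprehension over itertools.count() would never terminate.
first_ten = sum(itertools.islice((x * x for x in itertools.count()), 10))
print(first_ten)  # 0 + 1 + 4 + ... + 81 = 285
```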

I'd say the version using sum() should be faster, because the iteration is
implemented inside the interpreter, not in Python code as in the other one.
But you should actually measure their relative performance; use the timeit
module for that.
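A minimal timeit sketch along those lines (the list contents and repeat counts are arbitrary choices, written for a modern Python 3 session):

```python
from timeit import Timer

# Setup runs once per repetition; the statement runs `number` times.
setup = "alist = list(range(100))"

t_sum = Timer("sum(alist) / len(alist)", setup).repeat(repeat=3, number=10000)
t_loop = Timer("tot = 0.0\nfor x in alist:\n    tot += x\ntot / len(alist)",
               setup).repeat(repeat=3, number=10000)

# Take the minimum of each list of timings: it is the best estimate of
# how fast the machine can run the statement.
print(min(t_sum), min(t_loop))
```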

> tot = 0
> for review in self.reviews:
>     tot += review.grade
> return tot / len(self.reviews)
>
> return sum(review.grade for review in self.reviews) / \
>        len(self.reviews)

[1] http://docs.python.org/tut/node11.ht...00000000000000
[2] http://docs.python.org/tut/node7.html
--
Gabriel Genellina

Steven D'Aprano
Guest
Posts: n/a

 08-30-2008
On Fri, 29 Aug 2008 21:26:35 -0700, cnb wrote:

> tot = 0
> for review in self.reviews:
>     tot += review.grade
> return tot / len(self.reviews)
>
> return sum(review.grade for review in self.reviews) / \
>        len(self.reviews)

Re-writing the functions so they can be tested alone:

def averageGrade(alist):
    tot = 0.0
    for x in alist:
        tot += x
    return tot / len(alist)

def av_grade(alist):
    return sum(alist)/len(alist)

>>> from timeit import Timer
>>> # small amount of items
>>> alist = range(100)
>>> Timer('averageGrade(alist)',
...       'from __main__ import alist, averageGrade').repeat(number=100000)
[3.9559240341186523, 3.4910569190979004, 3.4856188297271729]
>>> Timer('av_grade(alist)',
...       'from __main__ import alist, av_grade').repeat(number=100000)
[2.0255107879638672, 1.0968310832977295, 1.0733180046081543]

The version with sum() is much faster. How about with lots of data?

>>> alist = xrange(1000000)
>>> Timer('averageGrade(alist)',
...       'from __main__ import alist, averageGrade').repeat(number=50)
[17.699107885360718, 18.182793140411377, 18.651514053344727]
>>> Timer('av_grade(alist)',
...       'from __main__ import alist, av_grade').repeat(number=50)
[17.125216007232666, 15.72636890411377, 16.309713840484619]

sum() is still a little faster.

--
Steven

Gabriel Genellina
Guest
Posts: n/a

 08-30-2008
On Sat, 30 Aug 2008 03:15:30 -0300, Steven D'Aprano
<(E-Mail Removed)> wrote:

> On Fri, 29 Aug 2008 21:26:35 -0700, cnb wrote:
> [...]
> Re-writing the functions so they can be tested alone:
> [...]
> The version with sum() is much faster. How about with lots of data?
>
> >>> alist = xrange(1000000)
> >>> Timer('averageGrade(alist)',
> ...       'from __main__ import alist, averageGrade').repeat(number=50)
> [17.699107885360718, 18.182793140411377, 18.651514053344727]
> >>> Timer('av_grade(alist)',
> ...       'from __main__ import alist, av_grade').repeat(number=50)
> [17.125216007232666, 15.72636890411377, 16.309713840484619]
>
> sum() is still a little faster.

Mmm, in this last test you're measuring long-integer arithmetic performance
(because the sum far exceeds what can be represented in a plain integer).
Long integers are so slow that the difference between the two loops becomes
negligible.

I've tried again using float values:
alist = [float(x) for x in xrange(1000000)]
and got consistent results for any input size (the version using sum() is
about twice as fast as the for loop).
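A sketch of that float retest (list size reduced so it runs quickly; the exact ratio will vary by machine and interpreter version, and xrange is spelled range on Python 3):

```python
import timeit

# Float values avoid the arbitrary-precision integer overhead noted above.
setup = "alist = [float(x) for x in range(100000)]"

# Explicit Python-level accumulation loop.
t_loop = min(timeit.repeat(
    "tot = 0.0\nfor x in alist:\n    tot += x\ntot / len(alist)",
    setup=setup, number=20, repeat=3))

# sum()-based version: the iteration runs inside the interpreter.
t_sum = min(timeit.repeat(
    "sum(alist) / len(alist)",
    setup=setup, number=20, repeat=3))

print(t_loop, t_sum)  # sum() is typically the faster of the two
```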

--
Gabriel Genellina

cnb
Guest
Posts: n/a

 08-30-2008
How does doing something twice not change the complexity? Yes, maybe it
belongs to the same complexity class, but it's still twice as slow, no?

Fredrik Lundh
Guest
Posts: n/a

 08-30-2008
cnb wrote:

> How does doing something twice not change the complexity? Yes, maybe it
> belongs to the same complexity class, but it's still twice as slow, no?

doing two things quickly can still be faster than doing one thing slowly.
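A concrete (if illustrative) version of that point: computing a sum and a maximum with two separate C-level passes is usually quicker than one Python-level pass that does both, even though the latter touches each element only once.

```python
import timeit

setup = "data = list(range(100000))"

# Two passes over the data, each executed in C inside the interpreter.
two_passes = "s = sum(data); m = max(data)"

# One pass, but every step runs as interpreted Python bytecode.
one_pass = """
s = 0
m = data[0]
for x in data:
    s += x
    if x > m:
        m = x
"""

t2 = min(timeit.repeat(two_passes, setup=setup, number=50, repeat=3))
t1 = min(timeit.repeat(one_pass, setup=setup, number=50, repeat=3))
print(t2, t1)  # on CPython the two C passes usually win
```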

</F>

Diez B. Roggisch
Guest
Posts: n/a

 08-30-2008
cnb schrieb:
> How does doing something twice not change the complexity? Yes, maybe it
> belongs to the same complexity class, but it's still twice as slow, no?

Because big-O notation ignores constant factors, and even lower-order terms.

http://en.wikipedia.org/wiki/Big_O_notation

Diez

Lie
Guest
Posts: n/a

 08-30-2008
On Aug 30, 5:30 pm, cnb <(E-Mail Removed)> wrote:
> How does doing something twice not change the complexity? Yes, maybe it
> belongs to the same complexity class, but it's still twice as slow, no?

Who is doing something twice? Definitely not sum().

sum() does not create an intermediate list, and if you pass it a generator
expression no intermediate list is built at all: it's a simple loop, just
run in interpreter code. sum() passed a list comprehension may be faster
for extremely small numbers of values, but we don't care about small
things, do we? Even so, the intermediate list is fast enough for even
large numbers of values; the point where it can't cope any more is when
the intermediate list takes half of your memory (and how often does that
happen in regular applications?).
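The memory point can be sketched like this (sizes are approximate and CPython-specific, and sys.getsizeof counts only the container, not the ints it references):

```python
import sys

n = 1000000

# A generator expression is a constant-size object regardless of n.
gen = (x for x in range(n))
gen_size = sys.getsizeof(gen)

# A list comprehension allocates memory proportional to n up front.
lst = [x for x in range(n)]
lst_size = sys.getsizeof(lst)

print(gen_size, lst_size)  # roughly a couple hundred bytes vs several MB

assert sum(gen) == sum(lst)
```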

Fredrik Lundh
Guest
Posts: n/a

 08-30-2008
Lie wrote:

>> How does doing something twice not change the complexity? Yes, maybe it
>> belongs to the same complexity class, but it's still twice as slow, no?

>
> Who is doing something twice? Definitely not sum().

nobody's claiming that -- but as mentioned above, even if sum() had done
two passes over the source sequence, it'd still be O(n).

</F>