Why next/prev for iterators in C++0x?

 
 
Jorgen Grahn
04-27-2011
On Mon, 2011-04-25, James Kanze wrote:
> On Apr 24, 9:02 am, "crea" <(E-Mail Removed)> wrote:
>> "Howard Hinnant" <(E-Mail Removed)> wrote in message

>
>> news:(E-Mail Removed)...
>> On Apr 23, 5:53 pm, Marc <(E-Mail Removed)> wrote:

>
>> " ...
>> for (ForwardIterator1 j = std::next(i); j != last1; ++j)
>> if (pred(*i, *j))
>> ++c1;
>> "

>
>> Just a small comment: I just read an article saying that we
>> don't really (always) need to do ++j, but can do j++. The
>> reason is that the compiler can often trim the code when
>> it compiles. Also, testing showed that there was no big
>> difference even when j++ was used. But I have to read the
>> article again...
>> On the other hand, I don't know why using ++j would not be good.

>
> In practice, there's no valid technical reason for preferring
> one over the other---it's just a question of which one you like
> best. Politically speaking, however... some noted authors have
> claimed otherwise, and have influenced a large number of
> programmers; it's easier to just use ++j than to argue with
> them. (FWIW: K&R favored j++, and at least in the earlier
> versions of his books, so did Stroustrup, so for older
> programmers, who learned from the original masters, j++ often
> seems more natural. Just because we've seen it more often,
> however; not for any technical reason.)


So what about the argument that j++ can be a lot slower for
user-defined overloadings? Is that just politics, no longer true,
irrelevant, or ... ?

/Jorgen

--
// Jorgen Grahn <grahn@ Oo o. . .
\X/ snipabacken.se> O o .
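
For reference, the cost difference being asked about comes from how
postfix increment is conventionally written for a user-defined iterator:
it must return the value from before the increment, which means making a
copy. A minimal sketch (MyIter is purely hypothetical, a thin wrapper
over a pointer):

// Hypothetical iterator wrapping a raw pointer, for illustration only.
struct MyIter {
    int* p;

    // Prefix: increment in place, return *this by reference -- no copy.
    MyIter& operator++() { ++p; return *this; }

    // Postfix: must hand back the pre-increment value, so the canonical
    // form copies the whole iterator first.
    MyIter operator++(int) {
        MyIter old = *this;   // the extra copy under discussion
        ++p;
        return old;
    }

    int& operator*() const { return *p; }
    bool operator!=(MyIter other) const { return p != other.p; }
};

For a wrapper this thin the copy is trivial; the question is what happens
when the iterator is not so thin.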
 
Kai-Uwe Bux
04-27-2011
Jorgen Grahn wrote:

> On Mon, 2011-04-25, James Kanze wrote:
>> On Apr 24, 9:02 am, "crea" <(E-Mail Removed)> wrote:
>>> "Howard Hinnant" <(E-Mail Removed)> wrote in message

>>
>>> news:(E-Mail Removed)...
>>> On Apr 23, 5:53 pm, Marc <(E-Mail Removed)> wrote:

>>
>>> " ...
>>> for (ForwardIterator1 j = std::next(i); j != last1; ++j)
>>> if (pred(*i, *j))
>>> ++c1;
>>> "

>>
>>> Just a small comment: I just read an article saying that we
>>> don't really (always) need to do ++j, but can do j++. The
>>> reason is that the compiler can often trim the code when
>>> it compiles. Also, testing showed that there was no big
>>> difference even when j++ was used. But I have to read the
>>> article again...
>>> On the other hand, I don't know why using ++j would not be good.

>>
>> In practice, there's no valid technical reason for preferring
>> one over the other---it's just a question of which one you like
>> best. Politically speaking, however... some noted authors have
>> claimed otherwise, and have influenced a large number of
>> programmers; it's easier to just use ++j than to argue with
>> them. (FWIW: K&R favored j++, and at least in the earlier
>> versions of his books, so did Stroustrup, so for older
>> programmers, who learned from the original masters, j++ often
>> seems more natural. Just because we've seen it more often,
>> however; not for any technical reason.)

>
> So what about the argument that j++ can be a lot slower for
> user-defined overloadings? Is that just politics, no longer true,
> irrelevant, or ... ?


To me, that argument _always_ sounded like premature optimization. I wonder
whether it has ever been substantiated by measurements. I also wonder if
there are any _recent_ measurements supporting that notion: after all,
improvements in compiler technology may have rendered this worry obsolete.
(And I would not be surprised if those improvements have been around for
some 30 years.)


Best,

Kai-Uwe Bux
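
For anyone who wants numbers rather than speculation, the claim is easy to
measure. A rough micro-benchmark sketch, comparing the two forms on a
node-based iterator (std::list<int>::iterator); the timings will of course
depend on the compiler, library, and optimization level:

#include <chrono>
#include <iostream>
#include <list>

int main() {
    std::list<int> l(1000000, 1);
    using clock = std::chrono::steady_clock;

    long long sum1 = 0;
    clock::time_point t0 = clock::now();
    for (std::list<int>::iterator i = l.begin(); i != l.end(); ++i)
        sum1 += *i;                     // prefix increment
    clock::time_point t1 = clock::now();

    long long sum2 = 0;
    for (std::list<int>::iterator i = l.begin(); i != l.end(); i++)
        sum2 += *i;                     // postfix increment
    clock::time_point t2 = clock::now();

    std::cout << "++i: " << std::chrono::duration<double>(t1 - t0).count() << " s\n"
              << "i++: " << std::chrono::duration<double>(t2 - t1).count() << " s\n"
              << (sum1 + sum2) << '\n'; // keep the sums observable
}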
 
SG
04-27-2011
On 27 Apr., 10:04, Kai-Uwe Bux wrote:
> Jorgen Grahn wrote:
> > > [...]

> > So what about the argument that j++ can be a lot slower for
> > user-defined overloadings? Is that just politics, no longer true,
> > irrelevant, or ... ?

>
> To me, that argument _always_ sounded like premature optimization.


Depends on how you define premature optimization. I have no problem
with writing "++j" instead of "j++" in the first place. It doesn't
take _any_ more effort, obviously. And since I spend no extra time on
this "optimization" it also does not have to pay off. So, in the worst
case I lost nothing and gained nothing. For me "premature
optimization" implies that the time one spends optimizing will not pay
off.

> I also wonder if
> there are any _recent_ measurements supporting that notion: after all,
> improvements in compiler technology may have rendered this worry obsolete.


In the case of a UDT that is trivially copyable and has an inline post-
increment operator, I can imagine that compilers might be smart enough
to optimize the unnecessary copy away.

SG
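
To make that concrete: when the postfix form is written in terms of the
prefix form and the returned copy is never used, the copy is dead code and
an optimizer may remove it. A sketch, assuming a hypothetical trivially
copyable PtrIter (this is a property of the optimizer, not a guarantee):

// Hypothetical, trivially copyable iterator; postfix is defined in
// terms of prefix, as is conventional.
struct PtrIter {
    const int* p;
    PtrIter& operator++()    { ++p; return *this; }
    PtrIter  operator++(int) { PtrIter old = *this; ++(*this); return old; }
    const int& operator*() const { return *p; }
    bool operator!=(PtrIter rhs) const { return p != rhs.p; }
};

int sum(PtrIter first, PtrIter last) {
    int s = 0;
    for (; first != last; first++)   // the returned copy is discarded...
        s += *first;                 // ...so it is a candidate for removal
    return s;
}

With optimization enabled, a loop like this typically compiles to the same
code whichever increment form is used.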
 
Jorgen Grahn
04-27-2011
On Wed, 2011-04-27, SG wrote:
> On 27 Apr., 10:04, Kai-Uwe Bux wrote:
>> Jorgen Grahn wrote:
>> > > [...]
>> > So what about the argument that j++ can be a lot slower for
>> > user-defined overloadings? Is that just politics, no longer true,
>> > irrelevant, or ... ?

>>
>> To me, that argument _always_ sounded like premature optimization.

>
> Depends on how you define premature optimization. I have no problem
> with writing "++j" instead of "j++" in the first place. It doesn't
> take _any_ more effort, obviously.


Personally, I still find ++j much uglier than j++ [1], so if someone can
convince me that I can stop using the ++j idiom I gladly will. But I
need to be convinced first.

I can't see how a compiler can magically optimize this in the general
case, and I can't see how you can prove that the performance loss is
always negligible. This is the kind of thing you do thousands of times
in tight loops, like the ones in <algorithm> ...

Perhaps I misinterpreted what Kanze wrote.

/Jorgen
[1] No rational reasons for this, just too many years doing C.

--
// Jorgen Grahn <grahn@ Oo o. . .
\X/ snipabacken.se> O o .
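
For what it's worth, the tight loops in <algorithm> really are written with
the prefix form. A sketch of the usual pattern for a count_if-style
algorithm (the usual shape, not any particular library's source):

#include <iterator>

// Iterators are taken by value and advanced with prefix ++.
template <class InputIterator, class Predicate>
typename std::iterator_traits<InputIterator>::difference_type
count_if_sketch(InputIterator first, InputIterator last, Predicate pred)
{
    typename std::iterator_traits<InputIterator>::difference_type n = 0;
    for (; first != last; ++first)
        if (pred(*first))
            ++n;
    return n;
}

This matches the ++j usage in the snippet quoted at the top of the thread.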
 
Bo Persson
04-27-2011
Jorgen Grahn wrote:
> On Wed, 2011-04-27, SG wrote:
>> On 27 Apr., 10:04, Kai-Uwe Bux wrote:
>>> Jorgen Grahn wrote:
>>>>> [...]
>>>> So what about the argument that j++ can be a lot slower for
>>>> user-defined overloadings? Is that just politics, no longer true,
>>>> irrelevant, or ... ?
>>>
>>> To me, that argument _always_ sounded like premature optimization.

>>
>> Depends on how you define premature optimization. I have no problem
>> with writing "++j" instead of "j++" in the first place. It doesn't
>> take _any_ more effort, obviously.

>
> Personally, I still find ++j much uglier than j++ [1], so if
> someone can convince me that I can stop using the ++j idiom I
> gladly will. But I need to be convinced first.
>
> I can't see how a compiler can magically optimize this in the
> general case, and I can't see how you can prove that the
> performance loss is always negligible. This is the kind of thing
> you do thousands of times in tight loops, like the ones in
> <algorithm> ...
>


The general idea is that an iterator is a lightweight, cheap-to-copy
object. That's one reason for the <algorithm>s to pass iterators by
value.

And in *most* cases, the compiler will be able to see through the
temporary copy created by the post-increment, and notice that it is
never used and has no side effects.

The "general case" is harder, but presumably also less frequent.


Bo Persson
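
The "general case" here would be an iterator whose copy is genuinely
expensive -- one that carries heap-allocated state, say. A contrived
sketch (TokenIter is hypothetical, purely to show where j++ could still
hurt):

#include <cstddef>
#include <string>

// Contrived iterator dragging a std::string along; copying it may allocate.
struct TokenIter {
    const std::string* lines;    // underlying array of lines
    std::size_t index;
    std::string context;         // heavy per-iterator state

    TokenIter& operator++() {
        context += lines[index]; // accumulate as we go
        ++index;
        return *this;
    }
    TokenIter operator++(int) {
        TokenIter old = *this;   // copies 'context' -- an allocation the
        ++(*this);               // optimizer generally cannot remove
        return old;
    }
    const std::string& operator*() const { return lines[index]; }
    bool operator!=(const TokenIter& rhs) const { return index != rhs.index; }
};

For something like this, i++ in a tight loop really does cost more than ++i,
and no amount of compiler cleverness is guaranteed to save it.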


 
Öö Tiib
04-27-2011
On Apr 27, 2:33 pm, Jorgen Grahn <(E-Mail Removed)> wrote:
> On Wed, 2011-04-27, SG wrote:
> > On 27 Apr., 10:04, Kai-Uwe Bux wrote:
> >> Jorgen Grahn wrote:
> >> > > [...]
> >> > So what about the argument that j++ can be a lot slower for
> >> > user-defined overloadings? Is that just politics, no longer true,
> >> > irrelevant, or ... ?

>
> >> To me, that argument _always_ sounded like premature optimization.

>
> > Depends on how you define premature optimization. I have no problem
> > with writing "++j" instead of "j++" in the first place. It doesn't
> > take _any_ more effort, obviously.

>
> Personally, I still find ++j much uglier than j++ [1], so if someone can
> convince me that I can stop using the ++j idiom I gladly will. But I
> need to be convinced first.


I don't see the aesthetics; perhaps I don't have a taste for art. Both
are easy and short to type. It is easy to think of ++j as "increment
j", so it is easy to read and to reason about.

How should one think of j++? "j, and then increment"? "Increment j and
tell me what it was before"? I mostly avoid it because I can't find a
good mental image for the postfix operator ++.

> I can't see how a compiler can magically optimize this in the general
> case, and I can't see how you can prove that the performance loss is
> always negligible. This is the kind of thing you do thousands of times
> in tight loops, like the ones in <algorithm> ...
>
> Perhaps I misinterpreted what Kanze wrote.


Compilers are impressive these days and discard most code that does not
affect the program's observable behavior. If j++ is less error-prone and
less confusing for your team to read than ++j, then you should agree to
prefer it.
For me it is the other way around.
 
Jorgen Grahn
04-28-2011
On Wed, 2011-04-27, Bo Persson wrote:
> Jorgen Grahn wrote:
>> On Wed, 2011-04-27, SG wrote:
>>> On 27 Apr., 10:04, Kai-Uwe Bux wrote:
>>>> Jorgen Grahn wrote:
>>>>>> [...]
>>>>> So what about the argument that j++ can be a lot slower for
>>>>> user-defined overloadings? Is that just politics, no longer true,
>>>>> irrelevant, or ... ?
>>>>
>>>> To me, that argument _always_ sounded like premature optimization.
>>>
>>> Depends on how you define premature optimization. I have no problem
>>> with writing "++j" instead of "j++" in the first place. It doesn't
>>> take _any_ more effort, obviously.

>>
>> Personally, I still find ++j much uglier than j++ [1], so if
>> someone can convince me that I can stop using the ++j idiom I
>> gladly will. But I need to be convinced first.
>>
>> I can't see how a compiler can magically optimize this in the
>> general case, and I can't see how you can prove that the
>> performance loss is always negligible. This is the kind of thing
>> you do thousands of times in tight loops, like the ones in
>> <algorithm> ...
>>

>
> The general idea is that an iterator is a light weight, cheap to copy
> object. That's one reason for the <algorithm>s to pass iterators by
> value.
>
> And in *most* cases, the compiler will be able to see through the
> temporary copy created by the post increment, and notice that it is
> never used and has no side effects.
>
> The "general case" is harder, but presumable also less frequent.


And if I have a heavy-weight j++ which doesn't inline, that means I'm
already in conflict with <algorithm>. Yes, I can buy that argument.

/Jorgen

--
// Jorgen Grahn <grahn@ Oo o. . .
\X/ snipabacken.se> O o .
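
Which ties back to the subject line: C++0x's std::next and std::prev in
<iterator> take an iterator by value and return the advanced copy by value,
so they lean on the same assumption that iterators are cheap to copy. Usage
is straightforward:

#include <iostream>
#include <iterator>
#include <list>

int main() {
    std::list<int> l;
    for (int v = 1; v <= 4; ++v) l.push_back(v);     // l = 1 2 3 4

    std::list<int>::iterator i = l.begin();
    std::list<int>::iterator j = std::next(i);       // copy of i, advanced by one
    std::list<int>::iterator k = std::next(i, 3);    // advanced by three

    std::cout << *j << ' ' << *k << '\n';            // prints "2 4"
    std::cout << *std::prev(l.end()) << '\n';        // prints "4"
}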
 