What has C++ become?

 
 
James Kanze
06-11-2008
On Jun 10, 11:36 am, Ian Collins <(E-Mail Removed)> wrote:
> James Kanze wrote:
> > On Jun 10, 12:39 am, (E-Mail Removed) wrote:


> >> If your process is designed for rapid building to offset
> >> the cost of extra coupling then the advantages of templates
> >> may outweigh the cost. If a clean build of your project
> >> takes a long time, the productivity cost will outweigh any
> >> benefits.


> > The clean build isn't the problem. You can schedule that
> > overnight, or for a weekend. (For my library, a clean build
> > for all of the versions I support under Unix takes something
> > like eight hours. Which doesn't bother me too much.) The
> > problem is the incremental builds when someone bug-fixes
> > something in the implementation. For non-templates, that
> > means recompiling a single .cc file; for templates,
> > recompiling all source files which include the header. A
> > difference between maybe 5 seconds, and a couple of minutes.
> > Which is a very significant difference if you're sitting in
> > front of the computer, waiting for it to finish.


> You can say the same for a change to any header.


Yes. Which is why you don't want to modify headers more often
than necessary. And why you ban as many implementation details
as possible from the headers, and use the compilation firewall
idiom rather regularly. And strictly limit the use of templates
and inline functions, since both require the implementation in
the header.
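
For anyone who hasn't seen it, the compilation firewall (pimpl)
idiom in a minimal sketch (the names are made up): all the data
members live behind one opaque pointer, so clients recompile only
when the header changes, not when the implementation does.

    // widget.h -- clients see only an opaque pointer.
    class WidgetImpl;               // forward declaration only

    class Widget {
    public:
        Widget();
        ~Widget();
        void draw() const;
    private:
        WidgetImpl* impl;           // implementation details hidden
    };

    // widget.cc -- the only file recompiled when the
    // implementation changes.
    #include "widget.h"

    class WidgetImpl {
    public:
        void draw() const { /* ... */ }
    };

    Widget::Widget() : impl(new WidgetImpl) {}
    Widget::~Widget() { delete impl; }
    void Widget::draw() const { impl->draw(); }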

--
James Kanze (GABI Software) email:(E-Mail Removed)
Conseils en informatique orientée objet/
Beratung in objektorientierter Datenverarbeitung
9 place Sémard, 78210 St.-Cyr-l'École, France, +33 (0)1 30 23 00 34
 
James Kanze
06-11-2008
On Jun 11, 12:11 am, Noah Roberts <(E-Mail Removed)> wrote:
> James Kanze wrote:
> > The clean build isn't the problem. You can schedule that
> > overnight, or for a weekend. (For my library, a clean build
> > for all of the versions I support under Unix takes something
> > like eight hours. Which doesn't bother me too much.) The
> > problem is the incremental builds when someone bug-fixes
> > something in the implementation. For non-templates, that
> > means recompiling a single .cc file; for templates,
> > recompiling all source files which include the header. A
> > difference between maybe 5 seconds, and a couple of minutes.
> > Which is a very significant difference if you're sitting in
> > front of the computer, waiting for it to finish.


> See the "Stable Dependencies Principle" and the "Stable Abstractions
> Principle".


> http://www.objectmentor.com/resource.../stability.pdf


> "Thus, the software that encapsulates the *high level design
> model* of the system should be placed into stable packages."


> - Emphasis added -


> "[The Stable Abstractions Principle] says that a stable
> package should also be abstract so that its stability does not
> prevent it from being extended."


> Robert C. Martin's article on stability principles pretty much
> stands against everything you've said in this thread to date.


You've obviously not understood the article, or what I've been
saying. The abstraction and the design should be stable. Its
implementation won't necessarily be. The problem with C++
templates, as currently implemented by most compilers, is that
they require the implementation in places where, logically, you
should only have the abstraction. And thus they introduce
instability in places where you don't want it.

> Templates are the epitome of abstraction.


I wouldn't go that far. They're a tool which can help to
implement certain types of abstraction.

> Perhaps if you were not so anti-template


I'm not anti-template. I'm very much in favor of templates. So
much, in fact, that I'd actually like to see compilers
implement them in a way that is usable in practice (with
export). My complaints aren't with templates; they're with the
cruddy implementations I have of them.

> you'd do some looking into how to make the best use of them
> and you would not be arguing about changing templates causing
> long builds; you'd be well aware that you simply don't change
> templates that often.


That's true for things like the standard library, and lower
level code. On the other hand, if you're not changing your
application, then what are you doing when you program?

[...]
> Of course, you need to go back and read about the other design
> principles that Martin describes in order to see the entire
> reasoning behind why you put the *high level code* in your
> stable, abstract packages. I'm not begging an authority,
> Martin's stuff just happens to be very good and the reasoning
> stands on its own.


> The principles of OOD translate very well to Generic Programming.


I have a very high regard for Robert Martin; he's one of the
people who taught me C++. The problem is simply pragmatic;
templates don't really work with most compilers. For most
compilers, they're really just elaborate macros, with all the
problems macros entail.

--
James Kanze (GABI Software) email:(E-Mail Removed)
Conseils en informatique orientée objet/
Beratung in objektorientierter Datenverarbeitung
9 place Sémard, 78210 St.-Cyr-l'École, France, +33 (0)1 30 23 00 34
 
James Kanze
06-11-2008
On Jun 11, 4:54 am, Michael Furman <(E-Mail Removed)> wrote:
> James Kanze wrote:
> > ....


> > The clean build isn't the problem. You can schedule that
> > overnight, or for a weekend. (For my library, a clean build
> > for all of the versions I support under Unix takes something
> > like eight hours. Which doesn't bother me too much.) The
> > problem is the incremental builds when someone bug-fixes
> > something in the implementation. For non-templates, that
> > means recompiling a single .cc file; for templates,
> > recompiling all source files which include the header. A
> > difference between maybe 5 seconds, and a couple of minutes.
> > Which is a very significant difference if you're sitting in
> > front of the computer, waiting for it to finish.


> I love it when compilation takes more than a couple of seconds: I
> have extra time to think! Sometimes it ends with killing the
> compilation and doing something else, rather than trying the
> result.


Interesting development process. I usually try to think before
editing, much less compiling.

--
James Kanze (GABI Software) email:(E-Mail Removed)
Conseils en informatique orientée objet/
Beratung in objektorientierter Datenverarbeitung
9 place Sémard, 78210 St.-Cyr-l'École, France, +33 (0)1 30 23 00 34
 
Ian Collins
06-11-2008
James Kanze wrote:
> On Jun 10, 11:36 am, Ian Collins <(E-Mail Removed)> wrote:
>> James Kanze wrote:
>>> On Jun 10, 12:39 am, (E-Mail Removed) wrote:

>
>>>> If your process is designed for rapid building to offset
>>>> the cost of extra coupling then the advantages of templates
>>>> may outweigh the cost. If a clean build of your project
>>>> takes a long time, the productivity cost will outweigh any
>>>> benefits.

>
>>> The clean build isn't the problem. You can schedule that
>>> overnight, or for a weekend. (For my library, a clean build
>>> for all of the versions I support under Unix takes something
>>> like eight hours. Which doesn't bother me too much.) The
>>> problem is the incremental builds when someone bug-fixes
>>> something in the implementation. For non-templates, that
>>> means recompiling a single .cc file; for templates,
>>> recompiling all source files which include the header. A
>>> difference between maybe 5 seconds, and a couple of minutes.
>>> Which is a very significant difference if you're sitting in
>>> front of the computer, waiting for it to finish.

>
>> You can say the same for a change to any header.

>
> Yes. Which is why you don't want to modify headers more often
> than necessary. And why you ban as many implementation details
> as possible from the headers, and use the compilation firewall
> idiom rather regularly. And strictly limit the use of templates
> and inline functions, since both require the implementation in
> the header.
>

Inline functions tend not to be too big an issue in practice. They tend
to be trivial and "write once".

As for templates, I leave that call to the team. By their nature,
templates tend to be used for utility functions which, like trivial
inline functions, tend to change less frequently than specific
application code.

Another common situation is that a template is introduced to avoid
unnecessary duplication: it is only used where the duplication would
otherwise occur. If the template weren't there, the duplicated code
would have to change instead of the template. Without the template,
the cost in recompilation would be the same, but the cost of making
the coding changes would be greater.
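
A contrived illustration (the names are made up): one function
template standing in for two near-identical overloads, so a fix to
the logic happens in one place instead of two.

    #include <iostream>

    // Previously duplicated once for int and once for double;
    // the template removes the duplication.
    template <typename T>
    T clamp(T value, T low, T high) {
        if (value < low)  return low;
        if (value > high) return high;
        return value;
    }

    int main() {
        std::cout << clamp(15, 0, 10) << '\n';     // prints 10
        std::cout << clamp(2.5, 0.0, 1.0) << '\n'; // prints 1
    }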

So I wouldn't go so far as to say strictly limit the use of templates
and inline functions; just treat them like any other tool.

Gratuitous use of templates is another matter, but peer pressure should
solve that problem.

--
Ian Collins.
 
Jerry Coffin
06-11-2008
In article <667de91e-9fa1-4631-9e0d-caddab7db006@d45g2000hsc.googlegroups.com>,
(E-Mail Removed) says...

[ ... ]

> (In the end, much advanced optimization involves visiting nodes
> in a graph, and I think that there are ways to parallelize
> this, although I don't know whether they are practical or only
> theoretical.)


Yes and no. For example, a depth-first-search of a general graph has
been studied pretty extensively. I don't know of anything _proving_ that
it can't be done in parallel productively, but there are definitely some
pretty strong indications in that direction*.

OTOH, I believe for a compiler you're dealing primarily with DAGs. I'm
pretty sure a depth-first search of a DAG _can_ be done in parallel
productively -- at least if they're large enough for the savings from
parallelization to overcome communication overhead and such.

I'm not sure whether a compiler typically generates DAGs that large or
not. I've written a few small compilers, but don't recall having
instrumented the size of graphs they worked with. My guess is that if
you only do function-level optimization, they're usually going to be too
small for it to help, but if you do global optimization, they might
easily become large enough -- but that's purely my feeling; I don't have
any solid data to support it, and we all know how undependable that is.
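
As a sketch of one approach (mine, not anything from a real
compiler, and nothing I've measured): process the DAG in
topological levels with Kahn's algorithm, visiting each level's
nodes in parallel. That isn't a depth-first order, but for
dependency-style passes it's often what you actually want.

    // build: g++ -std=c++11 -pthread
    #include <cstdio>
    #include <thread>
    #include <vector>

    // succ[n] lists the successors of node n.
    struct Dag { std::vector<std::vector<int>> succ; };

    void visit(int node) { std::printf("visiting node %d\n", node); }

    // Kahn's algorithm with one thread per node in each level:
    // every node in a level has all its predecessors already
    // visited, so the levels are the natural parallel unit.
    void parallel_visit(const Dag& g) {
        std::vector<int> indegree(g.succ.size(), 0);
        for (const auto& ss : g.succ)
            for (int s : ss) ++indegree[s];

        std::vector<int> level;
        for (int n = 0; n < (int)g.succ.size(); ++n)
            if (indegree[n] == 0) level.push_back(n);

        while (!level.empty()) {
            std::vector<std::thread> workers;
            for (int n : level) workers.emplace_back(visit, n);
            for (auto& t : workers) t.join();

            std::vector<int> next;   // nodes whose last predecessor just ran
            for (int n : level)
                for (int s : g.succ[n])
                    if (--indegree[s] == 0) next.push_back(s);
            level = std::move(next);
        }
    }

    int main() {
        Dag g{{{1, 2}, {3}, {3}, {}}};  // diamond: 0 -> {1,2} -> 3
        parallel_visit(g);
    }

Whether the per-level fan-out ever beats the thread overhead is
exactly the "large enough graph" question above.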

[ ... ]

> And for the application headers, even farming the compiles out
> to different machines (in parallel) may not work; since the
> application headers will normally reside on one machine, you may
> end up saturating the network. (I've seen this in real life.
> The usual ethernet degrades rapidly when the number of
> collisions gets too high.)


Does anybody really use hubs anymore? Using switched Ethernet,
collisions are quite rare, even when the network is _heavily_ loaded.

--
Later,
Jerry.

The universe is a figment of its own imagination.
 
Jerry Coffin
06-11-2008
In article <(E-Mail Removed)>,
(E-Mail Removed) says...

[ ... ]

> Yes and no. For example, a depth-first-search of a general graph has
> been studied pretty extensively. I don't know of anything _proving_ that
> it can't be done in parallel productively, but there are definitely some
> pretty strong indications in that direction*.


Oops -- I left out the footnote I intended there:

J.H. Reif, _Depth-first Search is Inherently Sequential_,
Information Processing Letters 20, 1985.

--
Later,
Jerry.

The universe is a figment of its own imagination.
 
Noah Roberts
06-11-2008
James Kanze wrote:
> The problem is simply pragmatic;
> templates don't really work with most compilers. For most
> compilers, they're really just elaborate macros, with all the
> problems macros entail.


LOL! And the story changes yet again.
 
Noah Roberts
06-11-2008
James Kanze wrote:

>> Robert C. Martin's article on stability principles pretty much
>> stands against everything you've said in this thread to date.

>
> You've obviously not understood the article, or what I've been
> saying. The abstraction and the design should be stable. Its
> implementation won't necessarily be. The problem with C++
> templates, as currently implemented by most compilers, is that
> they require the implementation in places where, logically, you
> should only have the abstraction. And thus they introduce
> instability in places where you don't want it.


I'm afraid it is you who have not understood the article. Perhaps you
are not familiar with the dependency inversion principle.

You're calling the STL "low level code" when it is, in fact, high level
code. The find_if algorithm, for instance, is a high level algorithm
that can be used with any data type that obeys the iterator abstraction.
The containers, for another example, are generic, high level
constructs that can contain any data type that implements the required
concepts. As that article states, stability and dependencies
move toward high level code, and templates tend to be very
independent, with few dependencies of their own.
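
To make that concrete (my example, not one from the article):
the same find_if works unchanged over any container whose
iterators and elements satisfy its requirements. The algorithm
depends on the abstraction, never on a concrete type.

    #include <algorithm>
    #include <iostream>
    #include <list>
    #include <vector>

    bool negative(int x) { return x < 0; }

    int main() {
        std::vector<int> v = {3, 7, -2, 9};
        std::list<int>   l = {4, -1, 6};

        // The same high level algorithm works over both containers;
        // it depends only on the iterator abstraction they implement.
        auto vi = std::find_if(v.begin(), v.end(), negative);
        auto li = std::find_if(l.begin(), l.end(), negative);

        if (vi != v.end()) std::cout << "vector: " << *vi << '\n';  // -2
        if (li != l.end()) std::cout << "list: "   << *li << '\n';  // -1
    }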

Being high level code, the internals of templates simply do not
change very often. For example, there is a limited number of ways
to do a qsort. Once the concepts required to do such a sort are
established, the implementation of the *high level* algorithm does
not need to change. What has to change is any object that wants to
depend upon the qsort template: it must implement the concepts that
the qsort interface imposes upon its clients.
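
For instance (again my illustration, with made-up types):
std::sort never changes; the client type adapts to the
algorithm's requirement by supplying a strict weak ordering.

    #include <algorithm>
    #include <iostream>
    #include <string>
    #include <vector>

    struct Employee {
        std::string name;
        int id;
    };

    // The client adapts to the algorithm's interface (a strict
    // weak ordering); the sort algorithm itself never changes.
    bool byId(const Employee& a, const Employee& b) { return a.id < b.id; }

    int main() {
        std::vector<Employee> staff = {{"Ada", 3}, {"Bob", 1}, {"Cid", 2}};
        std::sort(staff.begin(), staff.end(), byId);
        for (const auto& e : staff)
            std::cout << e.id << ' ' << e.name << '\n';  // 1 Bob, 2 Cid, 3 Ada
    }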

In other words, the dependency in the relationship between an STL
container and its contents is not the container depending on the
contents, but the other way around. The STL container depends on no
concrete object, imposes an interface on all its contents, and works at
a more abstract level than the objects it contains. It is an abstract
kind of object and this is exactly what *high level* code is.

I simply do not know why you are calling these things "low level
code", and you've spent no time defending that assertion whatsoever.
This is not out of unawareness of my disagreement, for I've
expressed it several times. Robert Martin says exactly the same
thing I've just expressed, in the article I've cited and in his
article on the Dependency Inversion Principle.

Furthermore, as expressed in this later article, his statements
apply to the implementation of higher level objects and not just
their interfaces. This is implied by his Open/Closed Principle,
just like all the other principles he talks about. That principle
states that an object should be open to extension, but *closed to
modification*. Any time you change the implementation of any
function or object, you've violated that principle.
 
Ian Collins
06-11-2008
Walter Bright wrote:
> James Kanze wrote:
>> And for the application headers, even farming the compiles out
>> to different machines (in parallel) may not work; since the
>> application headers will normally reside on one machine, you may
>> end up saturating the network. (I've seen this in real life.
>> The usual ethernet degrades rapidly when the number of
>> collisions gets too high.)

>
> I think Symantec C++ was the first to do distributed builds with their
> "netbuild" feature in the early 90's. The problem was, as you said, the
> network congestion of transmitting all the header files around more than
> ate up the time saved.


Not too big a deal with a modern OS and a big enough bucket of RAM to
cache them.

--
Ian Collins.
 
coal@mailvault.com
06-11-2008
On Jun 11, 1:43 pm, Walter Bright <(E-Mail Removed)>
wrote:
> James Kanze wrote:
> > And for the application headers, even farming the compiles out
> > to different machines (in parallel) may not work; since the
> > application headers will normally reside on one machine, you may
> > end up saturating the network. (I've seen this in real life.
> > The usual ethernet degrades rapidly when the number of
> > collisions gets too high.)

>
> I think Symantec C++ was the first to do distributed builds with their
> "netbuild" feature in the early 90's. The problem was, as you said, the
> network congestion of transmitting all the header files around more than
> ate up the time saved.


Did they use any compression? In my opinion, it should be used.

Brian Wood
Ebenezer Enterprises
www.webEbenezer.net
 