Velocity Reviews - Computer Hardware Reviews

Velocity Reviews > Newsgroups > Programming > C Programming > Writing "absolutely" portable code


Writing "absolutely" portable code

 
 
Nick Bowler
01-17-2012
On Tue, 10 Jan 2012 17:44:07 +0100, Dirk Zabel wrote:
> Am 09.01.2012 23:50, schrieb Keith Thompson:
>> One advantage of GNU autotools, from the perspective of a user
>> installing software from source, is that they impose a consistent set of
>> commands to build and install packages. A typical sequence is:
>>
>> tar zxf foobar-0.1.tar.gz
>> cd foobar-0.1
>> ./configure --prefix=/some/where/foo-0.1

> If it works, fine.
> Unfortunately, this step very often does not work for me. I am regularly
> trying to cross-compile for an embedded, ppc-based linux system, which
> in theory should be possible with the configure-parameter
>
> --host=ppc-XXX-linux
>
> Very often I have to work for a long time before configure works.

[snip rant about various failures that occur during cross compilation].

All of this is the result of a single problem: the package developer(s)
did not test cross compilation, a feature which autoconf was explicitly
designed to support. They did not try it even once. Or if they did,
they did not bother to fix the problems that they doubtless encountered.

> So from these experiences, I would greatly prefer a manually written,
> clean makefile together with some pre-written config.h where the
> system-dependent definitions are visible.


I cannot fathom why anyone would expect a from-scratch build system
written by such developers to feature better support for cross
compilation.
 
ec429
01-17-2012
On 17/01/12 20:36, Nick Bowler wrote:
> On Tue, 10 Jan 2012 17:44:07 +0100, Dirk Zabel wrote:
>> So from these experiences, I would greatly prefer a manually written,
>> clean makefile together with some pre-written config.h where the
>> system-dependent definitions are visible.

>
> I cannot fathom why anyone would expect a from-scratch build system
> written by such developers to feature better support for cross
> compilation.

Perhaps because a build system that doesn't try to autoguess the values
of system-dependent things, doesn't break when the build machine and
target machine are different?
If you're cross-compiling, and the package uses autoconf, how's it going
to find out the values it normally gets by compiling and running test
programs?
Autotools tries to be a generic tool, and doesn't do the best job of
it. A clean makefile and a config.h have the important advantage of
being /minimal/; they don't know, nor seek to know, anything that they
don't need - and they don't try to be clever, either.

--
'sane', adj.: see 'unimaginative'
on the web - http://jttlov.no-ip.org
 
Ben Pfaff
01-17-2012
ec429 <(E-Mail Removed)> writes:

> On 17/01/12 20:36, Nick Bowler wrote:
>> On Tue, 10 Jan 2012 17:44:07 +0100, Dirk Zabel wrote:
>>> So from these experiences, I would greatly prefer a manually written,
>>> clean makefile together with some pre-written config.h where the
>>> system-dependent definitions are visible.

>>
>> I cannot fathom why anyone would expect a from-scratch build system
>> written by such developers to feature better support for cross
>> compilation.

> Perhaps because a build system that doesn't try to autoguess the
> values of system-dependent things, doesn't break when the build
> machine and target machine are different?
> If you're cross-compiling, and the package uses autoconf, how's it
> going to find out the values it normally gets by compiling and running
> test programs?


Autoconf doesn't normally try to run test programs. Most of the
time, it only tries to compile and link them. In the cases where
running is desirable, normally one makes the Autoconf tests take
a conservative guess at the result.
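
As a sketch of the mechanism (the test and cache-variable names below are
illustrative, not from any real package): AC_RUN_IFELSE takes a fourth
argument that is used when cross-compiling, so a run-test can fall back to
a guess instead of aborting:

```m4
dnl configure.ac fragment (illustrative names): compile and run a test
dnl program when the build machine can execute the result; when
dnl cross-compiling, the fourth argument supplies a conservative guess.
AC_CACHE_CHECK([whether snprintf returns the full length],
  [my_cv_snprintf_c99],
  [AC_RUN_IFELSE(
    [AC_LANG_PROGRAM([[#include <stdio.h>]],
      [[char b[2]; return snprintf(b, sizeof b, "%d", 100) == 3 ? 0 : 1;]])],
    [my_cv_snprintf_c99=yes],
    [my_cv_snprintf_c99=no],
    [my_cv_snprintf_c99=no])])
```

When the guess is wrong, the user can override it by presetting the cache
variable before running configure.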
--
A competent C programmer knows how to write C programs correctly,
a C expert knows enough to argue with Dan Pop, and a C expert
expert knows not to bother.
 
Kaz Kylheku
01-18-2012
On 2012-01-17, Ben Pfaff <(E-Mail Removed)> wrote:
> ec429 <(E-Mail Removed)> writes:
>
>> On 17/01/12 20:36, Nick Bowler wrote:
>>> On Tue, 10 Jan 2012 17:44:07 +0100, Dirk Zabel wrote:
>>>> So from these experiences, I would greatly prefer a manually written,
>>>> clean makefile together with some pre-written config.h where the
>>>> system-dependent definitions are visible.
>>>
>>> I cannot fathom why anyone would expect a from-scratch build system
>>> written by such developers to feature better support for cross
>>> compilation.

>> Perhaps because a build system that doesn't try to autoguess the
>> values of system-dependent things, doesn't break when the build
>> machine and target machine are different?
>> If you're cross-compiling, and the package uses autoconf, how's it
>> going to find out the values it normally gets by compiling and running
>> test programs?

>
> Autoconf doesn't normally try to run test programs. Most of the
> time, it only tries to compile and link them. In the cases where
> running is desirable, normally one makes the Autoconf tests take
> a conservative guess at the result.


The conservative guesses are often stupidly inappropriate.

A much better response would be to fail the configure with a big message
at the end like this:

*** The program can't be configured because it is being cross-compiled,
*** and so some of the tests which run test programs cannot be executed.
*** Here are all the variables whose values you need to export to do
*** the right thing (so you don't have to read the configure script):
*** ac_cv_blah=yes # this indicates the target has blah support
*** ac_cv_blorch=yes # the target has blorch support

What happens instead is the program configures successfully anyway. You have
to read the output of configure, spot the problems, gather them into a list,
and then dig through the configure script to find out what variables to
manipulate to get a proper build.
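
For what it's worth, autoconf does provide the mechanism half of this: any
ac_cv_* cache variable can be preset on the configure command line, which
skips the corresponding test. Using the made-up variable names from above:

```sh
# Cross-compiling: preset cache variables so configure does not fall
# back to its conservative (often wrong) guesses. The variable names
# are hypothetical; the real ones have to be dug out of the configure
# script, which is exactly the complaint.
./configure --host=ppc-XXX-linux \
    ac_cv_blah=yes \
    ac_cv_blorch=yes
```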

Another better response, rather than failing, would be to try to make better
use of available information.

For instance, suppose that the --target is specified as a tuple which encodes
not only the fact that we are compiling for MIPS or ARM, but also that the OS
is linux. The config system has no excuse for not taking advantage of this
little fact that was given to it as input. If you know that the OS is Linux,
then you know, for instance, that the target OS has POSIX job control, so you
don't have to bother testing for it, and you certainly don't need to assume
that it's not there.

Yet the job control test in GNU Bash's configure script defaults to "no"
if it is not able to run the configure test. Linux or not, it will build
a bash that has no Ctrl-Z, and no "jobs", "fg" or "bg".

A couple of years ago, Wind River was shipping a compiled bash with no job
control because of this issue. When cross-compiling bash, they just trusted
that configure had done the job, because it ran successfully and an
executable popped out of make.

It's very hard to test for every single thing like this across an entire OS.
You never know where something is crippled because a cross-compiling test
was *silently* not run and a poor-quality default substituted in its place.
 
Seebs
01-18-2012
On 2012-01-17, Nick Bowler <(E-Mail Removed)> wrote:
> I cannot fathom why anyone would expect a from-scratch build system
> written by such developers to feature better support for cross
> compilation.


Well, in practice:
Autoconf can in theory support cross compilation, but you basically have to know
you want to support that or else you will end up with a ton of stuff which doesn't
work as expected. While the package as a whole has hooks, many specific tests
don't, and users often end up writing their own tests which don't know about cross
compilation.

A well-written app with a config.h will usually have the right pieces exposed
so that we can write in the correct values, generate a patch creating that
config.h, and have the build then Just Work. So in practice my experience has
been that, for a whole lot of packages, a developer completely unaware of the
concept of cross-compilation will leave me with less hassle than an
autoconf-based app, even in some cases one which makes an effort to support
cross-compilers.
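
A hedged sketch of what such a hand-maintained config.h looks like (the
macro names are typical of the genre, not from any particular package):

```c
/* config.h -- hand-maintained, illustrative. Every system-dependent
 * decision is visible here and can be edited for a new target before
 * building; nothing is guessed at configure time. */
#ifndef CONFIG_H
#define CONFIG_H

#define HAVE_SNPRINTF 1      /* target libc provides snprintf() */
#define HAVE_SYS_MMAN_H 1    /* <sys/mman.h> is available */
#undef  HAVE_STRLCPY         /* e.g. glibc targets: no strlcpy() */
#define WORDS_BIGENDIAN 1    /* e.g. for a big-endian PPC target */

#endif /* CONFIG_H */
```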

This is all subject to rapid change. Back when I first tried to use autoconf for
cross-compilation, the built-in type size testing didn't work for cross compiles,
because it depended on compiling and running test applications... A package which
is being used by embedded developers, and whose maintainers are friendly to
submitted patches, will usually be fine by now.

To put it in perspective, I had less trouble porting pseudo from Linux to OS X
than I have had getting some apps to cross-compile for a different CPU type.

-s
--
Copyright 2011, all wrongs reversed. Peter Seebach / (E-Mail Removed)
http://www.seebs.net/log/ <-- lawsuits, religion, and funny pictures
http://en.wikipedia.org/wiki/Fair_Game_(Scientology) <-- get educated!
I am not speaking for my employer, although they do rent some of my opinions.
 
Joachim Schmitz
01-18-2012
io_x wrote:
> "Ben Pfaff" <(E-Mail Removed)> wrote in message
> news:(E-Mail Removed)...

<snip>
>
> after saying that, I killfile you and Ian Collins


They will be deeply impressed.
 
Stephen Sprunk
01-18-2012
On 18-Jan-12 00:26, io_x wrote:
> "Ben Pfaff" <(E-Mail Removed)> wrote in message
> news:(E-Mail Removed)...
>> ec429 <(E-Mail Removed)> writes:
>> A competent C programmer knows how to write C programs correctly,
>> a C expert knows enough to argue with Dan Pop, and a C expert
>> expert knows not to bother.

>
> and where are the ones, the people who claim that not even
> the expert expert can write C programs 100% correctly?
> [only assembly programmers could write them, after extensive testing]
>
> do you call them not expert? right?


That is the logical conclusion. If a competent programmer knows how to
write programs correctly, then anyone who does not know how to write
programs correctly cannot be a competent programmer.

However, programmers are human and will make mistakes; the difference is
that competent ones recognize and correct their mistakes, whereas
incompetent ones often will not. In my experience, even when those
mistakes are pointed out by a competent programmer, the incompetent
programmer tends to defend his mistakes rather than correct them.

S

--
Stephen Sprunk "God does not play dice." --Albert Einstein
CCIE #3723 "God is an inveterate gambler, and He throws the
K5SSS dice at every possible opportunity." --Stephen Hawking
 
James Dow Allen
01-18-2012
Question is: Will you port the program to just
3 or 4 architectures, or to dozens of different
architectures? Hand-steering the process 2 or 3
times is more reliable than a full-featured config
setup (and probably easier to do bug-free).

When I needed self-configured architecture-dependent
code, I just used ordinary 'make' with dependent
executables run to output tailored '.h' files.

* * * *

On another topic:
Is there a thread to choose the best c.l.c expert/poster?
My nominees are Eric Sosman and Ben Pfaff.
This makes the following comment seem rather ... odd.


> On Jan 18, 1:26 pm, "io_x" <(E-Mail Removed)> wrote:
> "Ben Pfaff" <(E-Mail Removed)> wrote in
> > A competent C programmer knows how to write C programs correctly,
> > a C expert knows enough to argue with Dan Pop, and a C expert
> > expert knows not to bother.

>
> after saying that, I killfile you and Ian Collins
> ... because I am not interested in wrong answers


Jamesdowallen (c/o Gmail)

 
Seebs
01-18-2012
On 2012-01-18, Kaz Kylheku <(E-Mail Removed)> wrote:
> A couple of years ago, Wind River was shipping a compiled bash with no job
> control because of this issue. When cross-compiling bash, they just trusted
> that configure had done the job, because it ran successfully and an
> executable popped out of make.


It's more subtle than that, apparently, and was most recently fixed in January
of 2007. Looking at it, it appears that all the standard stuff in bash's
configure uses a value bash_cv_job_control_missing, which would normally be
set to either "present" or "missing". We had a build file which was setting it
to "no", but was not setting it in the right way; it looks to me as though
the intent had been the logical equivalent of:
bash_cv_job_control_missing=no configure <args>
but it ended up working as though it were:
bash_cv_job_control_missing=no
configure <args>
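
(For anyone who hasn't been bitten by this shell gotcha: `VAR=value cmd`
puts VAR into cmd's environment for that one invocation, while `VAR=value`
on a line by itself sets an unexported shell variable that a separately
started child process never sees. A quick demonstration, with a made-up
variable name:)

```sh
# One-shot assignment: MYVAR is placed in the child's environment.
one=$(MYVAR=hello sh -c 'printf "%s" "$MYVAR"')

# Separate assignment: MYVAR is a plain (unexported) shell variable,
# so the child process sees nothing.
MYVAR=hello
two=$(sh -c 'printf "%s" "$MYVAR"')

echo "one-shot:  [$one]"     # prints: one-shot:  [hello]
echo "separate:  [$two]"     # prints: separate:  []
```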

So it wasn't a matter of trusting configure; it was a matter of not being
careful enough to verify that the fix was still working. It had originally
been fixed back in 2006; then a change in July of 2006 moved the fix to a new
location as part of a version uprev, and it stopped working -- this slipped
through the cracks for about 5 months.

... this has basically nothing to do with C, but I was too curious not to look.

-s
--
Copyright 2011, all wrongs reversed. Peter Seebach / (E-Mail Removed)
http://www.seebs.net/log/ <-- lawsuits, religion, and funny pictures
http://en.wikipedia.org/wiki/Fair_Game_(Scientology) <-- get educated!
I am not speaking for my employer, although they do rent some of my opinions.
 
Kaz Kylheku
01-18-2012
On 2012-01-18, Seebs <(E-Mail Removed)> wrote:
> On 2012-01-18, Kaz Kylheku <(E-Mail Removed)> wrote:
>> A couple of years ago, Wind River was shipping a compiled bash with no job
>> control because of this issue. When cross-compiling bash, they just trusted
>> that configure had done the job, because it ran successfully and an
>> executable popped out of make.

>
> It's more subtle than that, apparently, and was most recently fixed in January
> of 2007. Looking at it, it appears that all the standard stuff in bash's
> configure uses a value bash_cv_job_control_missing, which would normally be
> set to either "present" or "missing". We had a build file which was setting it
> to "no", but was not setting it in the right way; it looks to me as though
> the intent had been the logical equivalent of:
> bash_cv_job_control_missing=no configure <args>
> but it ended up working as though it were:
> bash_cv_job_control_missing=no
> configure <args>


Ah, let me guess, missing backslash?

Thanks for providing me the necessary emotional closure on this one, haha.
 