[crosstool-NG] Design discussion
Rob Landley
rob@landley.net
Thu Apr 9 01:13:00 GMT 2009
On Wednesday 08 April 2009 05:22:09 Ladislav Michl wrote:
> On Tue, Apr 07, 2009 at 07:29:08PM -0500, Rob Landley wrote:
> > On Tuesday 07 April 2009 07:39:19 Ladislav Michl wrote:
> > > PTXdist (and this project is worth checking out as well;
> > > http://www.pengutronix.de/software/ptxdist/index_en.html). And without
> > > looking at crosstool-ng source too closely, it looks like ./configure
> > > and menuconfig are used in the same way. So ./configure && make is used
> > > to build the tool itself and menuconfig serves to configure toolchain
> > > options. Doesn't it seem reasonable enough solution?
>
> [snip]
>
> > Ok, "./configure --prefix=`pwd`/tmpdir" and then "make install" into
> > that... and it just copied the patches directory it insisted I untar into
> > its source directory. Went into tmpdir, ran bin/ptxdist, and it spit out
> > 68 lines according to wc.
> >
> > Ok, this is another package doing this, but that doesn't make it right.
> > Why on _earth_ would you need to install source code before you're
> > allowed to configure and compile it? The only thing it actually built
> > when I did the first "make" was the kconfig binaries. The kernel (where
> > kconfig comes from) does not require you to make and install kconfig
> > before using menuconfig. Neither do long-term users of it like busybox or
> > uClibc...
>
> You do not need to install PTXdist anywhere to start using it.
Uh-huh. Starting from a fresh tarball...
$ make menuconfig
make: *** No rule to make target `menuconfig'. Stop.
$ bin/ptxdist menuconfig
ptxdist: error: PTXdist in /home/landley/ptxdist-1.0.2 is not built.
$ make
make: *** No targets specified and no makefile found. Stop.
$ ./configure
checking for ptxdist patches... yes
checking for gcc... gcc
...
ptxdist version 1.0.2 configured.
Using '/usr/local' for installation prefix.
# Try crosstool-ng's way?
$ ./configure --local
configure: error: unrecognized option: --local
Try `./configure --help' for more information
It seems more accurate to say there might be a non-obvious workaround for the
need to install it before using it.
> The install part is
> optional, just in case you want to distribute it as a binary tarball to your
> colleagues
Isn't what we downloaded already a tarball? The only "binary" part seems to
be the kconfig binaries...
> or make (for example) a Debian package. However, I have to admit
> that it is non-obvious. PTXdist's kconfig is hacked a bit to handle
> dependencies, so if you want to express openssh's dependency on openssl you
> do so in the Kconfig file only.
Doesn't kconfig normally track and enforce dependencies? I thought that was
one of its main functions...
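For the record, stock kconfig already lets you express exactly that kind of
dependency directly in a Kconfig file. A minimal sketch (package names
hypothetical, not from PTXdist's actual tree):

```kconfig
config OPENSSL
	bool "openssl"

config OPENSSH
	bool "openssh"
	select OPENSSL
	# 'select' forces openssl on whenever openssh is enabled;
	# alternatively 'depends on OPENSSL' would hide openssh in
	# menuconfig until openssl has been selected.
```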
> The 'ptxdist' script could probably do that as
> well, but unless you hack on PTXdist itself you are expected to run
> ./configure && make only once.
You don't have to install the linux kernel source code before building a Linux
kernel, and the kbuild infrastructure is more complicated than some entire
embedded operating systems I've seen. The argument "but multiple people may
want to share a patched version to build for several different targets" seems
to apply just as much to the kernel as to a cross toolchain, yet they
never had a need for this extra step. (You _can_ extract the tarball
in /usr/src/linux and then build out of tree...)
I don't remember uClinux, openwrt, or buildroot expecting you to configure,
make, and install them before being allowed to run menuconfig to set up the
actual build they do. (Admittedly I haven't poked at any of them in the past
week, so maybe I'm forgetting something...) They do the same "download lots
of third party tarballs, compile them resolving dependencies between them,
and integrate the result into a coherent whole" task...
I'm just wondering where this "source code should be installed before you can
compile it, just extracting a tarball isn't good enough, you need to run
configure twice and make twice" meme came from. I still don't understand the
supposed advantage...
> > Getting back to your original point, "building the tool itself" in this
> > context apparently means building the menuconfig binary, because
> > bin/ptxdist itself is a bash script. I have no idea what the first
> > ./configure; make; make install cycle is expected to accomplish. (Other
> > than hardwiring in absolute paths you can detect at runtime with the pwd
> > command or perhaps by some variant of readlink -f "$(which "$0")" if you
> > want to be _more_ flexible than the make command...)
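To spell out that runtime-detection idea: a minimal sketch, assuming GNU
readlink with the -f option is available (the fallback covers systems where
it isn't):

```shell
# Work out where this script actually lives at runtime, following
# symlinks, instead of hardwiring an absolute prefix at ./configure
# time.  Assumes GNU readlink -f; falls back to $0 as given.
self="$(readlink -f "$0" 2>/dev/null || echo "$0")"
topdir="$(dirname "$self")"
# patches/, rules/, etc. can now be found relative to $topdir
# rather than relative to a configure-time install prefix.
echo "Running from: $topdir"
```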
>
> There is nothing hardwired. ./configure checks prerequisites (and searches
> for a curses library). Of course it would be nice to have as few
> prerequisites as possible, and this is limited by the amount of human
> resources. Once upon a time the idea was to let PTXdist build its
> prerequisites on its own as host tools, but that has its own set of
> problems.
Tell me about it. (Took me almost two years to make everything work right...)
I understand why crosstool-ng is installing prerequisites now. It's aimed at
performing almost a forensic task: reproducing the exact toolchain that was
used to build an existing binary root filesystem, which some vendor somewhere
had but didn't bother to ship, so you have to reverse engineer it. In that
context, the set of environmental dependencies your build requires really
_can't_ be contained, because you don't really know at ./configure time what
they'll _be_ and there are too many different upstream package versions to
police closely and try to clean up yourself.
This is a task I'm happy to leave to Yann. It's way too fiddly for me... :)
> > Will I someday have to compile and install makefiles before I can build
> > the package they describe? I do not understand what these extra steps
> > accomplish...
>
> See above.
I read it, I just don't _get_ it.
> Anyway, this is starting to be off topic, so in case you want anything to
> be improved (and you made a few valid points here), feel free to start
> another thread called, for example, "Why PTXdist sucks" (such subjects tend
> to attract attention) to keep this one from being polluted.
Oh I think all modern software sucks. My Monday blog entry was, just for the
exercise, about why my _own_ build system sucks:
http://landley.net/notes.html#06-04-2009
(And that's by no means even close to a complete list, that was just off the
top of my head in the five minutes I was willing to spend typing about it.)
Keep in mind that software in general is still darn new and completely
primitive. The original IBM PC was our industry's "Model T", and that
analogy says we have yet to even invent the 3-point seat belt.
The Model T's production run ended in 1927, the IBM PC was 1981, which means
we're currently circa 1955 or so. Google for "1950s automobile pictures" some
time, then tell me
that everything we've currently got hasn't got _huge_ room for
improvement. :)
I'm trying to ask the "Do we really _need_ fins? What about gas mileage? Is
lead really _necessary_ in gasoline?" type questions. Even if everything
(including the stuff I've written) currently gets this sort of thing wrong,
it should still be possible to do _better_...
Rob
--
GPLv3 is to GPLv2 what Attack of the Clones is to The Empire Strikes Back.
--
For unsubscribe information see http://sourceware.org/lists.html#faq