[crosstool-NG] Design discussion

Rob Landley rob@landley.net
Tue Apr 7 07:51:00 GMT 2009


On Monday 06 April 2009 15:39:19 Yann E. MORIN wrote:
> On Monday 06 April 2009 00:45:30 Rob Landley wrote:
> > A) Your definition of "done" does not involve having actually configured
> > or built a toolchain yet.  This "done" is equivalent to "I just extracted
> > the tarball" in any other project.
>
> Which is about to be true! ;-)
>
> > B) What other project has both ./configure and menuconfig?  Generally,
> > you have one or the other, not both.
>
> Except they do not serve the same purpose!

Agreed, I just find it a bit disturbing that there's so much configuration 
that you need two complete configuration systems.

> - ./configure is to configure the crosstool-NG "program"
>
> - "menuconfig" is the same as if you fired vi and edited some variables.
>   Except that it tries to provide a more user-friendly interface.

The distinction between the crosstool-NG "program" and actually using 
crosstool to build something is one of the things I still don't see the 
reason for.

As for what the first ./configure currently does, the "is everything 
installed" tests could just as easily go in ct-ng itself.  If you really 
need some of the output in a Makefile, you could generate a "Makefile.inc" 
snippet or some such that only gets regenerated (and thus the tests re-run) 
when it's not there.
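
Something like this untested sketch is what I have in mind (the "paths.mk" 
name and the specific variables are just made up for illustration):

  # in the ct-ng wrapper script, before invoking make
  if [ ! -f paths.mk ]; then
      # re-run the "is everything installed" tests only when the cached
      # results are missing, and save them where make can include them
      awk_path="$(command -v gawk)" || { echo "gawk not found" >&2; exit 1; }
      bison_path="$(command -v bison)" || { echo "bison not found" >&2; exit 1; }
      {
          echo "awk   := ${awk_path}"
          echo "bison := ${bison_path}"
      } > paths.mk
  fi

The top-level Makefile would then just "include paths.mk", and deleting 
that one file is how you force the tests to re-run.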

As for the rest of it, I don't see why you can't skip straight to running 
ct-ng out of a freshly extracted tarball without having to 
do "./configure --local; make; make install" first.

> > C) I didn't check it out of svn.  (I don't want to start out with a
> > random svn snapshot, because I have a knack for picking broken ones.  I
> > want to first try a release version with known bugs, that presumably
> > worked for somebody at some point.)
>
> Agreed. Now, the 1.3 series is 4 months old, and the trunk has added quite
> a number of enhancements, although not the ones _you_ would expect.

Possibly it's unfair to critique it then.

> > E) Why isn't that the default?  (Why even _have_ --local be an option?)
>
> Well, I meant it to be installable for two main reasons:
>
> - share the same crosstool-NG so that users on the same system can build
>   different toolchains for different (or identical) platforms, depending
>   on their needs. We have a build machine at work, where users log in to
>   build their software, and some have the right (project-wise) to build
>   their own toolchains (notably when a new platform arrives).

They can't extract a copy into their home directory?  (You already have "local 
tarballs directory", which is presumably set per .config since menuconfig 
edits it.  So either they're downloading their own copies of all the source 
packages or they have to set all their projects to use the shared tarballs 
directory...)

I'm a bit confused by the need for special infrastructure so that multiple 
users can each build their own copy from source.  Don't all source tarballs 
work that way?  What resources are actually shared?

> - make it packageable by any distribution (I know that it's been in the
>   OpenSUSE factory for a while now, even if it's not in the main distro)
>   I'm planning to make a Debian package (in fact two: one with the core,
>   named something like ct-ng-[version]-noarch.deb, and one with the patchsets,
>   named ct-ng-data-[version]-noarch.deb)

Again, confused.  There are .rpm and .deb packages of every source package out 
there, including the ones that don't support building out of tree.  And there 
are .srpms of things that just come in normal source tarballs, which would 
be "most of them"...?

How does this apply?

> > F) Running ./configure with no arguments died because awk doesn't call
> > itself gawk, and that's where I punted until I felt more awake.  The
> > "fight with getting it installed again" was:
> >
> >   1) Run ./configure, watch it die, install gawk.
> >   2) Run ./configure, watch it die, install bison.
> >   3) Run ./configure, watch it die, install flex.
> >   4) Run ./configure, watch it die, try to install makeinfo, realize that
> > ubuntu hasn't got a package called makeinfo, install its suggestion
> > of "texi2html", find out that ./configure still claims it hasn't got
> > makeinfo, do "aptitude search makeinfo" and come up with nothing, think
> > for a bit, remember that ubuntu has a fiddly thing its shell does when you
> > mistype a command to suggest some random package you could install that
> > will give you an "sl" command, type makeinfo at the command line, have it
> > suggest texinfo, install texinfo.
> >   5) Run ./configure, watch it die, install automake.
> >   6) Run ./configure, watch it die, install libtool.
>
> So you're suggesting that I continue with the checks, marking all missing
> tools, reporting those at the end, and then aborting. Right?

Well, what I'd suggest is that most of those checks are for things the build 
doesn't actually seem like it should _need_.  (You're installing automake but 
not autoconf?  Yet you have a menuconfig option to let the config.guess stuff 
be overridden?  Um... ok?)

But if you're going to do them, then yes, it would be nice to get the full 
list of things to install all at once.  Just a little niceness for people 
like me trying to install it for the first time.
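
Rough sketch of what I mean (tool list abbreviated, obviously):

  missing=""
  for tool in gawk bison flex makeinfo automake libtool; do
      command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
  done
  if [ -n "$missing" ]; then
      echo "Error: please install the following missing tools:$missing" >&2
      exit 1
  fi

One failed run, one apt-get line, done.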

> No ./configure I know of behaves like this. So I was quite dumb, and
> followed the herd. Thanks to you, a few steps further, I had fallen over
> the bridge...
>
> > And _then_ it tells me it wants to install in /usr/local, which I can
> > work around with --prefix=`pwd` or its synonym --local,
>
> *NOOOO!!!*** --prefix=`pwd` is *not* the same as --local! Argghh...

Why not?  (I honestly don't know.  Seemed the same to me...)

> > except I did `pwd`/subdir
> > because I wanted to see what it was actually installing.
> >
> > The first thing I'd like to say about the prerequisite packages in step F
> > is that on this laptop I've already built my Firmware Linux system for
> > armv4l, armv5l, armv4eb, mips, mipsel, powerpc, powerpc-440, x86, x86-64,
> > sparc, sh4, m68k, and a couple of hardware variants like the wrt610n and
> > User Mode Linux. I didn't need _any_ of those packages to build binutils,
> > gcc (with g++), uClibc, uClibc++, busybox, make, bash, distcc, and the
> > linux kernel.
> >
> > You should never need the name "gawk", the SUSv4 name is awk:
> >   http://www.opengroup.org/onlinepubs/9699919799/utilities/awk.html
> >
> > And gawk installs such a symlink.  The FSF ./configures will detect "awk"
> > and use it.  The version Ubuntu installs by default is just "awk", the
> > version busybox provides is also "awk", and they work fine.
>
> No, they don't for me. I'm using GNU extensions. I know it's bad. They are
> going away...

*shrug*  Ok.  (I just knew that the packages you were building didn't need 
them; it didn't occur to me your scripts would.  My bad.  Not a lot of people 
make extensive use of awk anymore...)
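
For what it's worth, if you're stuck needing the GNU extensions, a check 
along these lines (a sketch, not tested against the busybox/mawk corner 
cases) would at least accept a gawk that's only installed under the name 
"awk":

  # accept plain "awk" only if it really is GNU awk, else look for gawk
  if awk --version 2>/dev/null | grep -q "GNU Awk"; then
      AWK=awk
  elif command -v gawk >/dev/null 2>&1; then
      AWK=gawk
  else
      echo "GNU awk is required (the scripts use GNU extensions)" >&2
      exit 1
  fi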

> > The FSF packages ship with cached lex and yacc output, same for autoconf.
> >  It won't regenerate those even if the host packages are there, but will
> > use the cached versions.  The Linux kernel also has _shipped_ files for the few
> > things that involved yacc (mainly the menuconfig stuff), and I believe
> > that actually won't regenerate those unless told to do so manually.
>
> Ah, but there is a bug in either Gentoo *or* one of the MPFR tarballs. It
> works great on my Debian. I have had reports it was successful on Fedora.
> I have seen it work seamlessly on OpenSUSE. It broke under Gentoo:
>     http://sourceware.org/ml/crossgcc/2008-05/msg00080.html
>     http://sourceware.org/ml/crossgcc/2008-06/msg00005.html

So it's a workaround for a bug in gentoo (overly aggressive "sanity check") 
and the workaround impacts all platforms.  Not an optimal solution, but at 
least an understandable one.
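
If the root cause really is just make deciding the shipped files are stale, 
the other way out (instead of requiring the whole autotools/makeinfo stack) 
is to fix up the timestamps right after extracting the tarball.  Something 
like this sketch, where the file list is from memory and per-package; the 
point is just that generated files end up at least as new as their inputs:

  cd mpfr-*/ || exit 1
  # mark the pre-generated files as up to date so make won't try to
  # re-run aclocal/automake/autoconf/makeinfo on them
  touch aclocal.m4 configure
  find . -name 'Makefile.in' -exec touch {} +
  find . -name '*.info' -exec touch {} +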

> > The "info" documentation format is obsolete.  It was the FSF's attempt to
> > replace man pages 20 years ago, and it utterly failed.  There was zero
> > uptake outside the FSF, partly because their interface was based on
> > "gopher" which lost out to the web circa 1993.
>
> I don't care. Some components will not build without it, they want to
> update their documentation, and I can't spend time on fixing those
> suckers. So requiring makeinfo is easier than trying to do without it.

Which ones?  (I've never hit that.)

> > To make a long story short (too late!), everything that uses makeinfo
> > uses autoconf, and will skip it if it isn't there.
>
> Not true in practice.

I'll take your word you hit bugs, but I am curious what they are and how hard 
patching them would be.
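
Failing that, the blunt instrument I'd try before patching anything is 
overriding $MAKEINFO on the make command line so the documentation targets 
become no-ops.  That only works if the package's generated makefiles honor 
the variable (most autoconf-based ones do), so treat it as a sketch:

  # /bin/true "succeeds" at regenerating the .info files without makeinfo
  make MAKEINFO=true
  make install MAKEINFO=true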

> > I mentioned libtool last time.
>
> I already answered to that one.
>
> > One way to make cross compiling easier is
> > to _uninstall_ things like libtool so they can't screw stuff up.
>
> But I can't demand that the end user remove packages from his/her
> machine that might be useful to him/her!

That's why I trimmed the $PATH. :)
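
Concretely, it's roughly the following (the tool list here is illustrative, 
the real one is per-project):

  # build a directory of symlinks to just the host tools the build is
  # allowed to see, then make that directory the entire $PATH
  mkdir -p "$PWD/host-tools"
  for tool in sh bash cc gcc g++ ld as ar nm ranlib strip make sed awk \
              grep sort find xargs tar gzip bzip2 patch install; do
      src="$(command -v "$tool")" && ln -sf "$src" "$PWD/host-tools/$tool"
  done
  export PATH="$PWD/host-tools"
  # anything that wasn't symlinked in (libtool, for instance) just isn't found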

Your approach is more conventional, but you're also operating under the 
constraint of trying to support a big polynomial: X*Y*Z different possible 
combinations of package selections and package versions for different 
target platforms, which you can't possibly hope to test all of.  And the 
way you've set it up, you don't yet know what today's combination will be 
when you run your environment checks, so you can't even tag "package 
version x requires automake" if you wanted to get more granular.  You have 
to support every possibility up front, so requiring a conservative 
environment in which you can build every possible combination makes sense 
for your build system.

Listing all missing packages at once would be a great convenience and give a 
better first impression.

> Well, you could reverse the argument by saying I can't force him/her to
> install stuff they don't need. But in that case they won't be able to use
> crosstool-NG. Unless they come up with a change-request and a proper patch.
>
> After all, if you want to compile some software, you'll need a compiler.
> And if you don't want to install that, then you won't be able to compile
> your software.

You do have to start somewhere.

> Regards,
> Yann E. MORIN.

Rob
-- 
GPLv3 is to GPLv2 what Attack of the Clones is to The Empire Strikes Back.
