getting memory errors when natively building gnu gcc 3.3.2 compiler on PQ2FADS-VR with 32 MB DRAM memory

Kai Ruottu
Tue Oct 19 07:34:00 GMT 2004

Lennert Buytenhek wrote:

> On Sun, Oct 17, 2004 at 11:19:51AM -0400, Povolotsky, Alexander wrote:
>>Out of Memory: Killed process 9807 (cc1).
>>powerpc-linux-gcc-3.3.2: Internal error: Terminated (program cc1)
> You ran out of memory, and the kernel had to kill your cc1 process.
> Nothing you can do about that really, apart from adding memory or
> adding swap.

  The amount of memory required to run software seems to follow something
like "The Law of More"... In the early '80s 64 kbytes was enough, in the late '80s
640 kbytes, in the early '90s 6.4 Mbytes, in the late '90s 64 Mbytes, and now in
the early 2000s it seems to be 640 Mbytes...

  Using 'top -b' during the GCC build, grep'ing the 'cc1' process values from
the logfile, and then sort'ing them, I got the following maximum memory use
when gcc-3.2.3 was building the gcc-3.2.3 sources:

  3747 root      25   0 24436  23M  3856 R    99,6  4,7   1:31 cc1
  3747 root      25   0 24012  23M  3744 R    99,6  4,6   1:21 cc1
  3747 root      25   0 23032  22M  3796 R    99,6  4,4   1:26 cc1
  3747 root      25   0 22804  22M  3744 R    99,5  4,4   1:16 cc1

  So 23 Mbytes was required for this compile. The newer gcc-3.3.5, compiling
the gcc-3.3.5 sources, needed something much bigger:

26227 root      25   0  114M 114M  3804 R    99,3 22,7   1:25 cc1
26227 root      25   0  113M 113M  3852 R    99,4 22,6   1:30 cc1
26227 root      25   0  113M 113M  3776 R    99,4 22,6   1:20 cc1
26227 root      25   0  108M 108M  3688 R    99,3 21,6   1:15 cc1

  This was tested on my RedHat 8.0 build system, with a cross-toolchain whose
host was RedHat 7.3 and whose target was 'i486-linux-gnu'.
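
  The measurement recipe above can be sketched as a small shell pipeline. The
'top.log' below is a fabricated stand-in for real 'top -b' output (its column
layout follows the samples above); the file name and sampling interval are my
own assumptions, not anything mandated by top:

```shell
# While the build runs, one would capture top in batch mode, e.g.:
#   top -b -d 5 > top.log &
# Here we fake a tiny top.log so the pipeline itself can be demonstrated:
printf '%s\n' \
  ' 3747 root 25 0 22804 22M 3744 R 99.5 4.4 1:16 cc1' \
  ' 3747 root 25 0 24436 23M 3856 R 99.6 4.7 1:31 cc1' \
  ' 3747 root 25 0 23032 22M 3796 R 99.6 4.4 1:26 cc1' > top.log

# Keep only the cc1 lines, sort numerically on the SIZE column (field 5),
# largest first, and print the peak sample:
grep ' cc1$' top.log | sort -k5,5 -rn | head -1
```

The peak line printed (SIZE 24436, i.e. about 23 Mbytes resident) is the
number quoted in the text as the memory needed for the compile.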

  So with gcc-3.2.x, having 64 Mbytes or even only 32 Mbytes on the build
system could be enough, but this is no longer the case with gcc-3.3.x. One can
only guess what gcc-3.4.x and the under-construction gcc-4.x will then
require... Maybe the current 512 Mbytes in my build system isn't enough.

  If one thinks gcc-3.2.x is good enough, then maybe something can be done
about this issue... Is there some serious problem with gcc-3.2.x and
Linux/PPC?  RedHat 8.0 and, AFAIK, RedHat 9.0 came with gcc-3.2.x, maybe SuSE
9.0 etc., so on x86 there seemingly were no problems...

  If one has only 32 or 64 Mbytes on the target board but wants to try building
GCC natively, maybe the general advice would be to forget gcc-3.3.x and newer
GCCs and use some earlier GCC with its much smaller memory requirements...
I had suspected something like this when things suddenly didn't seem to work
on the old 64 Mbyte PCs... In the late '80s, having 5 + 8 = 13 Mbytes in a VAX
was really a lot of memory, and having 64 Mbytes in a PC, and so being able to
do things one could earlier do only on mainframes, was something newsworthy
(some Finnish agricultural research organization moved their statistical jobs
from a mainframe to a PC...). But is this current, or "Real Soon Now", need
for 640 Mbytes of memory for building GCC really what GCC users will want, or
only what the memory makers will want?

  I really thought it should still be possible to build GCC (in reasonable
time) on those 5 - 10 year old PCs with only 64 Mbytes of memory (4 x 16 MB
SIMMs)...

Cheers, Kai
