Leading character for BINARY format?
Ian Lance Taylor
ian@zembu.com
Mon Apr 17 11:22:00 GMT 2000
Date: Mon, 17 Apr 2000 11:15:44 -0700
From: Nick Clifton <nickc@cygnus.com>
A customer reported a problem when using objcopy to convert from
BINARY to ELF format. The problem was that the target ELF format
did not use an underscore prefix for user symbols, but the symbols
generated by the BINARY code did. As a result, they ended up with an
ELF executable whose user symbols had an underscore prefix.
Using the '--remove-leading-char' option to objcopy did not work
because the binary format does not define a leading character.
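For concreteness, these are the symbols the BINARY backend generates
for an input file foo.bin (the exact names depend on the input file
name; non-alphanumeric characters such as the period are mangled to
underscores), and how C code on an underscore-less ELF target ends up
having to declare them:

    /* Symbols emitted by something like
         objcopy -I binary -O <elf-target> foo.bin foo.o
       On an ELF target with no leading-underscore convention the C
       code has to write the underscore explicitly: */
    extern char _binary_foo_bin_start[];
    extern char _binary_foo_bin_end[];
    extern char _binary_foo_bin_size[];  /* absolute symbol; only its
                                            address is meaningful */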
The patch below solves this problem for the customer, but I am not
entirely sure whether it might have some unexpected side effects.
All the patch does is change the specification of the leading
character for the BINARY format from nothing to an underscore, so
that converting from BINARY to (ARM) ELF with --remove-leading-char
will now strip the first character.
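As a rough sketch of the mechanism (not the actual objcopy source),
--remove-leading-char strips a symbol's first character only when it
matches the leading character the input format defines, so with
BINARY's leading character left as 0 the option had nothing to strip:

    #include <stdio.h>

    /* Sketch of the --remove-leading-char behaviour described above:
       strip the first character only if the input format defines a
       nonzero leading character and the name starts with it.  */
    static const char *
    remove_leading_char (const char *name, char leading_char)
    {
      if (leading_char != '\0' && name[0] == leading_char)
        return name + 1;   /* format defines a leading char: strip it */
      return name;         /* no leading char defined: name unchanged */
    }

    int
    main (void)
    {
      /* BINARY's old leading character of 0: underscore survives.  */
      printf ("%s\n", remove_leading_char ("_binary_foo_bin_start", '\0'));
      /* With the patched '_': underscore is stripped.  */
      printf ("%s\n", remove_leading_char ("_binary_foo_bin_start", '_'));
      return 0;
    }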
I don't wholly understand. What does it mean to say that the symbols
generated by the binary code use an underscore prefix? No matter what
the symbols look like, the user should be able to refer to them
directly, using an appropriate number of underscores, whatever that
number is. I don't think it is wrong for BFD to use leading
underscores for these symbols, since they are system generated. I
don't understand why anything has to change in BFD.
There is a known problem that the symbols appear differently when
referred to from a.out C code and ELF C code, in that a.out C code
expects a leading underscore and ELF C code does not. That means that
the same code doesn't work for both a.out and ELF, and the user must
use some sort of #ifdef. Is that the issue here?
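To make that concrete, here is a sketch of the kind of #ifdef I mean,
assuming the standard _binary_* names for a file foo.bin and GCC's
__ELF__ macro to distinguish the targets:

    /* An a.out C compiler prepends '_' to every C identifier, so the
       plain name reaches the object-file symbol
       "_binary_foo_bin_start"; an ELF C compiler prepends nothing, so
       the underscore must be written out in the source.  */
    #ifdef __ELF__
    extern char _binary_foo_bin_start[];
    #define FOO_START _binary_foo_bin_start
    #else /* a.out-style leading underscore */
    extern char binary_foo_bin_start[];
    #define FOO_START binary_foo_bin_start
    #endif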
Ian