Should GCC tell GDB about its optimizations?
Stan Shebs
shebs@apple.com
Fri Mar 3 15:58:00 GMT 2000
I'm sure this subject has come up before, but I don't remember any
consensus...

In the process of finishing Mac OS X, Apple has hundreds of engineers
beating on GCC and GDB every day. The system has literally hundreds
of subcomponents; everything from the kernel to speech output to Hebrew
font support (sort of like a Linux system preloaded with everything
you can find on freshmeat :-) ). By default, everything is built with
optimization turned on. While an engineer can turn off optimization
for a specific bit of code, it's not practical to build an entire system
with optimization turned off. So in practice any single program consists
of a combination of optimized and unoptimized code.

Ideally, of course, GCC would issue lots of amazingly detailed debug info,
and GDB would use it to reconstruct and report program state just as the
programmer expects to see it. But today, the result is just lame; hackers
trying to debug get lots of squirrelly behavior from GDB. The problem is
that they don't know whether the randomness is due to bugs in the program,
or to the effect of the optimizer. So the suggestion came up to have GCC
issue debug info stating what optimizations have been applied to a file,
and to have GDB report that information per-function, so that users could
lower their expectations appropriately.
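
To make the confusion concrete, here's a tiny made-up example (the
names are invented for illustration, not taken from any real
component); build it with gcc -g -O2 and poke at it with GDB, then
compare against -g -O0:

    static int scale(int x)
    {
        int tmp = x * 3;   /* At -O2 the compiler will likely fold
                              `tmp' away entirely, so asking GDB to
                              print it gets you an "optimized out"
                              answer, or a stale value, instead of
                              the value the source suggests.       */
        return tmp + 1;
    }

    int main(void)
    {
        return scale(13);  /* Depending on settings, scale() may be
                              inlined, so a breakpoint set on it
                              never seems to hit, and stepping
                              appears to skip the call entirely.   */
    }

Neither tool is doing anything wrong here, but without a hint from
the debugger, the user has no way to distinguish this from a genuine
bug in the program.
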
Although my first reaction was to say "bleah", upon reflection I wondered
if I was too easily dismissing the concerns of real users of the tools.
This sort of thing routinely confuses users; even with years of GNU
experience, I still find myself being caught by surprise because I've
forgotten that part of the code was optimized. A simple warning from the
debugger would have saved me some time and trouble.
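
The sort of per-function notice I have in mind might read something
like this (invented output, not anything GDB actually prints today):

    (gdb) print tmp
    Note: function scale comes from code compiled with optimization;
          printed values and line numbering may not match the source.
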
If there's consensus that this would be a worthwhile addition, I'll
work up a specific design and publish it.

Stan