Break at address on darwin

Jeffrey Walton
Tue Aug 2 19:00:00 GMT 2011

Hi Ben,

On Tue, Aug 2, 2011 at 12:46 PM, Ben L. Titzer <> wrote:
> I am generating very simple Mach-O binaries by hand without symbol
> information and trying to debug them with gdb by setting breakpoints
> at various addresses. However, the breakpoints I set do not fire,
> though I am certain those addresses are being executed (program runs
> to completion, I can put in illegal instructions and they trap in gdb,
> my program makes system calls that output to stdout, etc).
Are you certain you are setting the breakpoint on an address (e.g., b
*0x40000000; gdb needs the leading asterisk for an address location)?
Or is it a symbolic name (b main)?
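For instance (the address below is made up for illustration), the two
forms look like this in a gdb session; without symbol information the
symbolic form fails outright:

```
(gdb) break *0x40000000
Breakpoint 1 at 0x40000000
(gdb) break main
Function "main" not defined.
```

If you omit the asterisk, gdb treats the argument as a linespec
(file/line or function name) rather than a raw address.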

> When I debug other binaries (e.g. generated by gcc), I am able to set
> breakpoints at various addresses and they fire in gdb no problem.
> Even though my binaries load and run correctly, producing the correct
> output, gdb breakpoints don't work. If I explicitly insert an int3
> instruction, a gdb breakpoint does occur.
> I have a feeling that I am missing some step that is required by gdb,
> such as setting an attribute or adding an extra section to my binary,
> but I don't know what.
> [SNIP]
When I have issues, it is typically because I have optimizations
enabled and I set a symbolic breakpoint which is never hit (despite
what the break command reports). Taking optimizations back to -O0
usually resolves the issue for me.

For what it's worth, I don't find this sort of thing to be an Apple/Mac
OS X issue. It can happen anywhere the optimizer works its magic.


More information about the Gdb mailing list