GDB does not seem to handle bit fields well: `ptype` and `sizeof` on a bit-field member report the declared type (`unsigned int`) and its full size, not the bit width.

ahalder@ahalder-VirtualBox:~/test$ gdb ./a.out
GNU gdb (GDB) 7.3.50.20110921-cvs
Copyright (C) 2011 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.
Type "show copying" and "show warranty" for details.
This GDB was configured as "i686-pc-linux-gnu".
For bug reporting instructions, please see:
<http://www.gnu.org/software/gdb/bugs/>...
Reading symbols from /home/ahalder/test/a.out...done.
(gdb) b main
Breakpoint 1 at 0x804839a: file test.c, line 23.
(gdb) r
Starting program: /home/ahalder/test/a.out

Breakpoint 1, main (argc=1, argv=0xbffff474) at test.c:23
23          return (SUCCESS);
(gdb) ptype bf
type = struct BitField_t {
    unsigned int f1 : 1;
    unsigned int f7 : 7;
}
(gdb) ptype bf.f1
type = unsigned int
(gdb) ptype bf.f7
type = unsigned int
(gdb) p sizeof(bf)
$1 = 4
(gdb) p sizeof(bf.f1)
$2 = 4
(gdb) p sizeof(bf.f7)
$3 = 4
(gdb)
I'm not sure this is really a bug. In C at least, bit fields occupy an odd corner of the type system: a bit-field member doesn't really have a type of its "own" (the width is attached to the member, not to the type, so the declared type is just unsigned int), and here GDB tries to model what C does. Note also that standard C forbids applying sizeof to a bit-field member at all (C11 6.5.3.4p1), so GDB answering with the size of the declared type is a reasonable extension rather than a wrong answer.