[Bug python/12533] New: gdb consumes large memory when millions gdb.values are created
joachim.protze at zih dot tu-dresden.de
sourceware-bugzilla@sourceware.org
Wed Mar 2 11:36:00 GMT 2011
http://sourceware.org/bugzilla/show_bug.cgi?id=12533
Summary: gdb consumes large memory when millions of gdb.Values are created
Product: gdb
Version: HEAD
Status: NEW
Severity: normal
Priority: P2
Component: python
AssignedTo: unassigned@sourceware.org
ReportedBy: joachim.protze@zih.tu-dresden.de
For my current pretty-printer project I collect some metadata in inferior
space. This metadata is used to decide how certain values, especially
send/recv buffers, should be formatted for printout. In some cases it is
necessary to evaluate metadata for each single value in a buffer, which can
be 10k+ values.
I observed gdb consuming hundreds of MB of memory and analyzed the source
of the problem with valgrind + massif: most of the memory is allocated
within allocate_value().
What I don't understand is why the value structures are not freed while the
command is running. Is this an issue of the Python interpreter, or an issue
of the gdb Python API?
Is there a way to get the values freed?
-----------------------------------------
Short example to reproduce this behaviour (for machines with less than 2 GB
of RAM, use smaller iteration counts!):
(gdb) python v=gdb.parse_and_eval("1")
(gdb) python
>for i in range(1000000):
> v+=1
>end
(gdb) python
>for i in range(1200000):
> v+=1
>end
(gdb)
gdb consumes 600 MB after the 1.0M iterations and 780 MB after the 1.2M
iterations.
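The growth pattern above is consistent with gdb keeping every intermediate
gdb.Value alive on an internal per-command list until the command finishes,
which would match the massif data pointing at allocate_value(). The sketch
below is plain Python, not gdb code: the names value_chain, parse_and_eval,
and add are hypothetical stand-ins used only to illustrate how such a
retention pattern makes memory grow linearly with the iteration count.

```python
# Hypothetical model of an internal "value chain": every value created
# during a command is appended and nothing is released until the command
# completes, so intermediates from v += 1 all stay referenced.
value_chain = []  # stands in for gdb's per-command value list

def parse_and_eval(expr):
    """Create a value and record it on the chain (illustrative only)."""
    v = int(expr)
    value_chain.append(v)
    return v

def add(v, n):
    """Addition creates a new value; the intermediate is recorded too."""
    result = v + n
    value_chain.append(result)
    return result

v = parse_and_eval("1")
for i in range(1000):
    v = add(v, 1)  # each iteration leaves one more entry on the chain

# None of the intermediates are freed while the "command" runs:
print(len(value_chain))  # 1001 entries still referenced
```

Under this model, the only way memory is reclaimed is when the whole chain
is dropped at the end of the command, which would explain why looping a
million times inside a single `python` block keeps hundreds of MB live.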