This is the mail archive of the pthreads-win32@sourceware.org mailing list for the pthreads-win32 project.



semaphores and handle leaks


Hello all,

I've spent the last couple of days redesigning part of my application to work around what seems like a handle leak when using semaphores. With my previous design, handles were leaking very rapidly; with my new design the leak is much slower but still troubling. I'll give the gist of my application here, so if I'm doing something obviously wrong maybe one of you can point it out to me. Then I'll go back to trying to make a small sample program which exhibits the bug.

My application is a DLL to be called from a LabVIEW application or from a C or C++ test program.

I'm using GCC and MinGW32:

L:\cpp\FrontEndControl2>g++ -v
Reading specs from C:/system/mingw/bin/../lib/gcc/mingw32/3.4.2/specs
Configured with: ../gcc/configure --with-gcc --with-gnu-ld --with-gnu-as --host=mingw32 --target=mingw32 --prefix=/mingw --enable-threads --disable-nls --enable-languages=c,c++,f77,ada,objc,java --disable-win32-registry --disable-shared --enable-sjlj-exceptions --enable-libgcj --disable-java-awt --without-x --enable-java-gc=boehm --disable-libgcj-debug --enable-interpreter --enable-hash-synchronization --enable-libstdcxx-debug
Thread model: win32
gcc version 3.4.2 (mingw-special)


I've got the latest pthreadGC2.dll and libpthreadGC2.a from http://sources.redhat.com/pthreads-win32/

There are several "monitor" threads. Each one creates semaphores on the stack, around 5-15 times every 100 ms:

sem_t synchLock;
sem_init(&synchLock, 0, 0);   // semaphore lives on the monitor thread's stack
// Request the monitor transaction; &synchLock is posted when the data arrives:
monitor(AMBSI_TEMPERATURE, dataLength, data, &synchLock, &timestamp, &status);
sem_wait(&synchLock);         // block until the transaction completes
sem_destroy(&synchLock);


The pointer to the semaphore is put along with other data in a queue.
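For context, here is a rough sketch of how the queue entries refer to that semaphore. Only completion_p and synchLock_p match my real code; the other names and fields are illustrative stand-ins:

#include <semaphore.h>

// Illustrative only -- the real structures carry more fields.
struct CompletionInfo {
    sem_t *synchLock_p;    // caller's stack semaphore, posted when data is ready
    // timestamp, status, ...
};

struct Message {
    CompletionInfo *completion_p;  // so later code can reach synchLock_p
    // CAN payload, dataLength, ...
};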

In my original design, a new thread was launched for every item in the queue. Each of these threads would save the pointer to the caller's semaphore, create a new one on its local stack, substitute it for the caller's, and then add the data to a queue for transmission on a CAN bus. Once the message has been sent, the CAN bus handler calls sem_post:

// Create a unique semaphore sem2:
sem_t sem2;
sem_init(&sem2, 0, 0);
// Substitute sem2 for the semaphore in the caller's completion structure:
sem_t *sem1 = msg.completion_p->synchLock_p;
msg.completion_p->synchLock_p = &sem2;


// Send the message:
canBus_mp->sendMessage(msg);
// Wait on sem2 until the CAN bus handler posts it:
sem_wait(&sem2);
sem_destroy(&sem2);


// [ make a local copy of the data]

// Put back the caller's semaphore, if any, and signal on it:
msg.completion_p->synchLock_p = sem1;
if (sem1)
    sem_post(sem1);


// [ log the transaction ]

The idea here is that this thread can take all the time it needs to log the transaction to a database without holding up the caller's thread. As I said, this design was leaking handles at a rapid clip: about 2 handles leaked per message, hundreds every second. Using gdb I traced the leaks to the sem_init calls.

Since creating all those threads was kind of a dumb design, I've changed it to a more conventional one. Now, instead of one thread per message, there is a single worker thread and a circular buffer for holding the messages. It still works in basically the same way, though: a fixed number of semaphores are preallocated and sem_init-ed in the buffer, and these are substituted for the caller's semaphore as above.
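In outline, the worker-side pooling looks like this. BUFFER_SIZE and the array name are illustrative, and the real buffer has locking around its indices:

#include <semaphore.h>

#define BUFFER_SIZE 64               // illustrative circular buffer depth

static sem_t slotSem[BUFFER_SIZE];   // one preallocated semaphore per slot

void initBuffer()
{
    // sem_init once up front; the semaphores are reused for every message
    // rather than being created and destroyed per transaction.
    for (int i = 0; i < BUFFER_SIZE; ++i)
        sem_init(&slotSem[i], 0, 0);
}

// Each dequeued message then gets slot i's semaphore substituted for the
// caller's, exactly as in the per-thread code above.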

This design still leaks handles, only much more slowly. At full load of >300 messages/second, it leaks 6 to 10 handles per second. At a reduced load of 15 messages every 5 seconds, it leaks 2 handles every 30 seconds or so.

Does anything I'm doing jump out as obviously wrong? I'll try to put together a simple test program that demonstrates this sometime in the next few days.
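In the meantime, the bare-bones test I have in mind is just an init/destroy loop, watching the process handle count in Task Manager while it runs. Whether sem_init/sem_destroy alone leak, or whether the wait/post traffic matters, I don't know yet:

#include <semaphore.h>
#include <stdio.h>

int main(void)
{
    sem_t s;
    for (long i = 0; ; ++i) {
        sem_init(&s, 0, 0);
        sem_destroy(&s);
        if (i % 100000 == 0)
            printf("%ld iterations\n", i);   // check the handle count here
    }
    return 0;
}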

Thanks for your consideration.

-Morgan McLeod
Software Engineer
National Radio Astronomy Observatory
Charlottesville, Va


