Can Cygwin's gettimeofday() count to 30us resolution instead of 10ms?

Wong, Homer
Tue Mar 21 21:59:00 GMT 2000


Can Cygwin's gettimeofday() measure down to Linux's 30us (microsecond) timer
resolution instead of Windows' 10 ms (millisecond) timer resolution? I used
the following program to test gettimeofday():

#include <time.h>
#include <stdio.h>
#include <sys/time.h>

int main()
{
        struct timeval tv;
        int i;

        for (i = 0; i < 1000; i++) {
                gettimeofday(&tv, (struct timezone *) 0);
                printf("%f \n", tv.tv_sec * 1000000.0 + tv.tv_usec);
        }
        return 0;
}

When I compiled it using Cygwin B20.1 on a Pentium II ??? Windows NT box,
the (annotated) output was:

217722273000.000000 (repeated ??? times)
217722283000.000000 (repeated 287 times)
217722293000.000000 (repeated 287 times)
217722303000.000000 (repeated 287 times)
217722313000.000000 (repeated ??? times)

Basically, the time was only advancing at 273 ms, 283 ms, 293 ms, etc., i.e.
about every 10 ms.

When I compiled it using gcc on a Pentium II ??? Red Hat Linux 4.2 box, I got
1000 unique time values. This was much better, measuring at 374.555 ms,
374.954 ms, 374.986 ms, 375.010 ms, i.e. about every 30 us.

The Linux code for gettimeofday() is in
/usr/src/linux/arch/i386/kernel/time.c, around line 250.

Homer Wong,
Telstra, Australia
