slow handling of large sets of files?

Dave Korn
Mon Jan 10 15:11:00 GMT 2005

> -----Original Message-----
> From: cygwin-owner On Behalf Of Lester Ingber
> Sent: 10 January 2005 14:36

> I don't understand why handling large data sets is so slow under
> my Cygwin installation.
> I have had this problem as far back as I can remember with Cygwin.
> Executables built under gcc (without large-file I/O) seem fine.
> Below I include just the top part of `cygcheck -s`.
> To be specific, I have a tarred-and-gzipped directory of about 5300 files:
> % ls -l DATA.tar.gz 
> [...] 92303 Jan 10 05:27 DATA.tar.gz
> which can be retrieved using a browser or just `wget` from
> I tested the following commands:
> (1) gzcat DATA.tar.gz | tar xBpf -
> (2) rm -rf DATA

  Can't reproduce.

DKAdmin@ubik ~> time gunzip -c DATA.tar.gz | tar xBpf -

real    0m5.242s
user    0m1.101s
sys     0m3.144s
DKAdmin@ubik ~> time rm -rf DATA

real    0m2.947s
user    0m0.620s
sys     0m2.133s
DKAdmin@ubik ~> time tar xfzp DATA.tar.gz

real    0m6.453s
user    0m1.051s
sys     0m3.635s
DKAdmin@ubik ~> time rm -rf DATA

real    0m2.890s
user    0m0.520s
sys     0m2.163s
DKAdmin@ubik ~>
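For anyone who wants to repeat the comparison, here is a minimal sketch of the same two steps. The file count and names are illustrative stand-ins, not the poster's actual 5300-file archive:

```shell
#!/bin/sh
# Build a throwaway archive of many small files, then time the same
# two operations from the original report: piped extraction and rm -rf.
set -e

mkdir -p DATA
i=1
while [ "$i" -le 200 ]; do
    echo data > "DATA/file$i"
    i=$((i + 1))
done
tar czf DATA.tar.gz DATA
rm -rf DATA

time sh -c 'gunzip -c DATA.tar.gz | tar xBpf -'   # extraction step
time rm -rf DATA                                  # removal step
```

If the wall-clock (`real`) time dwarfs `user` + `sys` on your machine, the time is going somewhere outside the process itself, which usually points at the filesystem rather than at Cygwin's syscall layer.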

  You _sure_ you aren't accidentally doing this on a remote network drive?
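A quick way to check is to see which filesystem actually backs the working directory; per-file network round-trips easily dominate timings like those above. This is a portable sketch, with the Cygwin-specific checks noted in comments:

```shell
# Which filesystem backs the current directory?  A remote mount here
# would explain slow per-file operations.
# (On Cygwin you could also run `cygpath -w .` -- a \\server\share
# result means a network share -- or inspect the `mount` table.)
df -P . | tail -n 1
```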

Can't think of a witty .sigline today....
