This is the mail archive of the xsl-list@mulberrytech.com mailing list.



RE: Improving performance for huge files with XSLT


Yes, it's a problem. Every XSLT processor builds a tree of the source
document in memory, and this tree will often occupy about 100 bytes per node
in the document: at 100,000 nodes that is already around 10MB for the source
tree alone, before the result tree is built.

One solution, if you only need to access part of the data, is to write a SAX
filter that subsets it on the way into the XSLT processor.
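
For example, a filter along these lines could sit between the parser and the
stylesheet, using the JAXP/TrAX API. The element being discarded ("bulkData"),
the keep/discard rule, and the file names are placeholders for whatever your
own data looks like:

import javax.xml.parsers.SAXParserFactory;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXSource;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;
import org.xml.sax.XMLReader;
import org.xml.sax.helpers.XMLFilterImpl;

// Passes through only the events the stylesheet needs, so the processor
// never builds the discarded subtrees in its source tree.
public class SubsetFilter extends XMLFilterImpl {

    private int skipDepth = 0;   // > 0 while inside a discarded subtree

    public SubsetFilter(XMLReader parent) {
        super(parent);
    }

    private boolean wanted(String localName) {
        // Placeholder rule: drop every <bulkData> subtree.
        return !"bulkData".equals(localName);
    }

    public void startElement(String uri, String localName, String qName,
                             Attributes atts) throws SAXException {
        if (skipDepth > 0 || !wanted(localName)) {
            skipDepth++;                    // swallow element and its content
        } else {
            super.startElement(uri, localName, qName, atts);
        }
    }

    public void endElement(String uri, String localName, String qName)
            throws SAXException {
        if (skipDepth > 0) {
            skipDepth--;
        } else {
            super.endElement(uri, localName, qName);
        }
    }

    public void characters(char[] ch, int start, int length)
            throws SAXException {
        if (skipDepth == 0) {
            super.characters(ch, start, length);
        }
    }

    public static void main(String[] args) throws Exception {
        SAXParserFactory spf = SAXParserFactory.newInstance();
        spf.setNamespaceAware(true);
        XMLReader reader = spf.newSAXParser().getXMLReader();

        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource("transform.xsl"));
        t.transform(new SAXSource(new SubsetFilter(reader),
                                  new InputSource("huge-input.xml")),
                    new StreamResult("output.xml"));
    }
}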

Saxon has an extension <saxon:preview>, which processes the document one
subtree at a time. This is rather messy but it can sometimes help.
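
Roughly, it is used like this; the element name "item" and the template body
are placeholders, and the exact attribute names should be checked against the
Saxon documentation for the release you are running:

<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                xmlns:saxon="http://icl.com/saxon"
                extension-element-prefixes="saxon"
                version="1.0">

  <!-- Sketch only: each <item> subtree is processed in the "preview" mode
       as soon as it has been read, so it need not be kept in the full
       source tree. -->
  <saxon:preview mode="preview" elements="item"/>

  <xsl:template match="item" mode="preview">
    <summary><xsl:value-of select="@id"/></summary>
  </xsl:template>

</xsl:stylesheet>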

Sebastian Rahtz has published some performance comparisons for various
processors on large XML files.

Mike Kay

> -----Original Message-----
> From: Ornella Piva [mailto:Ornella.Piva@HTA.nl]
> Sent: 13 September 2000 08:14
> To: XSL-List@mulberrytech.com
> Subject: Improving performance for huge files with XSLT
> 
> 
> Hi,
> I'm using XSLT to convert xml files into other xml files. I have to
> convert huge xml files (containing, e.g., 50000/100000 nodes), but the
> performance is becoming a real problem: it takes more or less 20 minutes
> to convert a file with 100000 nodes.
> Are there some general methods to improve the performance with huge xml
> files? Did somebody encounter the same problem? How did you solve it?
> 
> Thanks,
> Ornella Piva


 XSL-List info and archive:  http://www.mulberrytech.com/xsl/xsl-list
