Memory consumption in XSL Transformation

Edouard Mercier

Hi to all;

I would like confirmation of an intuition. I'm working on an XSL
stylesheet that takes an XML file containing about 30,000 XML elements
(let's call them A elements), each element defining a URI to an XML
file.

In my XSL, I read the A elements one by one, read the URI each one
contains, and then analyze the XML file that the URI points to via the
'document()' XPath function (let's call it the B document). This
B-document analysis is basic, and I no longer use the B document once
the A-element iteration is over.

I hope that I was clear enough...

My XSLT implementations are Saxon 6.x, Instant Saxon 6.x, and Xalan
2.5: with all of them, RAM consumption during the transformation grows
so much that I run out of memory after handling about 10,000
A elements.

My guess is that the XML read via 'document()' is never freed by the
transformer. Am I right?

How can I prompt/force the transformer to garbage-collect the
intermediate B documents once they have been read?
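For what it's worth, Saxon provides an extension function aimed at exactly this situation: `saxon:discard-document()` removes a document from Saxon's internal document pool so the parsed tree can be garbage-collected once nothing refers to it. A sketch, assuming a Saxon 6.5-era release where the extension is available (the namespace URI below is the Saxon 6 one, and the `A`/`@href` names are hypothetical):

```xml
<!-- Sketch only: saxon:discard-document() is a Saxon extension,
     not part of standard XSLT 1.0, and is not available in Xalan. -->
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                xmlns:saxon="http://icl.com/saxon">
  <xsl:template match="/">
    <xsl:for-each select="//A">
      <!-- Wrapping document() tells Saxon it may drop the tree
           from its document pool after use -->
      <xsl:variable name="b"
                    select="saxon:discard-document(document(@href))"/>
      <!-- ... analyze $b; the tree becomes collectable once this
           iteration ends ... -->
    </xsl:for-each>
  </xsl:template>
</xsl:stylesheet>
```

Note that the XSLT specification requires `document()` to return the same node for the same URI throughout a transformation, which is why processors keep every loaded document in memory by default; discarding documents trades away that guarantee.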

Thank you very much for your attention.

Regards,

Edouard
 

Edouard Mercier

My guess was right: it seems that whenever you use the 'document()'
function, the memory used for parsing the document is never recovered
later on. Am I right?

Do you know a way of reclaiming this memory? Your help is much
appreciated.

Regards,

Edouard
 
