I am trying to export all my tables to XML with MSXML 4.0 via an Automation variable.
Generally (with Cronus) it works - I get the XML structures I want.
But in my database, with big tables like T32, I run into trouble. It can't handle them in one go, so I have to split them into blocks of 15,000 records and save them step by step. Even this concept works.
My problem now is that after saving a block and clearing all variables, Navision won't give back all the memory it used. So with every pass Navision keeps part of the used memory, until a message comes up that Navision does not have enough memory to continue with the task.
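Roughly, my export loop looks like this (a simplified sketch - the Item table, the element names and the file path are only placeholders for what I really use):

// Variables (sketch): XMLDoc ('Microsoft XML, v4.0'.DOMDocument40),
// RootNode, RecNode, FieldNode ('Microsoft XML, v4.0'.IXMLDOMElement),
// Item (Record), RecCount, BlockNo (Integer).
CREATE(XMLDoc);
RootNode := XMLDoc.createElement('Records');
XMLDoc.appendChild(RootNode);
BlockNo := 1;

IF Item.FIND('-') THEN
  REPEAT
    RecNode := XMLDoc.createElement('Record');
    FieldNode := XMLDoc.createElement('No');
    FieldNode.text := FORMAT(Item."No.");
    RecNode.appendChild(FieldNode);
    RootNode.appendChild(RecNode);

    RecCount := RecCount + 1;
    IF RecCount = 15000 THEN BEGIN
      // save the block and clean up everything - but the memory does not come back
      XMLDoc.save('C:\Export\Block' + FORMAT(BlockNo) + '.xml');
      CLEAR(FieldNode);
      CLEAR(RecNode);
      CLEAR(RootNode);
      CLEAR(XMLDoc);
      CREATE(XMLDoc);
      RootNode := XMLDoc.createElement('Records');
      XMLDoc.appendChild(RootNode);
      RecCount := 0;
      BlockNo := BlockNo + 1;
    END;
  UNTIL Item.NEXT = 0;

IF RecCount > 0 THEN
  XMLDoc.save('C:\Export\Block' + FORMAT(BlockNo) + '.xml');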
Has any of you had a similar problem, and how did you solve it? Or do you have any idea for a solution?
Thanks, Christoph
Comments
One thing you can try is to change the variables from local to global - or vice versa. It helped in earlier versions.
Since you don't need access to the entire XML tree when streaming table data to disk, why not consider using the XMLSax object model, which has a much lower memory footprint?
Here's a little toy to demonstrate (the code compiles and runs, but I've removed a bunch of stuff to make this readable):
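It uses MSXML 4.0's MXXMLWriter as the SAX content handler, so the XML is emitted event by event instead of being built up as a tree first. This is only a minimal sketch under those assumptions - the Item table and the element names are placeholders; adjust it to whatever you export:

// Variables (sketch): SAXWriter ('Microsoft XML, v4.0'.MXXMLWriter40),
// ContentHandler ('Microsoft XML, v4.0'.IVBSAXContentHandler),
// EmptyAttributes ('Microsoft XML, v4.0'.SAXAttributes40), Item (Record).
CREATE(SAXWriter);
CREATE(EmptyAttributes);
SAXWriter.indent := TRUE;
ContentHandler := SAXWriter;  // the writer also implements IVBSAXContentHandler

ContentHandler.startDocument;
ContentHandler.startElement('', 'Records', 'Records', EmptyAttributes);

IF Item.FIND('-') THEN
  REPEAT
    ContentHandler.startElement('', 'Record', 'Record', EmptyAttributes);
    ContentHandler.characters(FORMAT(Item."No."));
    ContentHandler.endElement('', 'Record', 'Record');
  UNTIL Item.NEXT = 0;

ContentHandler.endElement('', 'Records', 'Records');
ContentHandler.endDocument;

// The finished XML can be read from SAXWriter.output; nothing is held in a DOM tree.
// For very large exports, SAXWriter.output can also be pointed at a stream object,
// so the data goes straight to disk instead of into memory.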
Maybe you are using locals instead of globals (that was an issue back in the old days, before Attain).
Maybe you are using record references - there is a memory leak bug in there somewhere.
But fb seems to have a good point!
"If it was hard to write, it should be hard to understand."
My variables are local as much as possible.
Yes, I am using record references. I want to have a look and try to solve it another way.
I will give some feedback when I have any new info.
Thank you all for your help
Christoph