Hi
I need some help.
I have a database in 2.5/2.6 and I have upgraded the objects to 4.0.
Now I need to move the data from the old database to the new one.
I need to move the 20,000,000 records that are in table 17.
So I created a temp table identical to table 17 in the new database, then used a DTS package to load some records into the temp table, and then ran a codeunit that reads the records from the temp table and inserts them into table 17.
It has taken 5 days to insert 4,000,000 records into the new database, and it is still running.
What I want to know is whether there is anything I can do to make the process run faster.
Thanks,
Dina
Comments
On the installation CD there should be a directory named \UPGTK\Doc with information about additional tasks, e.g. filling the dimension tables and other new features in 4.0.
http://www.mibuso.com/dlinfo.asp?FileID=503
No, I'm not upgrading the Navision way. I talked with someone at Navision, and for this special case I was told not to do it.
What I can think of, though, is that you may have to streamline your code a bit and put some COMMIT statements here and there to free up memory. Are you running the process on the server directly or on a client machine? If you are running it on a client, then you are pulling 20,000,000 records through the network to your client computer (I doubt that you have enough RAM for that, so it's probably paging like crazy) and pushing 20,000,000 other ones back to the server.
RIS Plus, LLC
The DTS loads some of the records (not all 20,000,000) into table Temp17 (a table identical to table 17).
Then I run a codeunit:
Temp17.RESET;
IF Temp17.FIND('-') THEN
  REPEAT
    T17 := Temp17;
    T17.INSERT(TRUE);
    COMMIT;
  UNTIL Temp17.NEXT = 0;
When it finishes, I TRUNCATE Temp17 in SQL to remove all the records, then load more records (4,000,000) with the DTS and run the codeunit again.
I'm going to try disabling the secondary keys and inserting fewer records at a time to see if that improves performance.
You may want to code commit points around that, so you don't COMMIT after every insert but after, say, every 500 or every 1,000 records. COMMIT takes up processing power itself, and by only doing it every so many records you can streamline the process.
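A minimal sketch of the commit-point idea, based on the codeunit loop posted above (Counter is an assumed new Integer variable; the batch size of 1,000 is just an example):

Temp17.RESET;
IF Temp17.FIND('-') THEN
  REPEAT
    T17 := Temp17;
    T17.INSERT(TRUE);
    Counter := Counter + 1;
    IF Counter MOD 1000 = 0 THEN // commit once per 1,000 inserts instead of every insert
      COMMIT;
  UNTIL Temp17.NEXT = 0;
COMMIT; // commit whatever is left in the final partial batch

The trailing COMMIT makes sure the last records (the ones after the final full batch of 1,000) are not lost if the codeunit is the end of the transaction.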
Disabling secondary keys may speed things up as well. Don't forget to turn them back on when you are done, though.
RIS Plus, LLC
Disabling the secondary keys worked perfectly. It took one week to insert 4,000,000 records before, and now 1 hour to insert 3,000,000.
Let's see what happens when I turn them back on again.
Now I can't insert a record in table 17. It stops; I have waited half an hour for a single record to insert. What is happening????????