I have a client whose change log has basically gotten out of control: there are currently 300,000 K entries. The client is running 4.0 SP1 on C/SIDE. We are trying to delete some of the older entries, as this is, obviously, resulting in performance issues. The problem is that with so many entries, the report to delete change log entries takes so long that it runs into their nightly processing (nightly imports from an offline system).
I noticed that there are two keys on the table: a) Entry No. and b) Table No., Date/Time. I took a copy of the database and disabled the second key. This let me delete entries faster, but only when I knew the Entry No. range that I wanted to delete. I would do this in the live database, but I'm not sure whether disabling that second key might cause further issues.
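For reference, a minimal C/AL sketch of that Entry No. range approach, assuming the standard Change Log Entry table (405); ChangeLogEntry is a record variable and FromEntryNo/ToEntryNo are placeholder values, not names from the actual report:

    // Delete a known Entry No. range in one ranged operation.
    // "Entry No." is the primary key, so the filter resolves against it
    // directly; note the secondary "Table No.,Date/Time" key still has to
    // be maintained on every delete unless it is disabled.
    ChangeLogEntry.SETCURRENTKEY("Entry No.");
    ChangeLogEntry.SETRANGE("Entry No.",FromEntryNo,ToEntryNo);
    ChangeLogEntry.DELETEALL;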
Any suggestions as to how to tackle this? Many thanks.
Comments
If you are trying to delete entries for a whole year, let's say, then try deleting the entries for the first month, then the second, and so on...
In your case I would add a counter and do a COMMIT every 10,000 or so records when deleting (see the sketch below). The table doesn't need to stay consistent with anything else and isn't linked anywhere, so losing a few wrong ones would not be an issue.
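A minimal C/AL sketch combining the two suggestions above (a month-sized date filter plus a periodic COMMIT), again assuming the standard table 405 and its "Date and Time" field; Counter, FromDate and ToDate are placeholder locals:

    // Delete one month (or less) at a time, committing every 10,000
    // deletions so an aborted run still keeps the work done so far.
    ChangeLogEntry.SETRANGE("Date and Time",
      CREATEDATETIME(FromDate,000000T),CREATEDATETIME(ToDate,235959T));
    IF ChangeLogEntry.FIND('-') THEN
      REPEAT
        ChangeLogEntry.DELETE;
        Counter := Counter + 1;
        IF Counter MOD 10000 = 0 THEN
          COMMIT; // keep the open write transaction small
      UNTIL ChangeLogEntry.NEXT = 0;
    COMMIT; // commit the final partial batch

Applying the date filter itself will still be slow without a key that starts with the date, which a later comment addresses.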
I thought my 9.5 million records were a lot :!:
In addition, if I use the report to delete the entries, it begins counting records using both keys, running through the first and then the second. If there is some way I could avoid this, I would appreciate the input.
Thanks.
Actually, here it would help you if you had a key starting with the date (sketched below).
But this "record counting" appears while the system applies the filters. Once it has applied them, the deletion itself shouldn't take too long, if you specify smaller date intervals...
By the way, how many records do you have?
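To illustrate that key suggestion, a sketch under one assumption: a custom key starting with "Date and Time" has been added to table 405 in the table designer (no such key exists in the standard table), so the date filter maps straight onto a key:

    // Hypothetical: requires a custom "Date and Time" key on table 405.
    ChangeLogEntry.SETCURRENTKEY("Date and Time");
    ChangeLogEntry.SETRANGE("Date and Time",
      CREATEDATETIME(FromDate,000000T),CREATEDATETIME(ToDate,235959T));
    ChangeLogEntry.DELETEALL; // filter now resolves via the date key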
Filtering on Entry No. should work out similar to filtering on the posting date, since entries are entered in chronological order.
AP Commerce, Inc. = where I work
Getting Started with Dynamics NAV 2013 Application Development = my book
Implementing Microsoft Dynamics NAV - 3rd Edition = my 2nd book
He already said that: 300,000,000, which is huge.
Just wanted to make sure DWOpdahl, with "300,000 K entries", really means 300,000,000 and that it's not a typo actually meaning 300,000.
It would be really weird for 300,000 entries to take too long to delete, but I was just curious to clear that up.
Exactly, which is why I assumed that he meant what he posted. If he has 300k entries and it's slow, then there is a serious issue.