Our Warehouse Entry table has 6.1M records, taking up 2.5 GB in our DB. I'd like to compress the entries by year, through 12/31/2010.
Unfortunately, the process is very slow going. Usually when I see something like this, I assume the report I'm running is using a bad key or something.
But the logic in the batch report is a little over my head. It is based on the Warehouse Entry data item (with no key specified), yet the first statement in the Warehouse Entry OnAfterGetRecord trigger is WhseEntry2 := "Warehouse Entry";
From that point on, all the logic works on WhseEntry2, and the report does set an appropriate key for the compression (starting with Item No., Bin No.), but it still runs very slowly.
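To illustrate the pattern, here is my own simplified sketch of what that trigger seems to be doing (this is not the actual report code; the key and field names are approximate, and StartDate/EndDate are just my placeholders for the compression period):

  // Warehouse Entry - OnAfterGetRecord, roughly:
  WhseEntry2 := "Warehouse Entry";
  WhseEntry2.SETCURRENTKEY("Item No.","Bin Code");            // the compression key mentioned above
  WhseEntry2.SETRANGE("Item No.","Item No.");
  WhseEntry2.SETRANGE("Bin Code","Bin Code");
  WhseEntry2.SETRANGE("Registering Date",StartDate,EndDate);  // the year being compressed
  // ...the matching entries are then summed into one new entry and the originals deleted

So each record the report reads triggers another filtered pass over the same table, which is presumably why a missing or mismatched index hurts so much here.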
When I say "slowly," I mean the report runs through a couple hundred item numbers without issue (based on the new/deleted counters climbing), but then it stalls for anywhere from 2 to 10+ minutes.
I am doing this when I am the only user on the system, one year at a time (i.e. I did 1/1/04..12/31/04 by itself, then 1/1/05..12/31/05 by itself, etc.). Up through 2008, we had up to 10,000 records per year, but starting in 2009 we have 1,000,000+ records per year. At the current pace of the 2009 compression, I estimate it will take 50-75 hours to complete.
Am I doing the best I can do? Or is there something wrong with the compression report that others have also experienced (and hopefully have addressed)?
Thank you for your thoughts and ideas.
Comments
This is not to say I've never "compressed" a Warehouse Entry table. I've just taken a more manual approach: abandon all the data.
1. Pick a cutoff point where you have no open warehouse activity.
2. Reclass the inventory to an empty non-bin location.
3. Clear the Warehouse Entry table (see the sketch after this list).
4. Reclass the inventory back.
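For step 3, this is roughly the one-off code I would run in a processing-only codeunit (my own sketch, not a standard object; WhseEntry is a Record variable for the Warehouse Entry table, and it assumes steps 1 and 2 are already done and there is no open warehouse activity):

  // Wipes the whole table - only run after the inventory has been reclassed out
  WhseEntry.RESET;
  WhseEntry.DELETEALL;   // no parameter = FALSE, so no OnDelete triggers fire

A straight DELETEALL like this should be far faster than any record-by-record compression, which is the whole appeal of the approach.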
We have about a 60GB database, and we've identified about 20GB of old data that isn't necessary, spread across about 12 tables. The Warehouse Entry table is one of those 12 we're trying to purge old data from.
G/L compression has worked flawlessly. Deleting posted sales and purchase documents is working well too (but the only way I've figured out to do this efficiently is to run the header table from Object Designer, select the records I want to purge, and delete them from there; I'm guessing there's a batch report or something that should allow this, but I can't find it). We've also purged old warehouse registration documents.
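If anyone wants to avoid the Object Designer route, the same purge can be written as a couple of lines of C/AL in a processing-only object (my sketch, not a standard batch report; SalesInvHeader is a Record variable for the Sales Invoice Header table, and the cutoff date is just an example):

  // Delete posted sales invoices up to the cutoff date
  SalesInvHeader.SETRANGE("Posting Date",0D,20101231D);
  SalesInvHeader.DELETEALL(TRUE);   // TRUE runs each record's OnDelete trigger, which removes the invoice lines as well

(If I remember right, the standard OnDelete trigger also checks that the document has been printed, so it may stop on anything with "No. Printed" = 0.)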
We're about 75% of the way to our goal, but this Warehouse Entry table isn't playing nice like the others.
Thanks for your help.
In response to your question, I've done it both ways. But it's not something I've done a lot of times.
We're doing a DR audit, and we want a smaller database to speed up a restore if necessary. We also hope to give the users a small performance boost (especially those who know just enough to be dangerous and run searches and filters on these tables from the client without respecting a key ... that includes me sometimes too :oops: ).
One lookup we do often when troubleshooting bin history is to bring up all the warehouse entries for an item, across all bins in a location, sorted by entry no. This is performing worse and worse, and I hope purging old records will make it faster for the users. I guess we could add a new key... although that doesn't solve our DR objectives.
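For reference, that lookup boils down to something like this (my sketch; WhseEntry is a Record variable for the Warehouse Entry table, ItemNo and LocationCode are placeholder values, and the key starting with "Item No.","Location Code" is the hypothetical one we would have to add if our version doesn't already have it):

  // Bin history for one item across every bin in a location
  WhseEntry.SETCURRENTKEY("Item No.","Location Code");   // hypothetical key; the compression key starts with "Item No.","Bin Code"
  WhseEntry.SETRANGE("Item No.",ItemNo);
  WhseEntry.SETRANGE("Location Code",LocationCode);
  IF WhseEntry.FINDSET THEN
    REPEAT
      EntryCount := EntryCount + 1;   // placeholder for whatever the lookup actually displays
    UNTIL WhseEntry.NEXT = 0;

Without a key that matches those two filters, the server presumably ends up scanning a large part of the table, which would explain why the lookup keeps getting slower as the table grows.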
Deleting data should not be your first solution to performance issues.
In terms of users picking wrong keys, don't use technology to replace user training.
Independent Consultant/Developer
blog: https://dynamicsuser.net/nav/b/ara3n
That may be a valid solution for that table, but is it valid for the other tables? Also, what happens if transaction volume increases, do you just start deleting more often? I still say that, with a DB this small, deleting data should not be the first generic solution for performance issues, even if there may be table-specific exceptions (like warehouse entries).
Independent Consultant/Developer
blog: https://dynamicsuser.net/nav/b/ara3n
I don't disagree. What I'm talking about is the main stated reason driving this need. If the DR plan is only viable when the DB is maintained at something below 60 GB, then I say the DR plan needs to be rethought.
Compression has its place, and yes, you are looking at the simple ones, but that may lead you to get overconfident and move into dangerous territory that should have a "Do Not Try This At Home" disclaimer attached.