Archiving Item Ledgers (yes, again about that!!!)

cnicola Member Posts: 181
A couple of my clients have some rather big Value Entry tables. One of them is 61 million entries after 3 years of use (total db is 355GB).
Day-to-day performance is acceptable and reporting is fine as well (except Inventory Valuation, which takes 12+ hours to run). So I definitely do not want to make this into a performance thread (they are using SQL 2005, Navision tuning has been done, and the server is powerful, so I think we are good there).
The problem is Adjust Cost. They are using 3.70, so I had to modify it to allow for Item filtering, but even so a full item cycle takes at least a week (running it as a batch job at night). And if they do a revaluation to correct a mistake it takes even longer (they have about 60,000 items).
I have tried a lot of ideas to improve the Adjust Cost speed but without a lot of luck (it is pretty weird code and it is not something you want to rewrite).

Anyway given all of the above the only solution left seems to be reducing the size of the Value Entry table (that will help with Inventory Valuation as well).
The obvious option was the Compress Item Ledger routine, but everyone seems to think it will create problems, though nobody can say specifically what, when, or why. I was planning to run it by day for the first 2 years (which are closed). But given everything I read about the routine on these forums, it seems that is not a good/preferred option.

The other idea I had was the following:

1. As of a certain cutoff date calculate open inventory and then zero out that entire inventory (with negative adjustments).
2. Move all item ledger related data to some other new tables (Item Ledger, Value Entry, Item Application Entry).
3. Restore opening balances with quantity and cost as of that date (with positive adjustments).
4. Hopefully continue operating as usual but with far fewer entries to deal with.
5. Change reports to take into consideration the new tables.
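The steps above can be sketched roughly like this in Python, using a simplified in-memory ledger (all field names here are hypothetical; in NAV the negative/positive adjustments would of course be posted through the item journal, not computed like this):

```python
from collections import defaultdict
from datetime import date

def archive_ledger(entries, cutoff):
    """Split item ledger entries at a cutoff date (steps 1-4).

    entries: list of dicts with 'item', 'date', 'qty', 'cost'.
    Returns (archive, live): 'archive' holds everything up to the
    cutoff (step 2: moved to the new tables); 'live' starts with one
    positive adjustment per item restoring quantity and cost as of
    the cutoff (step 3), followed by all later entries.
    """
    balance = defaultdict(lambda: {"qty": 0, "cost": 0.0})
    archive, later = [], []
    for e in entries:
        if e["date"] <= cutoff:
            balance[e["item"]]["qty"] += e["qty"]
            balance[e["item"]]["cost"] += e["cost"]
            archive.append(e)
        else:
            later.append(e)
    # Step 3: restore opening balances with positive adjustments.
    openings = [
        {"item": item, "date": cutoff, "qty": b["qty"], "cost": b["cost"]}
        for item, b in sorted(balance.items()) if b["qty"] != 0
    ]
    return archive, openings + later
```

The key point the sketch makes visible: everything before the cutoff collapses into one opening entry per item, which is where the table shrinkage comes from.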

I realize it is a rather drastic approach so I guess my question is whether you see any issues with the above process or whether you can think of any possible side effects.

In theory the 5.0 and 2009 costing routines are more streamlined, but with the current size of those tables and the idiotic (in my humble opinion) new addition of the link between posted docs and item ledger/value entries, I doubt we would be able to upgrade a db of this size within even a 4-day weekend.

P.S. To avoid rewriting the reports, one variant on the idea would be to post back a summarized version of the transactions (say one entry per month, per customer/vendor, per entry type and whatever else may be relevant).
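That kind of summarization could look something like this (a sketch with hypothetical field names, dates kept as ISO strings for simplicity):

```python
from collections import defaultdict

def summarize(entries):
    """Collapse archived entries into one row per
    (year-month, source no., entry type), totalling qty and cost."""
    buckets = defaultdict(lambda: {"qty": 0, "cost": 0.0})
    for e in entries:
        # e["date"] is "YYYY-MM-DD"; the first 7 chars give the period.
        key = (e["date"][:7], e["source_no"], e["entry_type"])
        buckets[key]["qty"] += e["qty"]
        buckets[key]["cost"] += e["cost"]
    return [
        {"period": k[0], "source_no": k[1], "entry_type": k[2], **v}
        for k, v in sorted(buckets.items())
    ]
```

Existing reports would then keep working as long as they only group by fields preserved in the summary key.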
Apathy is on the rise but nobody seems to care.

Comments

  • TonyH Member Posts: 223
    Are all the items still in use? (Could you delete a subset?)
    If you're unsure about the compression routine, try it on a test database somewhere? I think the concern is that the compress routines are created with std NAV in mind with zero mods, so as soon as you put a mod in the system, what are the ramifications? Every mod is different, so it's hard to give a definitive answer.

    Another "drastic" approach would be to treat it like an initial implementation. Delete the whole lot and start with opening balances... Old postings will no longer link (Navigate etc...), and it is VERY drastic... but theoretically it would start you with a clean slate and speed up the ACIE.

    But all these are short- to medium-term fixes for a problem that WILL come around again....

    You need a solution not a band aid.

    My thoughts would be to modify the ACIE.
    Allow it to run batches of items at a time.
    Allow those batches to only be populated with items that have had an ILE/VE "change" since the last time you ran the ACIE for that item... Hopefully that should mean the ACIE is only doing necessary work.
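    The batching idea above might be sketched like this (hypothetical entry structure; in NAV this would filter on ILE fields rather than dicts):

```python
def items_to_adjust(ledger_entries, last_run_entry_no):
    """Return the set of items with ledger activity since the last
    ACIE run, so the batch only touches items that actually changed."""
    return {e["item"] for e in ledger_entries
            if e["entry_no"] > last_run_entry_no}

def batches(items, size):
    """Split the changed items into fixed-size batches so the ACIE
    can be run (and committed) a chunk at a time overnight."""
    items = sorted(items)
    return [items[i:i + size] for i in range(0, len(items), size)]
```

The last processed entry no. would be stored somewhere persistent (e.g. a setup table) between runs.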

    t
  • bbrown Member Posts: 3,268
    Personally I avoid the date compress routines like the plague. I've just seen too many issues from them.
    There are no bugs - only undocumented features.
  • David_Singleton Member Posts: 5,479
    The compression routines simply don't work. Forget them. Even if you did run them, they would have virtually no effect, at least not the one you need.

    So you need to do this yourself.

    The reason for all the warnings is that often people do this without enough experience. I have done ILE and VLE compressions for clients, but using procedures built specifically for their implementation. You have enough experience and have been doing this long enough to do it; the thing is the cost. There is no way you can fully test this, so it's going to be ongoing, and the customer has to be aware of this fact.

    The ideas you have are things I have tried and they do work. You have to be ready to delete the Item Ledger and Value Ledger entries and know what repercussions this will have.

    But step one is to make a fake database and benchmark just how much faster it will be. Your client will not be happy to pay for 2 months' work only to find that the speed merely doubles; they will want Adjust Cost to go from 1 week to 1 hour.

    Personally though I would just throw away the costing engine in its entirety and write a custom one; it will be cheaper and more reliable.
    David Singleton
  • David_Singleton Member Posts: 5,479
    By the way, I have a client on 3.60 objects / 4.00sp3 executables with a much bigger database, and they run cost adjust in a few hours each night. 1 week as a batch does sound excessive unless it's manufacturing.
    David Singleton
  • abhaydalal8722 Member Posts: 27
    David_Singleton wrote:
    Personally though I would just throw away the costing engine in its entirety and write a custom one; it will be cheaper and more reliable.

    I think so too, but has it been tried? Also, is the costing engine made by Microsoft alone? Has no vertical or add-on been made for this?
  • David_Singleton Member Posts: 5,479
    abhaydalal8722 wrote:
    Personally though I would just throw away the costing engine in its entirety and write a custom one; it will be cheaper and more reliable.

    I think so too, but has it been tried? Also, is the costing engine made by Microsoft alone? Has no vertical or add-on been made for this?

    Hynek Muhbacher and I wrote an Add-On that did this. After 3.xx and the introduction of value entries, though, there was no market, because it was similar to the way value entries work, except that revaluation was much simpler.

    The thing is that the costing engine in Navision does everything, and therefore does too much. If you know the business you can trim it down and write a custom costing engine that only does what you want. Retrospective inventory value reports to any date; specific costing by serial number; the ability to backdate cost adjustments to a closed period; average cost: these things and more mean that every customer is carrying unnecessary baggage on every inventory transaction.

    Unfortunately the marketing team at Microsoft when asked about Navision in large implementations, will simply say "That's what Axapta is for", so there really is not the motivation to "fix" a problem that they feel they already have a solution to.

    Oh, and I hear people saying, "So why don't you write a better one?" The answer is simply that people won't pay for it.
    David Singleton
  • cnicola Member Posts: 181
    Couple of answers:

    @ TonyH: The routine is meant to be reused (probably every 2-3 years, or more often if we get confident about it), so it would not be a band-aid. Deleting items does not delete ILEs as far as I know. And Adjust Cost only updates the items that have had movement since the last time the routine was run.
    @ David: While my client might pay for it, I do not want to write my own costing routine (or have someone else do it) for the same reason I am reluctant to try the compression routine: how do I know if it works or not? It may seem to work fine at the beginning, but you may not see the issues for a while. And with average costing (which is what my clients are using) it could simply generate wrong(ish) numbers. If you are right, things will be better (maybe). If you are wrong, then we will be out of business and the client will be too.
    Also regarding db size and performance: they are distributors for products from China. They use average cost and they post on average 100k-150k sales lines/week (and double or more during busy season). Given that we use average cost and also Item Charges on the purchases, that creates a decent number of ledger entries to update.

    Anyway, as I said at the beginning I was wondering if anyone sees anything wrong with wiping out the slate and starting clean.
    As an alternative, I like the second idea of basically doing my own compression by taking out the entries and putting in a summary.
    Apathy is on the rise but nobody seems to care.
  • TonyH Member Posts: 223
    cnicola wrote:
    Couple of answers:

    @ TonyH: And Adjust Cost only updates the items that have had movement since the last time the routine was run.

    Correct. It only "Updates" items that have had a movement... but the routine is also used to find out what items to update.

    Since you can set this at the time a "movement" occurs, you can limit the ACIE to concentrate on only the items that have had a movement, instead of allowing the ACIE to run through the ILE to figure out whether each of your 60,000 items needs adjustment or not.
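    In other words: flag the item at posting time and let the batch consume the flags, rather than scanning the ledger. A sketch of that shape (later NAV versions take a broadly similar approach with a flag on the Item; the structures here are hypothetical):

```python
# Items flagged as needing cost adjustment, maintained at posting time.
dirty_items = set()

def post_item_ledger_entry(entry):
    """At posting time, record that this item needs adjustment.
    This is the cheap part: one set insert per posted entry."""
    dirty_items.add(entry["item"])

def run_acie():
    """Adjust only the flagged items, then clear the flags.
    Returns the worklist so the caller can see what was touched."""
    work = sorted(dirty_items)
    dirty_items.clear()
    return work
```

The ACIE then never looks at items that haven't moved, no matter how large the ILE grows.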

    t
  • dspasov Member Posts: 33
    Hello,
    We also had a very big DB. We do not do costing because we are a 3PL company, but the growing Item Ledger Entry, Value Entry and Warehouse Entry tables were a problem because of reporting to our customers.
    What we do:
    Regularly run a batch which finds inactive items and moves the related rows in Item Ledger Entry, Item Register, Warehouse Entry, Warehouse Register, Item Application Entry, Item Entry Relation, Whse. Item Entry Relation, Tracking Specification, Registered Whse. Activity Hdr. and Registered Whse. Activity Line to a mirror archive DB.
    Value Entry I delete up to a date.
    In that way I reduced the DB from 300GB to 30GB.
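    A sketch of such a move-to-archive batch, with in-memory stand-ins for the tables (the real routine would copy and delete records table by table via C/AL; names and structures here are hypothetical):

```python
def move_inactive(tables, is_inactive, archive):
    """Move rows belonging to inactive items from each live table to
    a same-named table in the archive, deleting them from the live
    set. Returns the number of rows moved."""
    moved = 0
    for name, rows in tables.items():
        keep = []
        for row in rows:
            if is_inactive(row["item"]):
                archive.setdefault(name, []).append(row)
                moved += 1
            else:
                keep.append(row)
        tables[name] = keep
    return moved
```

The important property is that the row moves to the archive atomically with its deletion from the live table, so a row is never in both (or neither) database.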