General Ledger - Calculate Detailed Balance works too slow

DeSp Member Posts: 105
We use Navision 3.70A SQL Server Option.
On the G/L Corresp. General Ledger form we have a checkbox called Calculate Detailed Balance. This calculation runs very, very slowly - far too slow. It is standard code and we haven't made any customizations there.
I tried to play with the SIFTLevelsToMaintain property of the keys in the G/L Entry table, but without any significant result.
So what is the cure? How can we boost the performance there?

Any suggestions would be very much appreciated.
Thanks in advance.
Nil desperandum

Comments

  • kine Member Posts: 12,562
    1) You can use Client Monitor and the Extended client monitor objects for finding what is too slow. (bad used key, cache is not used correctly, full scans etc.)
    2) You can monitor HW to see if there are some bottlenecks
    Kamil Sacek
    MVP - Dynamics NAV
    My BLOG
    NAVERTICA a.s.
  • DeSp Member Posts: 105
    I use the Client Monitor toolset. Client Monitor (Key Usage) and Client Monitor (Cache Usage) show no records - so everything is all right with key and cache usage, isn't it? Or have I missed something?
    As for the hardware monitoring, we have some performance counters - which of them should we pay close attention to in this case?

    Turning off some SIFT levels of some SIFT keys gives a slight performance improvement, but it is not enough. Maybe there is some kind of SIFT-level tuning for this task?
    I should say again that this is standard code, so it ought to work fast, but it doesn't.
    Nil desperandum
  • kine Member Posts: 12,562
    DeSp wrote:
    I use the Client Monitor toolset. Client Monitor (Key Usage) and Client Monitor (Cache Usage) show no records - so everything is all right with key and cache usage, isn't it? Or have I missed something?
    You need to collect the data through the standard Client Monitor and after that run the Extended Client Monitor form - it transforms the data from the Client Monitor into the tables of this Extended CM (you can check the tables to verify that there is some data).
    As for the hardware monitoring, we have some performance counters - which of them should we pay close attention to in this case?
    The first step is to watch CPU, HDD (read and write queue lengths), network...
    I should say again that this is standard code, so it ought to work fast, but it doesn't.

    The fact that it is standard code is not enough to make it fast code. Navision standard code is not optimal.
  • DeSp Member Posts: 105
    You need to collect the data through the standard Client Monitor and after that run the Extended Client Monitor form - it transforms the data from the Client Monitor into the tables of this Extended CM (you can check the tables to verify that there is some data).
    What is the Extended Client Monitor? I only have a Client Monitor.fob here, consisting of several objects: Client Monitor, Client Monitor (Key Usage), Client Monitor (Cache Usage), Client Monitor (Multi-User) and so on.
    I do collect the data through the standard Client Monitor and then run the Client Monitor form (from the Client Monitor.fob) - it transforms the Client Monitor data into the tables. Then I run the Client Monitor (Key Usage) and (Cache Usage) forms - they show no records, so I conclude that everything is fine with key and cache usage there.

    This code mainly calculates FlowFields - maybe there is a way to tune the SIFT tables for better CALCFIELDS performance somehow?
    Nil desperandum
  • kine Member Posts: 12,562
    By the Extended Client Monitor I meant the objects in the .fob you wrote about... I wanted to be sure that there is some data when you open the Client Monitor form from this .fob - there you can see the elapsed times and find which actions take most of the time. A good approach is to export the data, import it into Excel and use a pivot table to show the total time per table or action, to see where the bottleneck is.
  • DeSp Member Posts: 105
    G/L Entry is the bottleneck table, with the CALCSUMS action. The elapsed time of this action reaches hundreds of milliseconds per call - far too slow. How can we improve it?
    Nil desperandum
  • kine Member Posts: 12,562
    You can enable SumIndexFields for the calculated fields on an appropriate key. Select a key that has good selectivity for the filters you are using, and on this key use SIFT to precalculate the sums (on the base table from which you are calculating the sum).
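    For illustration, a key of the kind described above would look roughly like this in the G/L Entry table designer (a sketch only - the exact key fields and SumIndexFields depend on your localization and on the FlowFields you actually calculate):

    // Sketch: a SIFT-enabled key in table 17 "G/L Entry".
    // The key starts with the most selective filtered field; with
    // MaintainSIFTIndex = Yes, CALCSUMS/CALCFIELDS can read the
    // precalculated sums instead of scanning the entry rows.
    Key               : "G/L Account No.,Posting Date"
    SumIndexFields    : Amount,"Debit Amount","Credit Amount"
    MaintainSQLIndex  : Yes
    MaintainSIFTIndex : Yes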
  • DeSp Member Posts: 105
    Thank you for your assistance, kine!

    Ok, here is the scope:

    This code runs from form 12403 G/L Corresp. General Ledger when we tick the Calculate Detailed Balance checkbox (it works very slowly - that is the issue):

    CalculateBeginPeriodAmount()
    GLAccount.CALCFIELDS("Balance at Date","Add.-Currency Balance at Date","Credit Balance at Date","Debit Balance at Date");
    .......
    GLAccount.CALCFIELDS("Credit Balance at Date","Debit Balance at Date");
    .......
    

    These FlowFields have the appropriate SumIndexFields in the G/L Entry table and a Table Filter: "G/L Account No.=FIELD(No.),G/L Account No.=FIELD(FILTER(Totaling)),Business Unit Code=FIELD(Business Unit Filter),Global Dimension 1 Code=FIELD(Global Dimension 1 Filter),Global Dimension 2 Code=FIELD(Global Dimension 2 Filter),Posting Date=FIELD(UPPERLIMIT(Date Filter)),Agreement No.=FIELD(Agreement Filter),Source Type=FIELD(Source Type Filter),Source No.=FIELD(Source No. Filter)".
    These SumIndexFields are attached to the key in the G/L Entry table: "Source Type,Source No.,G/L Account No.,Global Dimension 1 Code,Global Dimension 2 Code,Business Unit Code,Agreement No.,Posting Date".
    MaintainSQLIndex = MaintainSIFTIndex = Yes.
    SIFTLevelsToMaintain has the default values.
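    For context, each of those CALCFIELDS calls is executed as a CALCSUMS against G/L Entry. Conceptually it is equivalent to something like the following simplified sketch (the real FlowField also applies the dimension, business unit, source and agreement filters):

    // Simplified C/AL sketch of what CALCFIELDS("Balance at Date") does.
    // GLEntry is a record variable for table 17 "G/L Entry";
    // BalanceAtDate is a Decimal variable used here for illustration.
    GLEntry.RESET;
    GLEntry.SETCURRENTKEY("G/L Account No.","Posting Date");
    GLEntry.SETRANGE("G/L Account No.",GLAccount."No.");
    GLEntry.SETRANGE("Posting Date",0D,GLAccount.GETRANGEMAX("Date Filter"));
    // Served from the SIFT table only if the current key maintains a
    // matching SIFT level; otherwise SQL Server sums the rows itself.
    GLEntry.CALCSUMS(Amount);
    BalanceAtDate := GLEntry.Amount;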

    So, given your last reply, what exactly should I do?
    Nil desperandum
  • kine Member Posts: 12,562
    Key: Source Type,Source No.,G/L Account No.,...

    If you look at this key - it begins with an Option field (which has at most 5 values) - it means that if you filter on it, you select approx. 1/5 of all the data in this table (and this table can have millions of entries...). Try not maintaining this index (and its SIFT) on SQL Server, and instead create a new one beginning with something that has more distinct values (better selectivity). For example:
    Key: Source No.,Source Type,G/L Account No.,...

    Use the same fields in SumIndexFields and try it now...
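    As a sketch, the replacement key would look like this (assuming the same SumIndexFields as on the original key - and verify against your own data that "Source No." really is the most selective of the filtered fields):

    // Old key - begins with an Option field, poor selectivity:
    //   "Source Type,Source No.,G/L Account No.,..."
    //   -> set MaintainSQLIndex = No and MaintainSIFTIndex = No
    //      (stop maintaining it on SQL Server).
    //
    // New key - begins with a high-cardinality field:
    Key               : "Source No.,Source Type,G/L Account No.,Global Dimension 1 Code,Global Dimension 2 Code,Business Unit Code,Agreement No.,Posting Date"
    SumIndexFields    : (the same fields as on the old key)
    MaintainSQLIndex  : Yes
    MaintainSIFTIndex : Yes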
  • kriki Member, Moderator Posts: 9,118
    How many keys do you have on the G/L Entry table? And how many fields are in each key? Sometimes it is better not to maintain those keys but to create new keys with fewer fields in them - SQL Server prefers small keys.
    For SIFT fields you may need bigger keys, but maintain only the levels you really use.

    For the key
    Source No.,Source Type,G/L Account No.,...
    it is better to create an index
    Source No.,G/L Account No.,Source Type,...
    or
    G/L Account No.,Source No.,Source Type,...
    Check which field (Source No. or G/L Account No.) has the most distinct values and use that as the first field.

    Try to put the fields with a high number of different values at the beginning (so Options and Booleans go at the end). An exception is dates: these are best kept as the last field.
    Regards,Alain Krikilion
    No PM,please use the forum. || May the <SOLVED>-attribute be in your title!


  • DeSp Member Posts: 105
    Thanks for your assistance, kriki and kine.
    I've redesigned some SIFT keys in the G/L Entry table according to your advice - it seems to work a bit faster now.

    So is that all we can do to optimize performance here? Maybe there are some other tricks? In our case we can't upgrade the hardware any further, and it doesn't look like we have any hardware bottlenecks, so the only way is to make the code run faster.
    Nil desperandum
  • kine Member Posts: 12,562
    There is a whole document about performance (the Performance Troubleshooting Guide), and many posts here on Mibuso...
  • dsatria Member Posts: 80
    kriki wrote:
    Try to put the fields with a high number of different values at the beginning (so options, booleans at the end). An exception are dates, these are best kept as the last field.
    When determining how many distinct values field XXX has, should it be counted relative to the field in front of it (in the key)?
    For example:
    In my Warehouse Entry table, Variant Code alone has more distinct values than Lot No. does, BUT when we look on a per-item basis, Lot No. has more distinct values.
    In that case, which key has better selectivity: "Item No.,Lot No.,Variant Code" OR "Item No.,Variant Code,Lot No."?
  • kine Member Posts: 12,562
    The key with the best selectivity begins with "Lot No." - the other fields are not important.
    It means, in your case, "Lot No.,Item No.,Variant Code,..." is the best key...

    (By default, SQL Server uses just the first field of an index to calculate the selectivity - this can be changed through settings...)
  • kriki Member, Moderator Posts: 9,118
    dsatria wrote:
    In that case, which key has better selectivity: "Item No,Lot No,Variant Code" OR "Item No,Variant Code,Lot No" ??
    To create a good key, you need to answer some questions:
    1) How many records do you have in the table?
    2) If you have ONLY a filter on one "Item No.", what is the average number of records selected?
    3) If you have ONLY a filter on one "Lot No.", what is the average number of records selected?
    4) If you have ONLY a filter on one "Variant Code", what is the average number of records selected?
    5) Does it happen that the same "Variant Code" is used on multiple items?
    If, for example, you have respectively (for questions 2, 3 and 4) 1000, 100 and 300, the best would be to create a key on "Lot No." alone - it is not even necessary to add the other fields, since scanning 100 records is nothing for SQL Server.
    And like Kine said, starting with "Lot No." would be a good idea because if you know the lot, you know the item. But if you know the item you don't know the lot.
    A problem with the "Lot No." key is: if you ONLY have a filter on "Item No.", you don't have a key starting with "Item No.", so you would need to create a separate key for that.
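    Putting the two answers together, a sketch for the Warehouse Entry table could look like this (the record counts 1000/100/300 are the hypothetical figures from the questions above, not measured values):

    // Key 1 - for filters on "Lot No.": with ~100 records per lot,
    // no further fields are needed; scanning them is cheap.
    Key : "Lot No."
    // Key 2 - for the case where ONLY "Item No." is filtered:
    Key : "Item No.,Variant Code"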

