General Ledger - Calculate Detailed Balance works too slow

DeSp
Member Posts: 105
We use Navision 3.70A SQL Server Option.
On the G/L Corresp. General Ledger form we have a checkbox called Calculate Detailed Balance. This calculation runs very slowly - far too slowly. It is standard code and we haven't made any customizations there.
I tried to play with the SIFTLevelsToMaintain property of the keys in the G/L Entry table, but without any significant result.
So what is the cure? How can we boost the performance here?
Any suggestions would be very much appreciated.
Thanks in advance.
Nil desperandum
Comments
-
1) You can use Client Monitor and the Extended Client Monitor objects to find what is slow (a badly chosen key, cache not used correctly, full table scans, etc.).
2) You can monitor the hardware to see if there are any bottlenecks.
-
I use the Client Monitor toolset. Client Monitor (Key Usage) and Client Monitor (Cache Usage) show no records - so key and cache usage are all right there, aren't they? Or have I missed something?
As for hardware monitoring, we have some performance counters - which of them should we pay close attention to in this case?
Turning off some SIFT levels of some SIFT keys gives a slight performance improvement, but it is not enough. Maybe there is some kind of SIFT-level tuning for this task?
I should say again that this is standard code, so it ought to work fast, but it doesn't.
-
DeSp wrote:
I use the Client Monitor toolset. Client Monitor (Key Usage) and Client Monitor (Cache Usage) show no records - so key and cache usage are all right there, aren't they? Or have I missed something? As for hardware monitoring, we have some performance counters - which of them should we pay close attention to in this case? I should say again that this is standard code, so it ought to work fast, but it doesn't.

That it is standard code is not enough to make it fast code. Navision standard code is not optimal.
-
You need to collect the data through the standard Client Monitor and after that run the Extended Client Monitor form - it will transform the data from Client Monitor into the tables of this Extended CM (you can check that the tables contain some data).
I do collect the data through the standard Client Monitor and then run the Client Monitor form (from Client Monitor.fob) - it transforms the Client Monitor data into the tables. Then I run the Client Monitor Key Usage and Cache Usage forms - they show no records, so I conclude that everything is fine with key and cache usage there.
This code mainly calculates flowfields - maybe there is a way to tune the SIFT tables for better CALCFIELDS performance somehow?
-
With the Extended Client Monitor I meant the objects in the .fob you wrote about... I wanted to be sure that there is some data when you open the Client Monitor form from this .fob - there you can see the elapsed time and find which actions take most of the time. A good approach is to export the data, import it into Excel and use a pivot table to show the total time for each table or action, to see where the bottleneck is.
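The pivot step described above can also be done in a few lines of script. A minimal sketch, assuming the Client Monitor export has been reduced to (table, action, elapsed-ms) tuples - the sample rows here are invented, not real trace data:

```python
from collections import defaultdict

# Hypothetical rows from a Client Monitor export: (table, action, elapsed_ms)
rows = [
    ("G/L Entry", "CALCSUMS", 450),
    ("G/L Entry", "FIND/NEXT", 30),
    ("G/L Account", "CALCFIELDS", 120),
    ("G/L Entry", "CALCSUMS", 380),
]

# Total elapsed time per (table, action) - the same view a pivot table gives
totals = defaultdict(int)
for table, action, ms in rows:
    totals[(table, action)] += ms

# Sort descending so the biggest bottleneck comes first
for (table, action), ms in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{table:12s} {action:10s} {ms} ms")
```

With real data, the first line printed is the table/action pair to attack first.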
-
G/L Entry is the bottleneck table, with the CALCSUMS action. This action's elapsed time reaches hundreds of milliseconds - it is too slow. How can we improve it?
-
You can enable SumIndexFields for the calculated fields on an appropriate key. Select a key that has good selectivity for the filter you are using, and on that key use SIFT to precalculate the sums (on the base table from which you are calculating the sum).
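The reason SIFT precalculation helps can be shown with a toy model (this is Python, not C/AL, and the account numbers and amounts are invented): CALCSUMS without a maintained SIFT bucket must scan every matching entry, while with SIFT the sum is maintained at posting time and the calculation becomes a lookup:

```python
from collections import defaultdict

# Toy G/L entries: (gl_account_no, amount)
entries = [("1000", 10.0), ("1000", 25.0), ("2000", 5.0), ("1000", -7.5)]

# Without SIFT: every CALCSUMS scans all entries matching the filter
def calcsums_scan(account_no):
    return sum(a for acc, a in entries if acc == account_no)

# With SIFT: sums are maintained per key value (incrementally on insert,
# in reality), so CALCSUMS becomes a single lookup instead of a scan
sift = defaultdict(float)
for acc, a in entries:
    sift[acc] += a

def calcsums_sift(account_no):
    return sift[account_no]

# Both give the same answer; only the cost differs
assert calcsums_scan("1000") == calcsums_sift("1000")
```

The cost of the lookup is independent of the number of entries, which is why the key (and its SIFT levels) must match the filters the flowfields actually use.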
-
Thank you for your assistance, kine!
Ok, here is the scope:
This code runs from form 12403 G/L Corresp. General Ledger when we check the Calculate Detailed Balance checkbox (it works very slowly - the issue):
CalculateBeginPeriodAmount()
GLAccount.CALCFIELDS("Balance at Date","Add.-Currency Balance at Date","Credit Balance at Date","Debit Balance at Date");
.......
GLAccount.CALCFIELDS("Credit Balance at Date","Debit Balance at Date");
.......
These flowfields have the appropriate SumIndexFields in the G/L Entry table and a Table Filter: "G/L Account No.=FIELD(No.),G/L Account No.=FIELD(FILTER(Totaling)),Business Unit Code=FIELD(Business Unit Filter),Global Dimension 1 Code=FIELD(Global Dimension 1 Filter),Global Dimension 2 Code=FIELD(Global Dimension 2 Filter),Posting Date=FIELD(UPPERLIMIT(Date Filter)),Agreement No.=FIELD(Agreement Filter),Source Type=FIELD(Source Type Filter),Source No.=FIELD(Source No. Filter)".
These SumIndexFields are attached to the key in the G/L Entry table: "Source Type,Source No.,G/L Account No.,Global Dimension 1 Code,Global Dimension 2 Code,Business Unit Code,Agreement No.,Posting Date".
MaintainSQLIndex = MaintainSIFTIndex = Yes.
SIFTLevelsToMaintain has the default values.
So what should I do, given your last reply?
-
Key: Source Type,Source No.,G/L Account No.,...
If you look at this key - it begins with an Option field (which has at most 5 values) - it means that if you filter on it, you select approx. 1/5 of all the data in this table (and this table can have millions of entries...). Try not to maintain this index (and its SIFT) on SQL and create a new one instead, beginning with something that has more values (better selectivity). For example:
Key: Source No.,Source Type,G/L Account No.,...
Use the same fields in SumIndexFields and try it now...
-
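The selectivity argument above can be demonstrated with a quick simulation (the row count and cardinalities are invented, chosen only to resemble the G/L Entry key being discussed):

```python
import random
random.seed(1)  # fixed seed so the simulation is repeatable

# Toy key values: Source Type has only a handful of distinct values,
# Source No. has many - mimicking the G/L Entry key above
N = 10_000
source_types = [random.randrange(5) for _ in range(N)]      # ~5 distinct
source_nos = [random.randrange(2_000) for _ in range(N)]    # ~2000 distinct

# Fraction of the table that a filter on a single value selects
frac_type = sum(1 for v in source_types if v == 0) / N
frac_no = sum(1 for v in source_nos if v == 0) / N

print(f"filter on Source Type keeps ~{frac_type:.0%} of rows")
print(f"filter on Source No. keeps ~{frac_no:.2%} of rows")
```

A filter on the low-cardinality Option field still leaves roughly a fifth of the table to scan, while a filter on the high-cardinality field leaves only a handful of rows - which is why the index should lead with the selective field.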
How many keys do you have on the G/L Entry table? And how many fields in each key? Sometimes it is better not to maintain those keys but to create new keys with fewer fields in them. SQL Server prefers to use small keys.
For SIFT fields you may need bigger keys, but maintain only the levels you really use.
For the key, consider:
Source No.,Source Type,G/L Account No.,...
Source No.,G/L Account No.,Source Type,...
G/L Account No.,Source No.,Source Type,...
Try to put the fields with a high number of different values at the beginning (so Options and Booleans at the end). An exception is dates; these are best kept as the last field.
Regards, Alain Krikilion
No PM, please use the forum. || May the <SOLVED>-attribute be in your title!
-
Thanks for your assistance, kriki and kine.
I've redesigned some SIFT keys in the G/L Entry table according to your advice - it seems to work a bit faster now.
So is that all we can do to optimize performance here? Maybe there are some other tricks? In our case we can't upgrade the hardware any more, and it doesn't look like we have any bottlenecks there, so the only way is to make the code run faster.
-
There is whole documentation about performance (the Performance Troubleshooting Guide). And many posts on Mibuso...
-
kriki wrote:
Try to put the fields with a high number of different values at the beginning (so Options, Booleans at the end). An exception is dates; these are best kept as the last field.
For example:
In my Warehouse Entry table, Variant Code alone has more distinct values than Lot No. does, BUT when we look at it on a per-item basis, Lot No. has more distinct values.
In that case, which key has better selectivity: "Item No.,Lot No.,Variant Code" or "Item No.,Variant Code,Lot No."?
-
The best-selectivity key begins with "Lot No."... the other fields are not important.
It means that, in your case, "Lot No.,Item No.,Variant Code..." is the best key...
(By default, SQL Server uses just the first field of an index to calculate the selectivity - this can be changed through settings...)
-
dsatria wrote:
In that case, which key has better selectivity: "Item No.,Lot No.,Variant Code" or "Item No.,Variant Code,Lot No."?
1) How many records do you have in the table?
2) If you have ONLY a filter on one "Item No.", what is the average number of records selected?
3) If you have ONLY a filter on one "Lot No.", what is the average number of records selected?
4) If you have ONLY a filter on one "Variant Code", what is the average number of records selected?
5) Does it happen that the same "Variant Code" is used on multiple items?
If, for example, you have respectively (for questions 2, 3 and 4) 1000, 100 and 300 records, the best would be to create a key on "Lot No.", and it is not even necessary to add the other fields. Scanning 100 records is nothing for SQL Server.
And as Kine said, starting with "Lot No." is a good idea because if you know the lot, you know the item; but if you know the item, you don't know the lot.
A problem with the "Lot No." key is: if you have ONLY a filter on "Item No.", you don't have a key starting with "Item No.", so you would need to create a key for that.
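The rule of thumb above can be written down directly. A small sketch using the example counts from the post (1000, 100, 300 - these numbers come from the reply, not from a real database):

```python
# Average number of records selected when filtering on one value of each
# field - the example numbers from the reply above
avg_selected = {"Item No.": 1000, "Lot No.": 100, "Variant Code": 300}

# The best leading key field is the one whose single-value filter leaves
# the fewest records to scan
best_field = min(avg_selected, key=avg_selected.get)

# Scanning ~100 records is cheap for SQL Server, so a key on this field
# alone may already be enough
print(best_field)
```

Measure the averages on your own data first; the whole method stands or falls on those counts.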