Say you are using 4 dimensions, which can roughly be considered the average, common case... and you post a shipment & invoice. It inserts 10 Item Ledger Entries and then 40 Ledger Entry Dimension records. Then 10 Value Entries and another 40 Ledger Entry Dimension records. A couple of G/L Entries and, guess what, 4 times a couple of Ledger Entry Dimension records. 10 Posted Shipment Lines, 40 Posted Document Dimension records. 10 Invoice Lines, 40 Posted Document Dimension records. And so on, and so on.
Wow... this surely must be a significant bottleneck: not only do these tables hold a lot more records than the others, they are also a prime target for blocking. I.e. if one client inserts a Value Entry and the other a Cust. Ledger Entry they won't block each other, but on the Ledger Entry Dimension table they will. Wow...
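To see whether that blocking actually happens rather than guessing, something like this quick check (against the SQL 2005+ dynamic management views; on SQL 2000, sp_who2 gives roughly the same picture) lists the sessions currently waiting on another session while two clients post at the same time:

-- List sessions that are currently blocked and what they are waiting on.
SELECT session_id, blocking_session_id, wait_type, wait_resource
FROM   sys.dm_exec_requests
WHERE  blocking_session_id <> 0;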
Restructuring the indexes and turning on Clustered surely helps reads, but I'm not sure it helps writes.
Putting these tables into a separate file, and that file onto a separate hard disk, would surely help, as disk writes could be parallelized.
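Something along these lines is what I mean (database name, file path and the table picked are just examples; the NAV clustered index has to be rebuilt onto the new filegroup to actually move the data):

-- Add a filegroup on a separate disk for the dimension tables.
ALTER DATABASE [NAVLive] ADD FILEGROUP [DIMDATA];
ALTER DATABASE [NAVLive]
    ADD FILE (NAME = N'NAVLive_Dim',
              FILENAME = N'E:\SQLData\NAVLive_Dim.ndf',
              SIZE = 1024MB, FILEGROWTH = 512MB)
    TO FILEGROUP [DIMDATA];

-- Rebuilding the clustered index on the new filegroup moves the table's rows.
CREATE UNIQUE CLUSTERED INDEX [CRONUS$Ledger Entry Dimension$0]
    ON [CRONUS$Ledger Entry Dimension] ([Table ID], [Entry No_], [Dimension Code])
    WITH DROP_EXISTING
    ON [DIMDATA];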
Another idea: each month restoring an SQL backup to serve as an analysis database, updating the analysis views there, and deleting the Posted Document Dimension, Ledger Entry Dimension and Production Document Dimension (where status = Finished) records from the live database could also help.
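On the SQL side that part is just a restore (names and paths made up; which records can safely be deleted from live afterwards is the NAV-side question):

-- Restore last month's backup as a separate analysis database.
RESTORE DATABASE [NAVAnalysis]
FROM DISK = N'E:\Backup\NAVLive.bak'
WITH MOVE N'NAVLive_Data' TO N'E:\SQLData\NAVAnalysis.mdf',
     MOVE N'NAVLive_Log'  TO N'E:\SQLData\NAVAnalysis.ldf',
     REPLACE;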
Also, I'm not sure whether MS SQL supports updateable views. If it does, what if you put the dims for every kind of ledger into a separate table and replaced the Ledger Entry Dimension table with an updateable view?
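If something like SQL's partitioned views would do the job, the idea would look roughly like this (made-up table names, only two members shown; a UNION ALL view is updateable when the partitioning column, here Table ID, is part of each member's primary key and carries non-overlapping CHECK constraints; 17 and 5802 are the NAV table numbers for G/L Entry and Value Entry):

-- One member table per ledger; the CHECK constraint on Table ID plus the
-- primary key that includes it are what make the UNION ALL view updateable.
CREATE TABLE [LE Dim GL] (
    [Table ID]             int NOT NULL CHECK ([Table ID] = 17),
    [Entry No_]            int NOT NULL,
    [Dimension Code]       varchar(20) NOT NULL,
    [Dimension Value Code] varchar(20) NOT NULL,
    CONSTRAINT [LE Dim GL$0] PRIMARY KEY ([Table ID], [Entry No_], [Dimension Code])
);
CREATE TABLE [LE Dim VE] (
    [Table ID]             int NOT NULL CHECK ([Table ID] = 5802),
    [Entry No_]            int NOT NULL,
    [Dimension Code]       varchar(20) NOT NULL,
    [Dimension Value Code] varchar(20) NOT NULL,
    CONSTRAINT [LE Dim VE$0] PRIMARY KEY ([Table ID], [Entry No_], [Dimension Code])
);
GO
-- The view stands in for the Ledger Entry Dimension table; inserts through it
-- land in the right member table based on Table ID.
CREATE VIEW [CRONUS$Ledger Entry Dimension] AS
SELECT [Table ID], [Entry No_], [Dimension Code], [Dimension Value Code] FROM [LE Dim GL]
UNION ALL
SELECT [Table ID], [Entry No_], [Dimension Code], [Dimension Value Code] FROM [LE Dim VE];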
What do you think?
Mainly if you imagine the "infinite dimensions" that the salesmen are selling...
MVP - Dynamics NAV
My BLOG
NAVERTICA a.s.
You can get rid of these tables.
For analysis, you can still build analysis views based on this Unique Identifier.
For example:
You have 4 dimensions.
In G/L you'll have a field called Unique Identifier, and it would be populated based on the dimension combination. If that combination doesn't exist yet, insert a new record.
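Something like this on the SQL side, just to illustrate the idea (table, column and dimension names are made up; in NAV this would of course be C/AL code in the posting routines):

-- Hypothetical lookup table: one row per distinct dimension combination.
CREATE TABLE [Dim Combination] (
    [Combination ID] int IDENTITY(1,1) PRIMARY KEY,
    [Department]     varchar(20) NOT NULL,
    [Project]        varchar(20) NOT NULL,
    [Area]           varchar(20) NOT NULL,
    [Campaign]       varchar(20) NOT NULL,
    CONSTRAINT [UQ_Dim_Combination] UNIQUE ([Department], [Project], [Area], [Campaign])
);

-- Find-or-insert: get the ID for a combination, creating it only if it is new,
-- then stamp that single ID on the G/L Entry instead of 4 dimension records.
DECLARE @id int;
SELECT @id = [Combination ID]
FROM   [Dim Combination]
WHERE  [Department] = 'SALES' AND [Project] = 'TOYOTA'
  AND  [Area] = 'EU' AND [Campaign] = 'SUMMER';
IF @id IS NULL
BEGIN
    INSERT INTO [Dim Combination] ([Department], [Project], [Area], [Campaign])
    VALUES ('SALES', 'TOYOTA', 'EU', 'SUMMER');
    SET @id = SCOPE_IDENTITY();
END;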
Independent Consultant/Developer
blog: https://dynamicsuser.net/nav/b/ara3n
Then for reporting I'd think about something more robust than analysis views altogether. Don't even think about making a copy of the database every month; use SQL Server tools for getting the right information.
RIS Plus, LLC
They also made it much more difficult to fix dimension problems.
NAV already has a codeunit in place for posting custom fields - all you have to do as a developer is add the fields to the codeunit functions.
http://mibuso.com/blogs/davidmachanick/
Posted Doc Dims are only used for:
- Get Shipment/Receipt Lines
- Copy Document
- by customizing these to pick the dims up from master data they can be commented out
- Ledger Entry Dims for Customer/Vendor incl. Detailed ones are not used at all, can be commented out
- for Item Ledger they are only used for Phys. Inv. - that's not important, can be commented out
- for Value Entries they are important (posting to the G/L and also Analysis Reports)
- also they are important for the G/L (Accounting Schedules)
but even for Value Entries, posting the two Global Dims can be taken out and a batch job written instead to populate them (for example during the night, with the application server, or by running it manually before going home).
Heresy: it could even be an SQL job. Yes, I know, not recommended. But what can go wrong with copying fields from one table to another one that has no SIFT or trigger coding?
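For example, the nightly step could be nothing more than inserting the missing rows from the Global Dimension fields that posting already stores on the Value Entry (company and dimension names are placeholders; 5802 is the Value Entry table number, and the same pattern repeats for Global Dimension 2):

-- Back-fill Ledger Entry Dimension rows for Value Entries from the
-- Global Dimension 1 Code field already stored on the entry itself.
INSERT INTO [CRONUS$Ledger Entry Dimension]
       ([Table ID], [Entry No_], [Dimension Code], [Dimension Value Code])
SELECT 5802, ve.[Entry No_], 'DEPARTMENT', ve.[Global Dimension 1 Code]
FROM   [CRONUS$Value Entry] ve
WHERE  ve.[Global Dimension 1 Code] <> ''
  AND  NOT EXISTS (SELECT 1
                   FROM   [CRONUS$Ledger Entry Dimension] led
                   WHERE  led.[Table ID] = 5802
                     AND  led.[Entry No_] = ve.[Entry No_]
                     AND  led.[Dimension Code] = 'DEPARTMENT');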
Denster,
you are exactly right: the main benefit of Dimensions is Analysis by Dimensions, which can be done much better via OLAP, and Accounting Schedules, which could rather be set up as separate G/L accounts (e.g. National Bicycle Sales Account, EU Bicycle Accessories Sales Account, etc.).
However, this decision must be made at the project planning phase; clients who are already live expect cheap, possibly free, solutions to performance problems, so I think the above can work: it's less than a day's work and can fairly easily be moved from one installation to another.
"NAV already has a codeunit in place for posting custom fields" - I must have missed it: which one?
Otherwise, agreed, but it means a different G/L structure and SQL Analysis Services, so it suits new projects better than existing clients.
CU10201 - Transfer Custom Fields?
http://www.BiloBeauty.com
http://www.autismspeaks.org
IMHO the performance impact is not as big as it looks; at least the performance loss due to many SQL inserts is not a big problem, mainly because:
1. the posted document and ledger entry dimension tables have relatively small records (many records fit on a single database page),
2. there are no SIFTs defined on any of the posted document or ledger entry dimension tables.
Of course it might become a problem if the tables are not properly maintained at the SQL level (I once saw a 15-second insert into T355 in one installation!), but virtually anything can cause problems without proper maintenance.
Normal database housekeeping (rebuilding indexes with a customized fill factor) is enough to keep SQL INSERT performance at a reasonable level even with a few dozen million records in there.
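On SQL 2005 and later that housekeeping can be as simple as the statement below (the table name and the 90 are just an example; on SQL 2000, DBCC DBREINDEX does the same job):

-- Rebuild all indexes on one of the big dimension tables, leaving some free
-- space on each page so inserts between rebuilds cause fewer page splits.
ALTER INDEX ALL ON [CRONUS$Ledger Entry Dimension]
REBUILD WITH (FILLFACTOR = 90);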
Regards,
Slawek
Dynamics NAV, MS SQL Server, Wherescape RED;
PRINCE2 Practitioner - License GR657010572SG
GDPR Certified Data Protection Officer - PECB License DPCDPO1025070-2018-03
Lucky us in the USA...
http://mibuso.com/blogs/davidmachanick/
BTW - did you notice that in practice there is a 10-dimension limit? You can assign more than 10 default dimensions, but only 10 are propagated.
Dynamics NAV, MS SQL Server, Wherescape RED;
PRINCE2 Practitioner - License GR657010572SG
GDPR Certified Data Protection Officer - PECB License DPCDPO1025070-2018-03
The issue is the account schedule reports that use analysis views, and writing those reports without hardcoding won't be easy.
The solution I've written is the least intrusive and fairly easy to implement.
Independent Consultant/Developer
blog: https://dynamicsuser.net/nav/b/ara3n