The out-of-the-box Aged Receivables report (10040) takes almost 1.5 hours to run in both the RTC and Classic versions, aged as of today, aged by due date. We have 6,900 customers with balances and 2.3 million customer ledger records. Those 2.3 million records cover about two years of history, mostly closed entries, plus of course the current open invoices and credits. The Simple Aging report (109) runs in about 2 minutes, but we still need to run the detail, so 10040 is required.

I am told we are running on a server with 98 GB of memory and two 2.136 GHz processors, each with 16 cores. I am no expert on server specs, but my IT guy tells me it is a monster.

Are there any fixes or recommendations to make 10040 run faster? We have not even gone live yet; we are just keeping the system up to date with converted history for testing purposes. All history is loaded and posted as normal sales invoices and credits, and payments are also loaded and posted through a cash receipts journal. We fear what will happen when we keep adding months of data.
Ed
Comments
If most of your customers do not have an outstanding balance every report cycle, you can speed up the report by using a flowfield to check whether the customer has a balance, and bypassing the customers with a zero balance.
The downside is that a customer with offsetting debits and credits will not appear on the report, but there are other ways to check for that.
This could make the report run in a quarter of the time, with very little development work.
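A minimal C/AL sketch of that idea, assuming the check goes in the report's Customer - OnAfterGetRecord trigger. "Net Change (LCY)" is the standard Customer flowfield that respects the Date Filter; AgingDate stands in for whatever as-of date the report uses:

```
// Skip customers with no outstanding balance before any detail work is done.
// With a Date Filter of 0D..AgingDate, "Net Change (LCY)" is the balance
// as of the aging date. AgingDate is an illustrative variable name.
Customer.SETRANGE("Date Filter", 0D, AgingDate);
Customer.CALCFIELDS("Net Change (LCY)");
IF Customer."Net Change (LCY)" = 0 THEN
  CurrReport.SKIP;
```

As noted above, a customer whose open debits and credits net to zero would be skipped too, so you may want an extra check for open entries before skipping.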
If that is still not good enough for you, you can do the report in SQL instead - but that is not as easy.
http://mibuso.com/blogs/davidmachanick/
I have been working with 10040 for 10 years, from 2.6 to 2009 R2. I am used to maybe 50,000,000 item ledger and value entry records, but this is my first installation with 2,000,000 customer ledger records and the detailed customer ledger entries to go with them. If we have to, we can live without the detail and base something on the Simple Aging report. We would just add some more buckets and customize the date ranges for the buckets.
Since you are inserting all the customer ledger entries with remaining balances, this will speed it up some.
You could also run this report on the server since it is reading so many records.
It reads every single customer ledger entry record so that it can do a CALCFIELDS on the remaining amount after applying the date filter, so any workaround has to take that into account.
You could also put a filter on the Closed at Date field to eliminate the older entries.
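A C/AL sketch of that filter, applied before the report loops over the ledger. AgingDate is a stand-in for the report's as-of date, and open entries have a blank (0D) Closed at Date:

```
// Entries fully closed on or before the aging date had no remaining amount
// at that date, so they can be excluded up front. 0D keeps the open entries.
CustLedgEntry.SETFILTER("Closed at Date", '%1|>%2', 0D, AgingDate);
```

The caveat raised later in this thread still applies: NAV does not set Closed at Date on every entry involved in an application, so compare the filtered totals against the unfiltered report before trusting it.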
http://mibuso.com/blogs/davidmachanick/
I don't see how this helps in any significant way, if at all. What's your thinking here? Maybe I'm not seeing something.
The temp tables are local to each user, so I am not sure that reducing keys will have an impact on them. Reducing the record size in the cloned table would possibly have a bigger impact.
My timings on 5.0 were a speed up of a factor of 4.
Since this is a newer release that uses managed views, it may require some different tweaks.
http://mibuso.com/blogs/davidmachanick/
I found it increased performance.
With the service tier it gets written to the service tier's NAV temp directory.
It is a simple change and worth the effort when there is a big performance problem.
With large customers like Walmart, the temp table is probably being written to disk.
If the NAV temp directory is set to an SSD drive, that may increase performance.
http://mibuso.com/blogs/davidmachanick/
AP Commerce, Inc. = where I work
Getting Started with Dynamics NAV 2013 Application Development = my book
Implementing Microsoft Dynamics NAV - 3rd Edition = my 2nd book
It is still possible to backdate this report if you incorporate the Detailed Cust. Ledger Entry table.
Honestly, I don't know why Microsoft wouldn't use the Detailed Cust. Ledger Entry table instead of the temp table method. It seems a lot faster and more logical.
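A rough C/AL sketch of that approach, letting the "Remaining Amt. (LCY)" flowfield (which sums the Detailed Cust. Ledg. Entry records under the date filter) do the work. AgingDate and AddToBucket are illustrative names, not standard objects:

```
// Remaining amount as of AgingDate comes straight from the flowfield,
// so no temp-table pass over every record is needed.
CustLedgEntry.SETRANGE("Posting Date", 0D, AgingDate);
CustLedgEntry.SETRANGE("Date Filter", 0D, AgingDate);
IF CustLedgEntry.FINDSET THEN
  REPEAT
    CustLedgEntry.CALCFIELDS("Remaining Amt. (LCY)");
    IF CustLedgEntry."Remaining Amt. (LCY)" <> 0 THEN
      AddToBucket(CustLedgEntry); // hypothetical helper: bucket by Due Date
  UNTIL CustLedgEntry.NEXT = 0;
```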
AP Commerce, Inc. = where I work
Getting Started with Dynamics NAV 2013 Application Development = my book
Implementing Microsoft Dynamics NAV - 3rd Edition = my 2nd book
There was a post a few years ago about writing Excel by row instead of by cell.
http://www.mibuso.com/forum/viewtopic.php?t=26657
If you want to speed up the write to Excel, you could try this out.
On the aging: if NAV updated the closed date for all ledger entries, you would be able to use that. If you are ambitious, you could add a mod to set the closed date on the closing entries as well. Or, if you don't want to mess with standard NAV logic, add another field to be used for this purpose.
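A sketch of the extra-field variant, assuming a custom field (here called "Full Closed Date", an illustrative name) added to the Cust. Ledger Entry table and maintained by a periodic batch job, so standard posting logic stays untouched:

```
// For each closed entry not yet stamped, take the posting date of its last
// application from Detailed Cust. Ledg. Entry as the effective closed date.
CustLedgEntry.SETRANGE(Open, FALSE);
CustLedgEntry.SETRANGE("Full Closed Date", 0D);
IF CustLedgEntry.FINDSET(TRUE) THEN
  REPEAT
    DtldCustLedgEntry.SETRANGE("Cust. Ledger Entry No.", CustLedgEntry."Entry No.");
    IF DtldCustLedgEntry.FINDLAST THEN BEGIN
      CustLedgEntry."Full Closed Date" := DtldCustLedgEntry."Posting Date";
      CustLedgEntry.MODIFY;
    END;
  UNTIL CustLedgEntry.NEXT = 0;
```

The aging report could then filter on "Full Closed Date" the same way it would on "Closed at Date".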
http://mibuso.com/blogs/davidmachanick/
Apparently the temp table has a limit of 2 GB, which means it could all stay in memory if your workstation has enough RAM. (I don't know how sophisticated the temp table logic in NAV is, but I hope it uses all available memory before writing to disk.)
I had one program that made repeated use of a temp table. Moving it from my reasonably high-performance Windows 8 laptop (8 GB RAM) to my much higher-performance desktop (32 GB RAM, SSD O/S drive, SQL database on a 10K RPM drive versus the laptop's 7.5K), performance tripled.
I then tried running a large detailed A/R aging report that used to take a couple of hours in an old environment; it took 20 minutes, and my original speed-improvement code made only a very minor difference when I ran my "fast" version.
Now, in my environment everything is local - no network traffic issues, and no competition from other users.
It would be interesting to see the times for running your huge A/R report directly on the monster SQL Server and also on a monster PC running NAV retrieving data from your SQL Server.
http://mibuso.com/blogs/davidmachanick/