Hello, we have a 700 GB Dynamics NAV 2009 R2 database, running on SQL Server of course.
There are 80+ million records in "Value Entry", "Item Ledger Entry", and related tables.
My problem is that running the "Adjust Cost - Item Entries" batch takes too long to finish (16 hours) and blocks everything on the system, so we have to run it during the weekend.
Question 1: Any ideas on how to make this batch run faster?
Question 2: Has anyone else reached this database size in Dynamics NAV?
Thank you
Comments
Without more details I can't really offer any specific solutions, other than to suggest you search the performance-related postings on this site and then post more specific details and questions about your situation. From my experience with a large database (though not quite 700 GB), 16 hours to run Adjust Cost just seems way out of whack. But again, that's just a general observation based only on the known information.
The details of the large database to which I refer:
Size: 520 GB used (645 GB total data file size)
Item Ledger Entry: 11 million records
Value Entry: 137 million records
GL Entry: 264 million records
Ledger Entry Dimensions: 1.7 Billion records
Typical Adjust Cost runtime = 20 minutes (run daily)
In terms of what's been done, I'd have to say nothing out of the ordinary. By that I mean nothing that is not discussed on this site many times, or in sources referenced on this site. I'm sure this is not the only example of a large database out there.
Are you using manufacturing?
Independent Consultant/Developer
blog: https://dynamicsuser.net/nav/b/ara3n
I have a database with "only" 40+ million records in "Value Entry". Over the years I have had to make a few changes and add-ons to the inventory adjustment.
One, for example, is a batch report based on the Item table (see the sketch below).
Some others are:
Walk the application path and check the adjustment
Reopen all applications
Application batch
Repair the cost amount by walking the application path where the standard adjustment does not do the job
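For illustration only (this is not Gunnar's actual code): a per-item batch needs a worklist, and in NAV 2009 the Item table carries a standard "Cost is Adjusted" flag that can supply one. A minimal T-SQL sketch, with CRONUS as a placeholder company name:

    SELECT [No_], [Description]
    FROM [CRONUS$Item]
    WHERE [Cost is Adjusted] = 0   -- items still waiting for cost adjustment
    ORDER BY [No_];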
Gunnar Gestsson
Microsoft Certified IT Professional
Dynamics NAV MVP
http://www.dynamics.is
http://Objects4NAV.com
My DB is 551 GB used, with a total data file size of 703 GB.
Item Ledger Entry: 55 million records
Value Entry: 86 million records
GL Entry: 34 million records
Ledger Entry Dimensions: 303 million records
Item table: 98,000 records, of which only 40,000 are active
Costing method: FIFO
Would you please share the hardware specifications you are using: RAM, how many disks and files for SQL, how many CPUs...
Thanks!
Not using SKUs, and not using Manufacturing.
We are using the LS Retail add-on, which is where the sales entries come from...
Version of SQL Server?
RAM assigned to SQL Server?
Disk drive configuration?
Have you tuned your indexes?
Have you run Resource Monitor while this process is running to see which resources are most heavily used? (See the wait-stats sketch after this list.)
Are you running this task on the SQL Server itself?
If not, what is your network traffic load?
How many active items do you have?
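To make the resource question concrete: besides Resource Monitor on the O/S side, SQL Server's own view of what the batch is doing can be read from the DMVs. A small sketch; run it in a second session while Adjust Cost is executing:

    -- What each active request is doing and waiting on right now
    SELECT r.session_id, r.status, r.command, r.wait_type, r.wait_time,
           r.cpu_time, r.logical_reads, r.blocking_session_id
    FROM sys.dm_exec_requests r
    WHERE r.session_id > 50;   -- skip system sessions

The wait_type column tells you whether the run is bound by disk (PAGEIOLATCH_*), log writes (WRITELOG), or blocking (LCK_*), which narrows down where to spend tuning effort.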
http://mibuso.com/blogs/davidmachanick/
Processors: (4) Quad-Core AMD 3.10 GHz
Memory: 192 GB (the server supports 2 SQL instances; SQL max memory is set to 90 GB on each)
Windows O/S: Windows 2008 R2 Enterprise
SQL: SQL 2008 Enterprise
Disk arrays (all of these are located on the SAN; I don't list the O/S array, which is on the server itself):
Array 1: 4x146 GB RAID 10 SQL System databases
Array 2: 14x146 GB RAID 10 SQL Data 1
Array 3: 14x146 GB RAID 10 SQL Data 2
Array 4: 14x146 GB RAID 10 SQL Data 3
Array 5: 8x146 GB RAID 10 SQL Transaction log (live DB only)
Array 6: 4x146 GB RAID 10 SQL Transaction log (others)
Array 7: 5x450 GB RAID 5 Backups
The NAV database has 3 files in "Data File Group 1", each on a separate array (2, 3, 4). Those arrays also host databases for the other SQL instance, although those see light to moderate use.
There is also a second identical server to which the live DB is mirrored.
Don't look at this and think hardware is the whole solution. It's just part of the puzzle. And yes, this config might be considered a bit extreme in some ways. But that's another topic. We've done some index tuning, run index and statistics updates daily. We pay close attention to the performance impact when designing modifications. And we have used ADO in a few areas to help boost performance.
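As a side note on the 90 GB memory cap mentioned above: that is a per-instance setting, applied roughly like this (92160 MB = 90 GB; the value here just mirrors the setup described above, it is not a recommendation):

    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    -- cap this instance so the O/S and the second instance keep headroom
    EXEC sp_configure 'max server memory (MB)', 92160;
    RECONFIGURE;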
ArcherPoint, Inc http://www.archerpoint.com
Blog: http://www.navisionworld.com
One problem I ran into with NAV and SQL Server turned out to be a filter that covered 254,000 rows.
It built a huge tempdb, and in this case it never finished; I had to kill the SQL process. I don't think it ever got beyond the initial find('-').
That is why you should look at all the performance tools and see what is going on.
It could be something as simple as a bad index causing SQL Server to perform table scans for each entry processed.
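One quick way to check for that from the SQL side is the missing-index DMVs, which rank the indexes the optimizer wished it had. Treat the output as hints rather than gospel, especially on a NAV database where indexes are normally managed through table keys:

    SELECT TOP 10
           mid.statement AS [table],
           mid.equality_columns, mid.inequality_columns, mid.included_columns,
           migs.user_seeks, migs.avg_user_impact
    FROM sys.dm_db_missing_index_details mid
    JOIN sys.dm_db_missing_index_groups mig
      ON mig.index_handle = mid.index_handle
    JOIN sys.dm_db_missing_index_group_stats migs
      ON migs.group_handle = mig.index_group_handle
    ORDER BY migs.user_seeks * migs.avg_user_impact DESC;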
http://mibuso.com/blogs/davidmachanick/
Speaking of bad indexes: one of the places we tuned the system I mentioned was the part where it retrieves the "Value Entries" for a specific ILE to determine the current posted cost and calculate the adjustment. The indexes were just wrong for such a large table.
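To sketch the idea (not our exact change): the adjustment reads Value Entries by "Item Ledger Entry No.", so that lookup wants a covering index, something like the following. CRONUS is a placeholder company name and the INCLUDE columns are illustrative. In NAV 2009 you would normally define this as a table key with MaintainSQLIndex enabled rather than creating it directly in SQL, since raw SQL indexes can be dropped when the table object is redesigned:

    CREATE NONCLUSTERED INDEX IX_ValueEntry_ILENo
    ON [CRONUS$Value Entry] ([Item Ledger Entry No_])
    INCLUDE ([Cost Amount (Actual)], [Cost Amount (Expected)],
             [Invoiced Quantity], [Valued Quantity]);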
The hardware is:
64 GB RAM (upgrading soon to 126 GB)
5 disks and 5 files for SQL
Running the batch on the SQL Server machine itself
40,000 active items
Can you please explain how to use Resource Monitor while this process is running to see which resources are most heavily used?
1 array of 5 disks, RAID 5, 300 GB each, 15K
1 array of 4 disks, RAID 5, 600 GB each, 10K
O/S on a different hard drive
You can also run perfmon.
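If perfmon itself is unfamiliar, note that SQL Server exposes the same counters through a DMV, so a quick spot-check can be done in T-SQL. A small sketch:

    -- Same counters perfmon shows under the SQLServer: objects
    SELECT object_name, counter_name, cntr_value
    FROM sys.dm_os_performance_counters
    WHERE counter_name IN ('Page life expectancy',
                           'Batch Requests/sec',
                           'Lock Waits/sec');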
Jorg Stryk has a book called "The NAV/SQL Performance Guide" - you can see if that is still available.
There are a variety of SQL monitoring and performance tools that you can find with the help of Google.
http://mibuso.com/blogs/davidmachanick/
This sounds incredibly difficult. Adjust Cost is very difficult to understand, even if you understand FIFO logic and the whole cost-forwarding principle. 90% of the people I know are baffled by the whole idea; they are used to doing FIFO backward at year end, not forward. There are so many exceptions and special cases in that codeunit that I don't think it is really possible for anyone to fully understand it.
Frankly, in this case I would just go with standard costing and make one backwards-FIFO report at year end (i.e., take current stock and match it backwards in time to purchase invoices; easy) to determine the closing stock value, and book that manually...
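To make the backwards-FIFO idea concrete, here is a hypothetical T-SQL sketch. CRONUS is a placeholder company name, entry type 0 is Purchase, the entry number is used as a proxy for posting order, and non-purchase inflows are ignored for simplicity:

    ;WITH OnHand AS (
        SELECT [Item No_], SUM(Quantity) AS Qty
        FROM [CRONUS$Item Ledger Entry]
        GROUP BY [Item No_]
        HAVING SUM(Quantity) > 0
    ),
    Purch AS (
        SELECT ile.[Item No_], ile.[Entry No_], ile.Quantity,
               (SELECT SUM(ve.[Cost Amount (Actual)])
                  FROM [CRONUS$Value Entry] ve
                 WHERE ve.[Item Ledger Entry No_] = ile.[Entry No_]) AS CostAmt,
               -- quantity purchased in this layer and all newer layers
               (SELECT SUM(i2.Quantity)
                  FROM [CRONUS$Item Ledger Entry] i2
                 WHERE i2.[Item No_] = ile.[Item No_]
                   AND i2.[Entry Type] = 0
                   AND i2.[Entry No_] >= ile.[Entry No_]) AS QtyFromHere
        FROM [CRONUS$Item Ledger Entry] ile
        WHERE ile.[Entry Type] = 0
    )
    SELECT p.[Item No_],
           SUM(CASE WHEN p.QtyFromHere <= o.Qty
                      THEN p.CostAmt                         -- layer fully on hand
                    WHEN p.QtyFromHere - p.Quantity < o.Qty  -- layer partly on hand
                      THEN p.CostAmt * (o.Qty - (p.QtyFromHere - p.Quantity)) / p.Quantity
                    ELSE 0                                   -- layer fully consumed
               END) AS ClosingStockValue
    FROM Purch p
    JOIN OnHand o ON o.[Item No_] = p.[Item No_]
    GROUP BY p.[Item No_];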
OK, now I have the following:
O/S on separate disks
3 disks x 2 on RAID 10
An array of 5 disks on RAID 5, shared with another BI application
Planning to add an array of 4 x 2 disks on RAID 5
yep
I am increasing the vdisks and have put the SQL log file on a dedicated one. Performance is much better now.
Still to work on the keys...
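For reference, relocating the log file to a dedicated vdisk goes roughly like this (database and path names are placeholders): point the catalog at the new path, take the database offline, move the .ldf, and bring it back online.

    ALTER DATABASE [NavLive]
    MODIFY FILE (NAME = NavLive_log,
                 FILENAME = 'L:\SQLLog\NavLive_log.ldf');
    -- the new path takes effect after the database is restarted:
    ALTER DATABASE [NavLive] SET OFFLINE;
    -- move the physical .ldf file to L:\SQLLog\, then:
    ALTER DATABASE [NavLive] SET ONLINE;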
This is the first time you've mentioned this was a virtual server. Is the hardware you mentioned the host?
It is not a virtual server!!
"Vdisk" is the naming used in the RAID setup; another name for it is "disk array".
I would recommend getting in contact with LS consultants about how to change settings so that your ledgers are more summarized, and also writing batch jobs (and/or checking the standard date compress batch jobs) to sum them up a bit.
It's simply not normal to have such sizes, in my opinion. I find our own 12 GB already too much, since I can't just copy-paste entries into Excel and make an ad-hoc inventory value report; it takes too long. This is not good.
Actually, your implementation is more common than you may think. We have a couple of LS Retail systems in the Gulf that are larger than yours (~1 TB). There is a lot that can be done: faster hardware, tuning, etc. But in the end it comes down to the sheer volume of data that needs to be processed to adjust costs. That means either modifying the Adjust Cost routine, or reducing (by compression) the number of records processed.
But you can feel safe in knowing that you are not unique, and there are others out there with similar LS Retail systems of about the same size as yours.
We finished implementing our solution for a company with a ~2 TB database a few months ago: >200M item entries, growing by 200K-500K entries a day. Do not worry; NAV can handle much bigger databases.
http://www.microsoft.com/en-us/dynamics/customer-success-stories-detail.aspx?casestudyid=710000002119
Try to understand which items take a long time to run. NAV has a bug in the cost calculation that causes it to recalculate all entries for an item in some situations (this was fixed in 2013 R2). If you have some commonly used items with many entries, you will see that some items take longer to calculate than others.
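A simple way to find those candidates (CRONUS again is a placeholder company name): rank items by their Value Entry counts, since the items with the most entries are the usual suspects for long adjustment runs.

    SELECT TOP 20 [Item No_], COUNT(*) AS ValueEntryCount
    FROM [CRONUS$Value Entry]
    GROUP BY [Item No_]
    ORDER BY COUNT(*) DESC;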
Adaptive Business Solutions
http://justjewelrysoftware.com
It is a big relief to know that there are other situations like ours, really.
Regarding the fix to the cost calculation in 2013 R2: is it in the Adjust Cost calculation batch routine? Is it possible to adopt it in the 2009 R2 version?
Nav, T-SQL.
Is a RAID 0+1 array of 8 disks (4 x 2) considered good, or not enough?
Then I can start building my arrays with that profile.
David, it's been 7 years since I touched Landsteinar or LS Retail, but even back then the whole idea was this: anyone can make a nice touchscreen form that posts customer and item ledger entries and call it a POS solution, but the whole point of a _real_ POS solution is that it has its own entry tables, and the item and customer entries are generated from those by a (if I remember correctly, configurable) compression, such as one entry per item per register per day. I.e., it avoids making these standard tables huge. One could configure the retail entry tables to a different hard disk or something, and then for all practical purposes, from the viewpoint of the standard tables, have a small system. This was the general idea even then...
Working fine after adding disks and files and removing some indexes.
But because of our business expansion and 24-hour shift work, I cannot find a big enough time gap to run the Adjust Cost batch (currently running 3 hours per day, and 10 hours on the weekend to cover the whole item range).
We now have 4 years of data; my plan now is to start with a fresh new database...