Poor Performance when reading a SQL View via a NAV report

rsaritzky
Member Posts: 469
Hi,
I'm trying to optimize physical inventory generation in NAV2009. The client has 3,000,000+ Item Ledger Entries, 65,000+ unique items and 50+ inventory locations.
I have built a version of the "Calculate Inventory" process that uses a view that I created. The view summarizes the quantities (basically providing a Qty on Hand) and returns the most recent posting date from the Item Ledger Entries. It also groups by location so I can run the report for a group of locations. That way, if an item's most recent entry was, say, a year ago, and it has a QtyOnHand of 0, I can skip the rest of the processing. I do need both the most recent posting date and the Qty On Hand in order to filter the items I wish to process.
Here's my view:
Create view [dbo].[CRONUS USA, Inc_$Physical Inventory Select View] as
select "Item No_", "Location Code",
       MAX([Posting Date]) as [Last Posting Date],
       sum(Quantity) as QtyOnHand
from [CRONUS USA, Inc_$Item Ledger Entry]
group by [Location Code], [Item No_]
The view runs fast enough in SQL, but when I use the view in a NAV report and run the Performance Monitor against it, each "NEXT" on the view table takes 90% of the processing time (e.g. 900 milliseconds per location/item vs. approx. 80 milliseconds for the rest of the processing, such as creating an item journal line). For any single location, the process has to read through 3,000-20,000+ items.
As a result, the "Calculate" process runs far longer than I want it to.
I've tried FIND('-') along with NEXT in my code (the record reads for this table are in an "Integer" dataitem with manually coded NEXT commands). I've also tried FINDSET. There's no difference in performance. Currently, my Record Set cache size is set to 50.
If I can optimize the way NAV reads this view, I can potentially save 80-90% of the processing time.
Anyone have any ideas?
Thx
Ron
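In plain SQL, the filter described above (process an item/location only if it still has stock on hand, or has posted recently) would look roughly like this against the view; the location code and cutoff date below are placeholders, not values from the actual setup:

```sql
-- Hypothetical consumer of the view: keep a row only if the item still has
-- a quantity on hand, OR its last posting is after the cutoff date.
SELECT [Item No_], [Location Code], [Last Posting Date], QtyOnHand
FROM [dbo].[CRONUS USA, Inc_$Physical Inventory Select View]
WHERE [Location Code] = 'BLUE'                 -- placeholder location filter
  AND (QtyOnHand <> 0
       OR [Last Posting Date] >= '2013-01-01') -- placeholder cutoff date
```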
Comments
Use ADO instead and read the data through that rather than using the view. There are many examples here; you can just run a SELECT statement from NAV and then loop through the recordset.
As for the view, its performance can be improved as well, but the first option is the fastest way:
Create view [dbo].[CRONUS USA, Inc_$Physical Inventory Select View] as
select "Item No_", "Location Code",
       MAX([Posting Date]) as [Last Posting Date],
       sum("Remaining Quantity") as QtyOnHand
from [CRONUS USA, Inc_$Item Ledger Entry]
Where Open = 1
group by [Location Code], [Item No_]
ara3n wrote: Use ADO instead and read the data through that rather than using the view. There are many examples here; you can just run a SELECT statement from NAV and then loop through the recordset.
As for the view, its performance can be improved as well, but the first option is the fastest way.
I've thought about converting to ADO - I've done it before.
You also said "As for the view it can be improved in performance as well but the first option the fastest way."
Any specific ways to improve it?
Thx
Ron
The code above that I posted.
But the optimized code will not show an Item/Location combination unless it has open entries, which seems to contradict the requirement. Or am I missing something?
Regards
Peter
If qty is zero he skips those entries.
ara3n wrote: If qty is zero he skips those entries.
rsaritzky wrote: That way, if an item's most recent entry was, say, a year ago, and it has a QtyOnHand of 0, I can skip the rest of the processing. I do need both the most recent posting date and the Qty On Hand in order to filter the items I wish to process.
Regards
Peter
It turns out that I can't filter on Open = 1, because I DO need to include entries where qty on hand = 0 IF the last posting date is after a certain date. If I filter on Open = 1, then an item with recent Item Ledger Entries won't have the correct Last Posting Date.
Thanks for everyone's opinions, though. I appreciate the feedback.
Ron
I guess the problem is the cursor-type used by NAV, which we can't control.
But you could create an indexed view, but that would add a bit of workload to every single update of the Item Ledger Entry table.
You could also change your view to use one of NAV's own indexed views, i.e. ..$Item Ledger Entry$VSIFT$3.
I guess it might help a bit, but I still think you will get the best performance using ADO.
Please let us know if it helps.
Regards
Peter
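For reference, an indexed view along these lines might look like the sketch below (the view and index names are invented). SQL Server requires WITH SCHEMABINDING and a COUNT_BIG(*) column for a grouped indexed view, and it does not allow MAX/MIN aggregates, so only the quantity sum can be materialized this way - the last posting date would still have to come from a separate query:

```sql
-- Hypothetical indexed view: persists Qty on Hand per item/location.
-- SCHEMABINDING and COUNT_BIG(*) are mandatory; MAX([Posting Date]) is not
-- allowed in an indexed view, so it is deliberately left out here.
CREATE VIEW [dbo].[PhysInv$QtyByItemLoc]
WITH SCHEMABINDING
AS
SELECT [Item No_], [Location Code],
       SUM(Quantity) AS QtyOnHand,
       COUNT_BIG(*)  AS EntryCount
FROM [dbo].[CRONUS USA, Inc_$Item Ledger Entry]
GROUP BY [Location Code], [Item No_];
GO

-- The unique clustered index is what actually materializes the view:
CREATE UNIQUE CLUSTERED INDEX [CIX_PhysInv$QtyByItemLoc]
ON [dbo].[PhysInv$QtyByItemLoc] ([Location Code], [Item No_]);
```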
I got impatient, so I had to try using the existing NAV indexed view in my test DB with only 989,875 Item Ledger Entries.
I get the same 25,896 records back, but it does only 1/10 of the reads (5,372 compared to 54,473) and the duration is 1/3 (951 compared to 3,105).
This is without doing anything to prevent cache reads, done in SSMS with stats from the SQL Profiler.
My view:
Create view [dbo].[AA-TestPDJ] as
select "Item No_", "Location Code",
       MAX([Posting Date]) as [Last Posting Date],
       sum(SUM$Quantity) as QtyOnHand
from [dbo].[MyCompany$Item Ledger Entry$VSIFT$3]
group by [Location Code], [Item No_]
PS: I'm using NAV2009R2 - you might find an even better index in NAV2013.
Regards
Peter
Thanks for the idea! I started down the path of an indexed view, but I haven't built one before and couldn't find an example that seemed to work - all the help docs say you can't use the MAX function in an indexed view. But I'll try your idea!
Ron
pdj wrote: I got impatient, so I had to try using the existing NAV indexed view in my test DB with only 989,875 Item Ledger Entries.
I get the same 25,896 records back, but it does only 1/10 of the reads (5,372 compared to 54,473) and the duration is 1/3 (951 compared to 3,105).
This is without doing anything to prevent cache reads, done in SSMS with stats from the SQL Profiler.
My view:
Create view [dbo].[AA-TestPDJ] as
select "Item No_", "Location Code",
       MAX([Posting Date]) as [Last Posting Date],
       sum(SUM$Quantity) as QtyOnHand
from [dbo].[MyCompany$Item Ledger Entry$VSIFT$3]
group by [Location Code], [Item No_]
Well, the way NAV handles SQL cursors apparently strikes again - when I run this through report code where the reads are handled by NEXTs, performance is unfortunately exactly the same. Like you, I was encouraged by how quickly the view returned data directly from SQL (I did my tests in Query Analyzer). But if you run the same query through NAV with the NAV Performance Monitor running, you'll see the same results. A great idea, though!
Thanks
Ron
I just tried making linked tables in NAV, and going through the views with a simple find('-') until next=0 loop.
My suggestion takes 1.3s and your suggestion takes 3.5s (measured by CURRENTDATETIME in NAV).
It is strange that you don't see a performance difference between the two views. Could you show your code from the report triggers?
What version are you using, and is it RTC or Classic?
Regards
Peter