IF FNTCheckTempTable THEN BEGIN
  CLEAR(TBActPatientTemp);
  CLEAR(TBSuspPatientTemp);
  TBActPatientTemp.DELETEALL;
  TBSuspPatientTemp.DELETEALL;
  TBPatient.SETRANGE("Customer Type",TBPatient."Customer Type"::Patient);
  TBPatient.SETFILTER("No.",'..120000');
  TMTime := TIME; //BEFORE
  IF TBPatient.FINDSET THEN BEGIN
    TBContactRC.RESET;
    TBSystemPatient.RESET;
    TBPatientTherapy.RESET;
    TBPatientSuspension.RESET;
    TBContactRC.SETFILTER("Starting Date",'..%1',TBFlyWheelSummTemp."Analysis Date");
    TBContactRC.SETFILTER("Ending Date",'%1|%2..',0D,TBFlyWheelSummTemp."Analysis Date");
    TBContactRC.SETRANGE("Responsibility Center",TBFlyWheelSummTemp."Responsibility Center");
    TBSystemPatient.SETFILTER("Starting Date",'..%1',TBFlyWheelSummTemp."Analysis Date");
    TBSystemPatient.SETFILTER("Ending Date",'%1|%2..',0D,TBFlyWheelSummTemp."Analysis Date");
    TBPatientTherapy.SETFILTER("Starting Date",'..%1',TBFlyWheelSummTemp."Analysis Date");
    TBPatientTherapy.SETFILTER("Ending Date",'%1|%2..',0D,TBFlyWheelSummTemp."Analysis Date");
    TBPatientSuspension.SETFILTER("Starting Date",'..%1',TBFlyWheelSummTemp."Analysis Date");
    TBPatientSuspension.SETFILTER("Ending Date",'%1|%2..',0D,TBFlyWheelSummTemp."Analysis Date");
    REPEAT
      TBContactRC.SETRANGE("Contact No.",TBPatient."No.");
      TBSystemPatient.SETRANGE("Contact No.",TBPatient."No.");
      TBPatientTherapy.SETRANGE(Code,TBPatient."No.");
      TBPatientSuspension.SETRANGE(Code,TBPatient."No.");
      IF NOT TBContactRC.ISEMPTY THEN BEGIN //***1
        IF NOT TBSystemPatient.ISEMPTY THEN BEGIN //***2
          IF NOT TBPatientTherapy.ISEMPTY THEN BEGIN //***3
            IF TBPatient."Contact Status Card" = TBPatient."Contact Status Card"::Closed THEN BEGIN
              IF NOT TBPatientSuspension.ISEMPTY THEN BEGIN
                CLEAR(TBActPatientTemp);
                TBActPatientTemp := TBPatient;
                TBActPatientTemp.INSERT;
              END ELSE BEGIN
                CLEAR(TBSuspPatientTemp);
                TBSuspPatientTemp := TBPatient;
                TBSuspPatientTemp.INSERT;
              END;
            END;
          END;
        END;
      END;
    UNTIL TBPatient.NEXT = 0;
  END;
  ERROR(FORMAT(TIME - TMTime)); //AFTER
END;
The patients (the table being looped) are numerous, so I couldn't find a slow query through Client Monitor: there are simply too many queries, each with a short execution time.
Answers
How many entries? Maybe you should replace FINDSET with FIND('-').
I am taking a guess that you are on SQL Server... In that case I don't know if executing it many times is helpful for comparison. I think you need to clear the SQL cache before re-executing the process in order to compare the results.
Anyway, there are a lot of records, so FIND('-') should be used... I already tried it just after posting, but without results; it was even three seconds slower.
I can't clear the SQL cache because I don't have access to that SQL server... I can ask my colleague, but it's a bit of a bother for me (and him) to clear the cache every time I have to test ](*,)
"Never memorize what you can easily find in a book".....Or Mibuso
My Blog
Indeed...
There are a few things you can check more.
You said you checked the times. Have you used the client monitor and filtered for Parameter 100 to see your elapsed time? What kind of durations do you get? Which is the largest? I usually filter for this parameter, mark those entries and then remove the filters and start going backwards to see the commands.
Also, how many entries get into your tmp tables? Is that a naming convention or are they actually declared as temporary?
P.S.: the hard-coded filter ..120000 at the beginning is for testing purposes.
I noticed that you don't use SETCURRENTKEY anywhere. Is it missing, or is it in some other part of the code?
This could be a problem, but I always do the following:
Before closing the client monitor I create a dataport and export all monitor records (if there are too many and I cannot copy-paste).
I import the results in excel and investigate there.
Once the results showed me that all queries had low duration, but there were too many queries for blob fields. In that customer we didn't use any blob fields so I ended up disabling them. I also had to increase the object cache because the client was constantly downloading objects from the server.
Just saying what I've faced so far. Maybe all these don't apply to your case...
What is a lot by your standards? A few thousand, ten thousand, hundreds of thousands...? If the last applies, I would say that an execution time of 20 sec. is acceptable. Try doing the same in SQL Server Management Studio and you'll end up with a similar duration.
20001 patients are processed in 20 seconds THROUGH CLASSIC CLIENT (a lot of time for such a small set)
but "only" 12 seconds THROUGH ROLE TAILORED CLIENT!!!!
are there more suggestions?
1) These two lines look superfluous:
CLEAR(TBActPatientTemp);
CLEAR(TBSuspPatientTemp);
2) Make sure you have the three ISEMPTYs in the optimal order: first the one that is most likely to be empty, and so on.
If the RTC seems faster than the Classic client with this code, it is most likely because the NAV server has a better connection to the SQL server than your PC has. But why are you shouting?
Peter
Now I'll answer pdj: those ensure that there are no filters/dirty fields on each loop. Moreover, those lines of code do not affect performance: as I said, if I comment out all (and only) the ISEMPTY instructions, the process takes about 5 seconds.
EDIT: yes, it's superfluous because after the CLEAR I assign the variable, but I don't know if some developer in the future will modify some of these instructions, screwing up my code with an unwanted modification :roll: That was my first thought, too... and the reason why I tried commenting out the ISEMPTY instructions one by one: to find the most selective one and put it first.
Yes, correct, but why does just one ISEMPTY (any one of them) make the whole process 4x slower?
To emphasize the two phrases; activating caps lock is easier than clicking underline or bold.
Yes, I'm lazy! :whistle:
Thanks to everyone for your effort; waiting for further suggestions :thumbsup:
Being lazy and curious is a great combination for developers! 8) :thumbsup:
It might help to make three new FlowFields of type Exist (HasContactRC, HasSystemPatient and HasPatientTherapy) plus a date FlowFilter (DateFilter), and then set the filters directly on the main record. That might reduce the number of SQL transactions, but will certainly increase the workload on the SQL server! But are you curious enough?
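A rough sketch of this suggestion (the FlowField names come from the post above; the CalcFormula details and table names are my assumptions, not code from the thread):

```
// On the patient table, three FlowFields of type Exist (hypothetical definitions):
//   HasContactRC     Boolean  FlowField
//     CalcFormula: Exist("Contact Resp. Center" WHERE (Contact No.=FIELD(No.),
//                         Starting Date=FIELD(DateFilter)))
//   HasSystemPatient, HasPatientTherapy: defined analogously
//   DateFilter       Date     FlowFilter

// The loop then becomes filters on the main record only:
TBPatient.SETFILTER(DateFilter,'..%1',TBFlyWheelSummTemp."Analysis Date");
TBPatient.SETRANGE(HasContactRC,TRUE);
TBPatient.SETRANGE(HasSystemPatient,TRUE);
TBPatient.SETRANGE(HasPatientTherapy,TRUE);
IF TBPatient.FINDSET THEN
  REPEAT
    // only matching patients reach this point
  UNTIL TBPatient.NEXT = 0;
```

As the next reply points out, filtering on FlowFields still forces NAV to evaluate them per record, so this trades round trips for server-side work.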
Peter
Not sure about this. It would still need to evaluate for each record. It could reduce the number of loops at the end but I think that it would have the same or probably worse performance.
I don't know if NAV is able to do it server-side by creating a SQL join, but you are probably right.
Peter
NOTE: I think you misunderstood my post here (or I misunderstood yours): I know that ISEMPTY does a SELECT TOP 1 NULL even if there are no records (this is obvious). The behaviour I expected was that increasing the number of ISEMPTY instructions would make performance progressively worse, but that is not true: if I comment out all the ISEMPTYs, execution takes 5 seconds; if I uncomment one "random" ISEMPTY, execution takes 20 seconds; and if I uncomment all 4 ISEMPTYs, execution is still 20 seconds :-k
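For reference, the "top1null" mentioned above is the statement NAV issues for ISEMPTY on SQL Server; it looks roughly like this (table, hint and parameter names are illustrative, not captured from this system):

```
SELECT TOP 1 NULL
FROM "Company$Contact Resp. Center"
WHERE ("Contact No." = @P1)
  AND ("Starting Date" <= @P2)
  AND (("Ending Date" = @P3) OR ("Ending Date" >= @P4))
```

It is cheap per call, but issued once per record per ISEMPTY, so the round-trip count, not the query cost, dominates.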
Hey, wait!!! While writing, a thought came to mind... I'll let you know... thanks for now.
Now I know why just one ISEMPTY made performance fall down: because only one ISEMPTY was actually executed at any time!!! I changed a piece of code to test my theory: instead of nested ISEMPTYs, I made them sequential and evaluated a boolean; here's the code snippet:
As expected, the time increased from 20 to 55 seconds (roughly 20 seconds for each ISEMPTY) ](*,) ](*,)
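The snippet itself is missing from the thread; a hypothetical reconstruction of the sequential test (the boolean name BAllFound is my invention, the record variables are taken from the code at the top) could look like this:

```
// Every ISEMPTY now runs on every iteration, since the boolean AND
// does not short-circuit the way the nested IFs did:
BAllFound := NOT TBContactRC.ISEMPTY;                       //***1
BAllFound := BAllFound AND (NOT TBSystemPatient.ISEMPTY);   //***2
BAllFound := BAllFound AND (NOT TBPatientTherapy.ISEMPTY);  //***3
IF BAllFound AND
   (TBPatient."Contact Status Card" = TBPatient."Contact Status Card"::Closed)
THEN BEGIN
  // same inner logic as before, including the TBPatientSuspension.ISEMPTY check
END;
```

This makes the cost per ISEMPTY visible: each one adds roughly the same amount of time, confirming that in the nested version most of them were simply never reached.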
Did I explain it clearly? Thanks to both of you for guiding me in the right direction; now I'll go punish myself for this crap!
My only option is to investigate whether I can start the loop from one of the related tables, because it may have fewer records than the one I'm looping over now... otherwise I just have to hope that resources are sufficient on the real servers :-k
Edit: Didn't see you already posted the flowfield results. Thanks.
Peter
I did something similar recently, but in my case the number of entries in the subtables was pretty small and constant. I guess you already thought of the following, but I'll write it anyway: also investigate the growth rate of the tables, so you won't have to change it back later.
One thing I didn't get: are you still seeing better performance for the same code under the RTC? Maybe it is worth running SQL Profiler to see the differences between the two clients' executions.
What I've done:
1. Set the date range on the (real) tables on which the ISEMPTY will be done.
2. Populate temptables with the records resulting from point 1.
3. Loop the patient table and use ISEMPTY on the temptables.
4. Added a nice progress bar, which is always cool 8)
In this way I dropped the number of DB reads, greatly decreasing the data exchange between the middle tier and the DB tier.
POSSIBLE DRAWBACK: excessive growth of the four temptables on which I do the ISEMPTY can hang the client, and can eventually crash Navision if a temptable reaches 2 GB.
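The four steps above could be sketched like this (a hypothetical reconstruction: TBContactRCTemp is an assumed temporary-record variable of the same table type as TBContactRC; the other three temptables would be handled the same way):

```
// 1) + 2) Filter the real table once and copy matching records into a temptable
TBContactRC.RESET;
TBContactRC.SETFILTER("Starting Date",'..%1',TBFlyWheelSummTemp."Analysis Date");
TBContactRC.SETFILTER("Ending Date",'%1|%2..',0D,TBFlyWheelSummTemp."Analysis Date");
TBContactRC.SETRANGE("Responsibility Center",TBFlyWheelSummTemp."Responsibility Center");
IF TBContactRC.FINDSET THEN
  REPEAT
    TBContactRCTemp := TBContactRC;
    TBContactRCTemp.INSERT;
  UNTIL TBContactRC.NEXT = 0;

// 3) Inside the patient loop, ISEMPTY now hits the in-memory temptable,
//    so no SQL round trip is issued per patient
TBContactRCTemp.SETRANGE("Contact No.",TBPatient."No.");
IF NOT TBContactRCTemp.ISEMPTY THEN BEGIN
  // same nested checks as before, against the other temptables
END;
```

The one-time cost of loading each temptable replaces one SELECT TOP 1 NULL per patient per table, which is where the time was going.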