SQL Profiler high scores

Bgestel Member Posts: 136
edited 2008-05-15 in SQL Performance
Let's post the high scores we find when profiling SQL Server.

This is what I'm looking at right now:
Type      Duration     Reads   Writes     CPU
---------------------------------------------
SELECT        3188    470896       16    3172
SELECT        3137    470908        0    3078
SELECT        3145    468914     1987    3094
SELECT        3150    468133     2011    3125
SELECT        3515    468458     2005    3484
SELECT        3212    468459     2001    3203

Beat this 8) It can't be that hard. And play nice: only post real ones.
**********************
** SI ** Bert Van Gestel **
**********************

Comments

  • kine Member Posts: 12,562
    What is this result from? Which tool did you use?
    Kamil Sacek
    MVP - Dynamics NAV
    My BLOG
    NAVERTICA a.s.
  • Bgestel Member Posts: 136
    SQL Server Profiler,

    and Notepad ;) for the fancy layout.
    **********************
    ** SI ** Bert Van Gestel **
    **********************
  • kine Member Posts: 12,562
    OK, but which SELECT was it? What did you select? The numbers are nice, but without the steps to set up the same queries they don't mean much... I can run Profiler and capture numbers for any SELECT in my database, but I cannot compare them with your result. Do you understand what I want to say? :wink:
    Kamil Sacek
    MVP - Dynamics NAV
    My BLOG
    NAVERTICA a.s.
  • Bgestel Member Posts: 136
    @kine

    That is the beauty of it.

    In this case, it is a SELECT using a Source Type/Source No. construct on a table in our solution.

    But that doesn't matter; this post is just about sharing the pain with the community.

    By the way ;) I already found what was causing this: every table was a heap ](*,)
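
    For anyone who wants to check their own database, a query along these lines (just a sketch, assuming SQL Server 2005 or later, where these catalog views exist) lists the user tables that are heaps, i.e. have no clustered index:

        SELECT o.name AS table_name
        FROM sys.indexes AS i
        JOIN sys.objects AS o ON o.object_id = i.object_id
        WHERE i.index_id = 0      -- index_id 0 = heap (no clustered index)
          AND o.type = 'U'        -- user tables only
        ORDER BY o.name;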
    **********************
    ** SI ** Bert Van Gestel **
    **********************
  • stryk Member Posts: 645
    Great idea! \:D/

    Date: 10.12.2007
    SELECT using Wildcard:

    Reads: 204,346 (1,634,768 KB! On SQL 2000 STD with 1.7 GB Target Server Memory)
    CPU: 859
    Duration: 936

    Causing a "Avg. Disk Write Queue Length" of 6045 and a "% Disk Time" of 604.518 ... means: this single query actually shut down the whole system ... ](*,)

    Not as many reads as yours, but currently the only query I have on record :wink:
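
    For illustration only (these are not the actual statements from the trace; table and column names are made up): a leading wildcard forces SQL Server to read every row, while a trailing wildcard still allows an index seek:

        -- Leading wildcard: every row must be read and tested, no index seek possible
        SELECT [No_], [Description]
        FROM [CRONUS$Item]
        WHERE [Description] LIKE '%bolt%';

        -- Trailing wildcard: an index on [Description] can be used for a seek
        SELECT [No_], [Description]
        FROM [CRONUS$Item]
        WHERE [Description] LIKE 'bolt%';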
    Jörg A. Stryk (MVP - Dynamics NAV)
    NAV/SQL Performance Optimization & Troubleshooting
    STRYK System Improvement
    The Blog - The Book - The Tool
  • kriki Member, Moderator Posts: 9,112
    Date 30/01/2008
    SELECT SUM(...)

    SQL Server 2000 SP3 with about 5 GB of memory.

    Reads: 1,240,074
    CPU: 5,515
    Duration: 15,000

    There was a missing SIFT. And this SELECT was launched quite often.
    Page life expectancy dropped to almost 0 several times per hour. And the drives were almost continually at 100%.
    After fixing this one (and a lot of others), the drives now only occasionally peak at about 70% for 2 or 3 seconds before dropping back down.
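
    Roughly what the missing SIFT bucket would have given on the SQL side, sketched here as an indexed view with invented names (an illustration of the idea only, not what NAV generates): the SUM can then be answered from a small pre-aggregated structure instead of touching every ledger row.

        -- Pre-aggregated, indexed view (assumes Quantity is NOT NULL, as NAV columns are)
        CREATE VIEW dbo.[ItemLedger$SumQty]
        WITH SCHEMABINDING
        AS
        SELECT [Item No_],
               SUM([Quantity]) AS SumQuantity,
               COUNT_BIG(*)    AS EntryCount   -- COUNT_BIG(*) is required with GROUP BY
        FROM dbo.[CRONUS$Item Ledger Entry]
        GROUP BY [Item No_];
        GO

        -- The unique clustered index is what actually materializes the sums
        CREATE UNIQUE CLUSTERED INDEX [IX_ItemLedger$SumQty]
        ON dbo.[ItemLedger$SumQty] ([Item No_]);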

    But this is not the worst I have encountered. I've had a customer where even Enterprise Manager didn't respond any more because the server was so overloaded. They had to do everything through T-SQL statements.
    Regards, Alain Krikilion
    No PM, please use the forum. || May the <SOLVED>-attribute be in your title!


  • kine Member Posts: 12,562
    Ok, now I understand... :oops:
    Kamil Sacek
    MVP - Dynamics NAV
    My BLOG
    NAVERTICA a.s.
  • stryk Member Posts: 645
    Ha! Just last Sunday I got a real killer!

    Full table scan due to a missing index on a super-sized table:

    5,179,823 reads (not kidding, no joke, seriously - I've got the TRC file as evidence)

    This means:
    5,179,823 x 8 KB (page size) = 41,438,584 KB = about 40 GB

    The query required about 40 GB of RAM/cache, but unfortunately only 8 GB of physical memory is available on that server - that's what I call "hell on earth"...
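
    The conversion is easy to script for whatever Reads figure Profiler shows you; logical reads are counted in 8 KB pages:

        DECLARE @reads bigint;
        SET @reads = 5179823;
        SELECT @reads * 8 / 1024.0 / 1024.0 AS approx_GB;   -- roughly 39.5 GB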
    Jörg A. Stryk (MVP - Dynamics NAV)
    NAV/SQL Performance Optimization & Troubleshooting
    STRYK System Improvement
    The Blog - The Book - The Tool
  • kine Member Posts: 12,562
    I have the high score in reads:
    Duration: 6332
    Reads: 13,102,036
    Writes: 0
    CPU: 6282

    I do not know how the hardware can do that in that time... ;-)

    MS SQL 2005, 8 CPU cores (2x Xeon), 16 GB RAM
    Kamil Sacek
    MVP - Dynamics NAV
    My BLOG
    NAVERTICA a.s.
  • garak Member Posts: 3,263
    This is my high score

    Reads: 896,517, with a duration of 1287 and CPU of 1031

    Reason: the developer had designed a bad key. After I changed the key and the source code -> reads: 789
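
    The usual shape of that kind of fix, with invented table and column names (in NAV you would normally add the key in the table designer and let it create the SQL index):

        -- An index whose leading columns match the filtered fields lets SQL Server
        -- seek instead of scanning the whole table
        CREATE INDEX [IX_ByLocationAndDate]
        ON dbo.[CRONUS$Warehouse Entry] ([Location Code], [Registering Date]);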
    Do you make it right, it works too!
  • Miklos_Hollender Member Posts: 1,598
    I've checked Profiler recently and interestingly it doesn't really show what I thought it would show - am I doing something wrong? I just ran a codeunit which simply reads all entries in a ledger; it takes about 10 seconds. I ran Profiler on it, ticking SQL:BatchCompleted and SQL:StmtCompleted, and I don't see the SELECT on the ledger table, just some other (very short) operations.
  • garak Member Posts: 3,263
    These are the events I've selected:

    Stored Procedures
    - RPC:Completed
    - SP:Completed
    - SP:StmtCompleted

    TSQL
    - SQL:BatchCompleted
    - SQL:BatchStarting

    Filter:
    Reads -> Greater than or equal (for example 2000)

    activate "Exclude rows that do not contain values"
    Do you make it right, it works too!
  • Miklos_Hollender Member Posts: 1,598
    Thanks! I think the big trick is then to look at the Reads instead of Duration.
  • garak Member Posts: 3,263
    Right. A high read count does not necessarily mean a high duration.
    Do you make it right, it works too!
  • Miklos_Hollender Member Posts: 1,598
    Not a Profiler high score, but pretty high: 600 MB of receipt lines, 99.9% invoiced, 99.9% from the same vendor. Get Receipt Lines filters on vendor & qty. received not invoiced, but the index is only on the vendor - so that means reading and in-memory sorting of 600 MB :) I've put an index on that too, fingers crossed - my very first real-life optimization... :)
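
    For the record, the SQL-side equivalent of that key would look something like this (the column names follow the standard purchase receipt line table, but treat them as illustrative):

        -- Index matching the Get Receipt Lines filter: vendor first, then the
        -- "quantity received but not yet invoiced" field
        CREATE INDEX [IX_GetReceiptLines]
        ON dbo.[CRONUS$Purch_ Rcpt_ Line] ([Buy-from Vendor No_], [Qty_ Rcd_ Not Invoiced]);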