Slow SQL-Server 2005

Rene70 Member Posts: 25
Hi guys,

Our SQL Server is driving me crazy. One day everything works fine, and the next day, with no changes made, everything slows down. Well, not everything, but especially browsing the item ledger entries.

We are using SQL Server 2005 with navision.exe 4.0 SP2 (version 4.0.2.22611).
Our database is roughly 50 GB.

Perfmon shows that the buffer cache hit ratio goes down to 80% (my NSC told me that it should normally be at 98-99%).
Then the latch waits/sec goes up from 0-2 to 25-50. Is this normal?
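
(For reference, the buffer cache hit ratio can also be read from T-SQL - just a sketch using the sys.dm_os_performance_counters DMV; the LIKE filter should cover both default and named instances:)
SELECT 100.0 * a.cntr_value / NULLIF(b.cntr_value, 0) AS [Buffer cache hit ratio %]
FROM sys.dm_os_performance_counters a
JOIN sys.dm_os_performance_counters b
  ON a.[object_name] = b.[object_name]            -- e.g. 'SQLServer:Buffer Manager'
WHERE a.counter_name = 'Buffer cache hit ratio'
  AND b.counter_name = 'Buffer cache hit ratio base'
  AND a.[object_name] LIKE '%Buffer Manager%'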

What I'm also interested in is why the server only uses 2 GB of RAM (there are still 5 GB available) - in other words, why doesn't it use more RAM for caching?

Thanks for your help

René

Comments

  • DenSter Member Posts: 8,307
    2GB RAM is a Standard Edition limit. If you want SQL Server to use more than that, you need to purchase Enterprise Edition.

    It might be because of SQL Server expanding the files, or because you run some kind of script that reindexes or something like that.
  • ara3n Member Posts: 9,257
    Hello DenSter. SQL 2005 Standard Edition's memory limit depends on the OS memory limit; it was SQL 2000 Standard that had the 2 GB limit.

    http://www.microsoft.com/sql/prodinfo/features/compare-features.mspx
    Ahmed Rashed Amini
    Independent Consultant/Developer


    blog: https://dynamicsuser.net/nav/b/ara3n
  • Rene70 Member Posts: 25
    That's right. We do have SQL 2005 Standard Edition, but on Windows Server 2003 Enterprise Edition (by the way, not one of the cheapest products :cry: ).

    Is it possible that I have to activate the higher memory myself (the AWE option), or does SQL Server do this on its own?
  • kriki Member, Moderator Posts: 9,118
    Rene70 wrote:
    Perfmon shows that the buffer cache hit ratio goes down to 80% (my NSC told me that it should normally be at 98-99%).
    Then the latch waits/sec goes up from 0-2 to 25-50. Is this normal?
    Maybe someone is running a report on ALL the data in the DB?
    Regards, Alain Krikilion
    No PM, please use the forum. || May the <SOLVED>-attribute be in your title!


  • Rene70 Member Posts: 25
    No, I tested it just now in the evening, when nobody is working. I open the item ledger entries for an item and I can see the phenomenon.
    At the moment I'm updating the statistics. I saw that the SQL Server Agent stopped its job 3 days ago and did not update the table statistics. Maybe this will help.
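    (For anyone else: a minimal sketch of a manual, database-wide refresh, assuming the built-in sp_updatestats procedure is acceptable and is run inside the NAV database:)
    USE [YourNavDatabase]   -- hypothetical name; substitute your NAV database
    GO
    EXEC sp_updatestats     -- refreshes statistics on every user table that needs it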
  • kriki Member, Moderator Posts: 9,118
    Rene70 wrote:
    No, I tested it just now in the evening, when nobody is working. I open the item ledger entries for an item and I can see the phenomenon.
    At the moment I'm updating the statistics. I saw that the SQL Server Agent stopped its job 3 days ago and did not update the table statistics. Maybe this will help.
    Definitely!
    I had a client where Monday was fine, Tuesday too, but from Wednesday through Friday it got worse by the day, only to be fine again on Monday. I had only been updating SQL statistics on the weekend. From the moment I started doing it every day, everything worked well.
    Regards, Alain Krikilion
    No PM, please use the forum. || May the <SOLVED>-attribute be in your title!


  • Rene70 Member Posts: 25
    It's impressive. Updating statistics really solved the problem :D:lol::D
  • Rene70 Member Posts: 25
    By the way, I read a lot of material, and by default SQL Server only uses up to 4 GB of memory. To raise this limit you have to enable AWE. Before you can activate it, you have to grant the "Lock pages in memory" privilege to the account that runs the SQL Server service, using gpedit.msc.
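    (For reference, a rough sketch of the corresponding T-SQL - the AWE change only takes effect after the SQL Server service is restarted, and the memory cap is just an example value:)
    EXEC sp_configure 'show advanced options', 1
    RECONFIGURE
    EXEC sp_configure 'awe enabled', 1                -- needs the 'Lock pages in memory' privilege
    RECONFIGURE
    EXEC sp_configure 'max server memory (MB)', 6144  -- example cap; leave headroom for the OS
    RECONFIGURE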
  • nunomaia Member Posts: 1,153
    Rebuilding the indexes periodically also increases performance. There are good documents on when to update statistics and when to rebuild indexes. I normally update all statistics during the night and rebuild the indexes during the weekend.
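    (For example - just a sketch against one table name taken from this thread; the same pattern applies to any table:)
    -- nightly: refresh statistics
    UPDATE STATISTICS [CRONUS USA, Inc_$Item Ledger Entry]
    -- weekend: rebuild all indexes on the table (SQL 2005 syntax)
    ALTER INDEX ALL ON [CRONUS USA, Inc_$Item Ledger Entry] REBUILD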
    Nuno Maia

    Freelance Dynamics AX
    Blog : http://axnmaia.wordpress.com/
  • kriki Member, Moderator Posts: 9,118
    nunomaia wrote:
    Rebuilding the indexes periodically also increases performance. There are good documents on when to update statistics and when to rebuild indexes. I normally update all statistics during the night and rebuild the indexes during the weekend.
    I do the same
    Rene70 wrote:
    It's impressive. Updating statistics really solved the problem :D:lol::D
    I was also quite surprised that SQL suffers such performance problems if you don't update the statistics often enough. I thought once a week was enough.
    Regards, Alain Krikilion
    No PM, please use the forum. || May the <SOLVED>-attribute be in your title!


  • Rene70 Member Posts: 25
    Last comment on this topic:

    The problem occurred again during the day, despite updating statistics and indexes. :(
    What I finally did was shut the SQL Server down and start it again. Since then everything works fine (so far) :P
    I think it is a Microsoft problem and I'm waiting for the next service pack. :?
  • nunomaia Member Posts: 1,153
    I noticed that you are using NAV SP2 build 22611. Microsoft has released an update for NAV 4.0 SP2, especially for SQL 2005. The latest build for SP2 is 22979.
    Nuno Maia

    Freelance Dynamics AX
    Blog : http://axnmaia.wordpress.com/
  • Rene70 Member Posts: 25
    I do have a release with build 23099, but the white paper says that it is only an update to the security system (Standard Security / Enhanced Security (NEW)).
    Are there other "features" in that build?
  • thaug Member Posts: 106
    Updating stats at night is not a very expensive operation from what I have seen. It takes my relatively slow server about 10 minutes to complete, very easy to schedule.

    I also didn't realize that not updating the stats would have such an impact.
    There is no data, only bool!
  • kriki Member, Moderator Posts: 9,118
    thaug wrote:
    Updating stats at night is not a very expensive operation from what I have seen. It takes my relatively slow server about 10 minutes to complete, very easy to schedule.

    I also didn't realize that not updating the stats would have such an impact.
    SQL Server's default proposal is to sample 10% of the data. I take 30%, and if I see it takes too much time I can still go down to 10%. But I think it is best NOT to go below that.
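    (Per table, that corresponds roughly to the following - the table name is only an example:)
    UPDATE STATISTICS [CRONUS USA, Inc_$Item Ledger Entry] WITH SAMPLE 30 PERCENT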
    Regards, Alain Krikilion
    No PM, please use the forum. || May the <SOLVED>-attribute be in your title!


  • Scott_Frappier Member Posts: 90
    Rene70:

    Build 23099 (Update 03 for Dynamics-NAV 4.0 SP2) has two security modes: one is "Standard" and the other is "Enhanced". Since you've already read the white paper, I won't go into details, but Standard security is a much-needed answer to the synchronization issues that clients have encountered in the past. The Standard security model is probably just the same model that was used in versions prior to 4.0, which worked great.

    When you install Update 03, it re-creates all of the SIFT tables in the database. This hints at possible optimizations/data corrections to the existing data, as it is a very extensive update. Sadly, Microsoft does not give a "change log" with the update that states specifically what was fixed. I'm sure that this exists, but I have never found it through any of my contacts.

    I would highly recommend that you upgrade to Dynamics-NAV 4.0 SP2 Update 03 (Build 23099) as there are known issues with SQL Server 2005 and all versions prior to this update. Some Dynamics-NAV functions are broken prior to build 23099, and can cause you to have some extremely strange results.

    As for your performance issues, do you have detailed specifications of your environment? The first place to start is with hardware (RAID arrays, et cetera), and from there look into the database configuration. Are you optimizing indexes regularly? Are you using fill factor/padding on any indexes? Is auto-grow enabled on your database? Do you have fragmentation within the database, and is it extent fragmentation or logical fragmentation?

    You could run the following command (change the company name to the proper company in your database) and post the results into this thread to try to determine why the Item Ledger Entry table is slowing down...
    DBCC SHOWCONTIG ('CRONUS USA, Inc_$Item Ledger Entry') WITH TABLERESULTS, ALL_INDEXES
    

    After restarting a server and opening a table, remember that there will be a "pause" while SQL Server buffers the indexes of large tables into memory. This is sometimes called the "warm-up" phase; once it has passed, the database should perform acceptably and the pause should occur only rarely.

    Anyways, feel free to message me directly if you have any specific questions (or post in this thread of course :) ).
    Scott Frappier
    Vice President, Deployment Operations

    Symbiant Technologies, Inc.
    http://www.symbiantsolutions.com
  • GoMaD Member Posts: 313
    Hi, we have a 4.00 SP2 (with client roll up 2) database of about 81 GB and it's running under SQL 2005.

    I would like to run Update Statistics every night, but it runs for about 18 hours (no kidding).

    When I create the Maintenance Plan and add the "Update Statistics Task", I get a chance to select the database (no problem there :wink: ). But I can also choose one of the three following possibilities:
    A) All Existing Statistics
    B) Column Statistics Only
    C) Index Statistics Only

    Can I choose B or C to speed up the task (my guess is that C is the only one Navision uses, but I'm not sure), or must I choose A (which then takes about 18 hours)?
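    (My understanding - not verified - is that option C corresponds roughly to a per-table statement like this, with the table name only as an example:)
    UPDATE STATISTICS [CRONUS USA, Inc_$Item Ledger Entry] WITH SAMPLE 10 PERCENT, INDEX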
    Now, let's see what we can see.
    ...
    Everybody on-line.
    ...
    Looking good!
  • Scott_Frappier Member Posts: 90
    GoMaD:

    I think there are a couple of ways to address your problem...I'll put my 2 cents in and then I'm sure that someone else will want to as well.

    *NOTE: My server in these examples is SYMBIANTISV and the database that I am working in is "nTier Architecture Services (4_0 SP2 Update 03)"

    The way that I would address this issue would be to create a "maintenance table". This table would be basically similar to the following:
    USE [nTier Architecture Services (4_0 SP2 Update 03)]
    GO
    /****** Object:  Table [dbo].[Statistic_Update]    Script Date: 11/16/2006 15:07:55 ******/
    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    SET ANSI_PADDING ON
    GO
    CREATE TABLE [dbo].[Statistic_Update](
    	[Table_Name] [varchar](80) COLLATE Latin1_General_CS_AS NOT NULL,
    	[Statistic_Update_Group] [int] NOT NULL,
     CONSTRAINT [PK_Statistic_Update] PRIMARY KEY CLUSTERED 
    (
    	[Table_Name] ASC,
    	[Statistic_Update_Group] ASC
    )WITH (PAD_INDEX  = OFF, IGNORE_DUP_KEY = OFF) ON [Data Filegroup 1]
    ) ON [Data Filegroup 1]
    
    GO
    SET ANSI_PADDING OFF
    

    In this table, you populate the tables whose statistics you want updated and when they should be updated. Once you have populated it, only tables in this list are processed (a table that does not exist in the list is not touched). The group is a number that you assign tables to, so that you can then choose, based on the job setup, how to optimize them.

    Let's say that we want Group 1 to update every Monday, Wednesday, and Friday. We want Group 2 to update every Tuesday and Thursday. We also want all tables to be updated on Saturday night. We need to do two things to accomplish this: create a stored procedure that handles the update, and then create jobs that call this stored procedure with parameters. The stored procedure could look something like this:
    USE [nTier Architecture Services (4_0 SP2 Update 03)] 
    GO
    
    CREATE PROCEDURE sp_navstatupdate 
      @navgroupfilter int 
    AS
    
    SET XACT_ABORT ON
    SET NOCOUNT ON
    
    DECLARE @statistic_tables CURSOR
    DECLARE @navtablename VARCHAR(250)
    DECLARE @execSQL VARCHAR(4000)
    
    /* Cursor based optimizations (verbose for administrative purposes) */
    SET @statistic_tables = CURSOR FOR
      SELECT Table_Name FROM Statistic_Update WHERE (Statistic_Update_Group = @navgroupfilter)
    OPEN @statistic_tables
    FETCH NEXT FROM @statistic_tables INTO @navtablename
    WHILE (@@FETCH_STATUS = 0)
    BEGIN
      PRINT('Processing table: ' + @navtablename)
      
      SET @execSQL = 'BEGIN TRANSACTION UPDATE STATISTICS [' + @navtablename + '] COMMIT TRANSACTION'
      EXEC(@execSQL)
      
      FETCH NEXT FROM @statistic_tables INTO @navtablename
    END
    CLOSE @statistic_tables        -- release the cursor explicitly
    DEALLOCATE @statistic_tables
    GO
    

    Now that I have the base code done, I would go ahead and add a job that executes on Monday, Wednesday, and Friday...and it would look as follows:
    USE [msdb]
    GO
    DECLARE @jobId BINARY(16)
    EXEC  msdb.dbo.sp_add_job @job_name=N'Update Statistics on Group 1', 
    		@enabled=1, 
    		@notify_level_eventlog=0, 
    		@notify_level_email=2, 
    		@notify_level_netsend=2, 
    		@notify_level_page=2, 
    		@delete_level=0, 
    		@description=N'Updates the statistics in the Group 1 setup in the Statistics_Update table in the Dynamics-NAV database.', 
    		@category_name=N'[Uncategorized (Local)]', 
    		@owner_login_name=N'SYM\sfrappier', 
    		@notify_email_operator_name=N'Scott Frappier', @job_id = @jobId OUTPUT
    select @jobId
    GO
    EXEC msdb.dbo.sp_add_jobserver @job_name=N'Update Statistics on Group 1', @server_name = N'SYMBIANTISV'
    GO
    USE [msdb]
    GO
    EXEC msdb.dbo.sp_add_jobstep @job_name=N'Update Statistics on Group 1', @step_name=N'Run Statistics Update', 
    		@step_id=1, 
    		@cmdexec_success_code=0, 
    		@on_success_action=1, 
    		@on_fail_action=2, 
    		@retry_attempts=0, 
    		@retry_interval=0, 
    		@os_run_priority=0, @subsystem=N'TSQL', 
    		@command=N'exec sp_navstatupdate ''1''', 
    		@database_name=N'nTier Architecture Services (4_0 SP2 Update 03)', 
    		@flags=0
    GO
    USE [msdb]
    GO
    EXEC msdb.dbo.sp_update_job @job_name=N'Update Statistics on Group 1', 
    		@enabled=1, 
    		@start_step_id=1, 
    		@notify_level_eventlog=0, 
    		@notify_level_email=2, 
    		@notify_level_netsend=2, 
    		@notify_level_page=2, 
    		@delete_level=0, 
    		@description=N'Updates the statistics in the Group 1 setup in the Statistics_Update table in the Dynamics-NAV database.', 
    		@category_name=N'[Uncategorized (Local)]', 
    		@owner_login_name=N'SYM\sfrappier', 
    		@notify_email_operator_name=N'Scott Frappier', 
    		@notify_netsend_operator_name=N'', 
    		@notify_page_operator_name=N''
    GO
    USE [msdb]
    GO
    DECLARE @schedule_id int
    EXEC msdb.dbo.sp_add_jobschedule @job_name=N'Update Statistics on Group 1', @name=N'Monday, Wednesday, and Friday', 
    		@enabled=1, 
    		@freq_type=8, 
    		@freq_interval=42, 
    		@freq_subday_type=1, 
    		@freq_subday_interval=0, 
    		@freq_relative_interval=0, 
    		@freq_recurrence_factor=1, 
    		@active_start_date=20061116, 
    		@active_end_date=99991231, 
    		@active_start_time=0, 
    		@active_end_time=235959, @schedule_id = @schedule_id OUTPUT
    select @schedule_id
    GO
    
    

    My table setup (just added a couple of records for this example) would look as follows:
      Table_Name: CRONUS USA, Inc_$XBRL Schema
      Statistic_Update_Group: 1
      Table_Name: CRONUS USA, Inc_$XBRL Linkbase
      Statistic_Update_Group: 1

      Table_Name: CRONUS USA, Inc_$XBRL Taxonomy
      Statistic_Update_Group: 2
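      (For reference, a sketch of the INSERT statements that would produce this setup:)
      INSERT INTO [dbo].[Statistic_Update] ([Table_Name], [Statistic_Update_Group])
        VALUES ('CRONUS USA, Inc_$XBRL Schema', 1)
      INSERT INTO [dbo].[Statistic_Update] ([Table_Name], [Statistic_Update_Group])
        VALUES ('CRONUS USA, Inc_$XBRL Linkbase', 1)
      INSERT INTO [dbo].[Statistic_Update] ([Table_Name], [Statistic_Update_Group])
        VALUES ('CRONUS USA, Inc_$XBRL Taxonomy', 2)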


      After I execute the job...I get the following:
      Date		11/16/2006 3:11:07 PM
      Log		Job History (Update Statistics on Group 1)
      
      Step ID		1
      Server		SYMBIANTISV
      Job Name		Update Statistics on Group 1
      Step Name		Run Statistics Update
      Duration		00:00:00
      Sql Severity		0
      Sql Message ID		0
      Operator Emailed		
      Operator Net sent		
      Operator Paged		
      Retries Attempted		0
      
      Message
      Executed as user: SYM\sfrappier. Processing table: CRONUS USA, Inc_$XBRL Linkbase [SQLSTATE 01000] (Message 0)  Processing table: CRONUS USA, Inc_$XBRL Schema [SQLSTATE 01000] (Message 0).  The step succeeded.
      

      This gives you complete flexibility to schedule the updating of statistics by group. It also means that you do not "shortcut" the process - each table still gets a full statistics update. This is the way we have set things up for some of our other customers, and how we manage large installations that have large databases (by doing backups per filegroup).
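      (A filegroup backup, for illustration only - a sketch with a hypothetical disk path; backing up read/write filegroups separately also assumes the full or bulk-logged recovery model:)
      BACKUP DATABASE [nTier Architecture Services (4_0 SP2 Update 03)]
        FILEGROUP = 'Data Filegroup 1'
        TO DISK = 'D:\Backup\nTier_DataFG1.bak'   -- hypothetical path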

      Dynamics-NAV does not do a statistics update...when you "optimize" the tables, it is actually issuing a command similar to the following:
      CREATE UNIQUE CLUSTERED INDEX "Symbiant$Customer$0" ON "Symbiant$Customer" ("No_") WITH DROP_EXISTING
      
      The statistics are used by the query optimizer to compute execution plans within the server. Since Dynamics-NAV does not manage this, any and all statistics could potentially be used to optimize a single SQL query. We always recommend updating all existing statistics, but we have not experimented with the FULLSCAN option.
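      (For completeness, the per-table FULLSCAN variant would simply be - table name only as an example:)
      UPDATE STATISTICS [CRONUS USA, Inc_$Item Ledger Entry] WITH FULLSCAN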

      I hope all of this information helps...there are many other ways to approach this.

      In fact, you could probably use the new PowerShell that Microsoft released today to automate this entire procedure from a command line :).

      Thanks!

      - Scott
      Scott Frappier
      Vice President, Deployment Operations

      Symbiant Technologies, Inc.
      http://www.symbiantsolutions.com
    • GoMaD Member Posts: 313
      Thanks Scott !

      I'm going to try to accomplish this for our database.

      One thing I forgot to mention: it has 30+ companies in it.
      Now, let's see what we can see.
      ...
      Everybody on-line.
      ...
      Looking good!
    • Scott_Frappier Member Posts: 90
      GoMad:

      30+ Companies...not a problem! :)

      Create 30 groups and schedule certain groups to run every day or every other day. Since Dynamics-NAV creates separate tables per company, you just need to populate the Statistic_Update table so that it has the proper information.

      - Scott
      Scott Frappier
      Vice President, Deployment Operations

      Symbiant Technologies, Inc.
      http://www.symbiantsolutions.com
    • kriki Member, Moderator Posts: 9,118
      Dynamics-NAV does not do a statistics update...when you "optimize" the tables, it is actually issuing a command similar to the following:
      CREATE UNIQUE CLUSTERED INDEX "Symbiant$Customer$0" ON "Symbiant$Customer" ("No_") WITH DROP_EXISTING
      
      It does something more than that.
      It also checks the SIFT-tables and deletes records in which all decimal fields are 0.
      Regards, Alain Krikilion
      No PM, please use the forum. || May the <SOLVED>-attribute be in your title!


    • Scott_Frappier Member Posts: 90
      kriki:

      It still uses the same syntax for the SIFT indexes as well. The SIFT tables are always maintained via the triggers established on the base table...if a record is deleted, the SIFT value is not deleted; instead an UPDATE is called to adjust the record (the trigger never removes it, as you have stated). The reason is that a DELETE statement is much more "costly" than an UPDATE statement. The "optimization" then looks, at the end of the index optimization, for all records in which only zero values are present.

      When you reindex/optimize a table, the following occurs:

      1.) The transaction level is set to SERIALIZABLE:
      SET TRANSACTION ISOLATION LEVEL SERIALIZABLE
      
      2.) The index is found using the following statement:
      exec sp_executesql N'SELECT INDEXPROPERTY([id], [name], ''IsClustered''),              INDEXPROPERTY([id], [name], ''IsUnique'')       FROM   "Symbiant".[dbo].[sysindexes]       WHERE  id = OBJECT_ID(@P1) AND [name] = @P2', N'@P1 nvarchar(34),@P2 nvarchar(2)', N'"LR Data, Inc_$Cust_ Ledger Entry"', N'$1'
      
      3.) The index on the table (such as Cust. Ledger Entry) is optimized with the following:
      CREATE UNIQUE NONCLUSTERED INDEX "$1" ON "Symbiant"."dbo"."LR Data, Inc_$Cust_ Ledger Entry" ("Customer No_","Posting Date","Currency Code","Entry No_") WITH DROP_EXISTING
      
      4.) The SIFT indexes are computed by the client, and then the following commands are issued:
      CREATE UNIQUE CLUSTERED INDEX "LR Data, Inc_$21$0_idx" ON "Symbiant"."dbo"."LR Data, Inc_$21$0" ("bucket","f3","f4","f11") WITH DROP_EXISTING
      CREATE   INDEX "LR Data, Inc_$21$0_hlp_idx" ON "Symbiant"."dbo"."LR Data, Inc_$21$0" ("f4") WITH DROP_EXISTING
      
      5.) The "zero filled" values are deleted from the table:
      DELETE FROM "Symbiant"."dbo"."LR Data, Inc_$21$0" WHERE (s18=0 AND s19=0 AND s20=0)
      
      6.) The optimization is committed to the database:
      IF @@TRANCOUNT > 0
        COMMIT TRAN
      

      As you can see, you could easily reproduce this with a script that does exactly the same as what the Dynamics-NAV client does...it's not much more than what I generalized before...

      Remember...sometimes you may _not_ want to delete the zero values in the SIFT tables. If you have a table with heavy inserts, which result in heavy inserts into the SIFT table, you could be degrading performance: UPDATE transactions are generally faster than INSERT (the index pointer does not need to be updated, just the data pages) and DELETE transactions. So there may be situations in which you do not want the zero-value SIFT records to be deleted. You would need to research this thoroughly before making that decision, though...

      Thanks for pointing out my generalization...hopefully this will give insight as to what really happens when an optimization occurs within Dynamics-NAV...I was just trying to keep it simple in my previous post...

      Thanks!

      - Scott
      Scott Frappier
      Vice President, Deployment Operations

      Symbiant Technologies, Inc.
      http://www.symbiantsolutions.com
    • Waldo Member Posts: 3,412
      Scott Frappier, this is really useful stuff. I really appreciate it.

      Eric Wauters
      MVP - Microsoft Dynamics NAV
      My blog
    • kriki Member, Moderator Posts: 9,118
      Waldo wrote:
      Scott Frappier, this is really useful stuff. I really appreciate it.
      So do I! =D> =D> =D>
      Regards, Alain Krikilion
      No PM, please use the forum. || May the <SOLVED>-attribute be in your title!


    • thaug Member Posts: 106
      I'd run that statistics update through Profiler, or monitor the server's performance closely, since 18 hours seems way too long. You could probably rebuild all the indexes in less time. On a new server that I'm testing and getting ready to deploy, the process took about 4 minutes (25 GB of data). I'm thinking there is some kind of bottleneck in the disk or memory subsystem.

      You may also want to try SP3.

      And Scott, thanks for that very useful info.
      There is no data, only bool!
    • Waldo Member Posts: 3,412
      We do it this way now, and it only runs for about an hour (on the "hot" tables).

      (yes, I'm writing this in the name of my colleague GoMaD :wink: )

      Eric Wauters
      MVP - Microsoft Dynamics NAV
      My blog