Upgrading from 2009R2 to 2013 taking a LONG time

Charleh Member Posts: 44
edited 2013-10-08 in SQL Performance
I've already posted on the Three Tier forum and someone suggested I repost here, so apologies for any duplication:

viewtopic.php?f=32&t=59121&start=0

To summarise the issue I'm having: I'm trying to upgrade a NAV2009R2 database to 2013, but at the point of doing the Ledger Entry Dimension upgrade, the UI freezes up (obviously it's busy) and SQL/CPU is being hammered. The problem is that I don't think anything is actually wrong - with over 68 million dimension entries in the table there is simply a lot of processing to do.

I've run this for a couple of days on a pretty decent machine with no joy (I had to shut the machine down in the end as it's my work laptop - the development server we have is actually lower spec than my dev laptop).

Does anyone know how long this should take? Is there any good way of speeding the process up? I don't have a lot of time to investigate at the moment so I haven't looked at the execution plan in great detail, but it doesn't seem complicated and there don't appear to be any unnecessary scans - all the work is in clustered index seeks.

I'll drop the details from my previous posts here too:
I've put the DB in simple recovery model and expanded the size of the database to twice that of the original data (it's now over 100GB) - is this a normal amount of time to wait before reaching 1% progress on an upgrade of this size? I've already moved the data from NAV4 to NAV2009R2, which only took a couple of hours. I need to take the 2013 upgrade time into account when planning the production database upgrade - and having the upgrade take more than two days is going to cause all sorts of problems for the client (ideally I'd like to get it done in a day!)
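
For reference, the recovery model switch and the pre-grow were just plain T-SQL along these lines (logical file names and sizes here are placeholders for my local copy):

Code: Select all

ALTER DATABASE [Company Database] SET RECOVERY SIMPLE;

-- Pre-grow the data file so the conversion isn't stalled by autogrow events
ALTER DATABASE [Company Database]
    MODIFY FILE (NAME = N'Company Database_Data', SIZE = 100GB);

-- Same for the log (simple recovery still needs log space for big transactions)
ALTER DATABASE [Company Database]
    MODIFY FILE (NAME = N'Company Database_Log', SIZE = 20GB);
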
I'm running everything locally - the machine that is running this processed the v4.00 to 2009R2 upgrade in a matter of minutes for the 50 GB DB (upgrade step 1 took a few minutes for the biggest company and the second upgrade step finished in less than an hour).

My company database/tempdb are not growing and the machine performance is as follows:

30-35% CPU usage (most of that is SQL Server)
100% disk utilisation at points, not much at the moment
almost 100% RAM utilisation

I'm on a decent machine with two quad-core i7s, an SSD and 16 GB of RAM, and like I said - the first upgrade took no time at all.

I've left this running all day and overnight and it's still on the 'preparing for upgrade' stage (0% progress)

Checking SQL shows the following UPDATE statement is running:



Code: Select all

update [Company Database$upgrade355_Dim]
set DimSetID =
    (select DimSetID
     from [Company Database$upgrade355_DimSetID] as D
     where ((D.DimVal1 = [Company Database$upgrade355_Dim].DimVal1)
            or (D.DimVal1 is null) and ([Company Database$upgrade355_Dim].DimVal1 is null))
       and ((D.DimVal2 = [Company Database$upgrade355_Dim].DimVal2)
            or (D.DimVal2 is null) and ([Company Database$upgrade355_Dim].DimVal2 is null))
       and ((D.DimVal3 = ... etc


There don't appear to be any resource waits - I've run the query that SQL Server Management Studio uses to get the list of locks for the activity monitor, and the only entry is:

DB: Company Database
Status: RUNNING
Cmd: UPDATE
App: .Net SqlClient Data Provider
CPU: 290888
I/O (MB): 106
Last Req: 2013-10-01 21:27:08.693

So as far as I can see, SQL is actually just processing.
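
For anyone wanting to check the same thing on their own box, the request details above can be pulled straight from the DMVs with something roughly like this (SQL Server 2008 R2 and later) - it also shows the current wait type if there is one:

Code: Select all

-- What is the upgrade session actually doing right now?
SELECT r.session_id,
       r.status,
       r.command,
       r.wait_type,            -- NULL while the request is purely burning CPU
       r.cpu_time,
       r.logical_reads,
       DB_NAME(r.database_id)  AS database_name,
       t.text                  AS running_statement
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id > 50;       -- skip system sessions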

Checking the IO stats I get:

CPU ms/sec: 949
L. Writes/sec: 2
L. Reads/sec: 831404

The execution plan says the heavy-hitting operation is a key lookup against a clustered index with an estimated 80 million executions - this accounts for 79% of the query cost.
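
If I get a chance I may try putting a covering index on the DimSetID staging table to see whether it removes the key lookup - purely an experiment on my local copy (the upgrade toolkit creates and drops these tables itself), and only the first few DimVal columns are shown here:

Code: Select all

-- Experimental only: cover the lookup columns so the correlated subquery
-- doesn't need a key lookup per row (column list abbreviated)
CREATE NONCLUSTERED INDEX IX_upgrade355_DimSetID_DimVals
    ON [Company Database$upgrade355_DimSetID] (DimVal1, DimVal2, DimVal3)
    INCLUDE (DimSetID);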

In summary, it looks like SQL is doing something, but it's been doing this for hours and hours - is there anything I can do to speed it up?
Charleh

Comments

  • Marije_Brummel Member, Moderators Design Patterns Posts: 4,262
    I would first run the tool on a real server. If your laptop is faster than your dev server, it should not be that hard to find a decent server to run this on.

    Maybe you can run it on the customer's production machine on a Sunday or something like that.

    Otherwise you can go to upgrade service centers, which have servers that can run this routine.
  • pdj Member Posts: 643
    I know it does not mention your exact problem, but have you seen this article?
    http://www.dynamics.is/?p=1500
    Regards
    Peter
  • Charleh Member Posts: 44
    Well - after a few days of running on a dedicated server with no interruptions, it finally got past the Ledger Entry Dimensions

    I'm now getting an error on the Document Dimension table so I'm having to try and debug (it's not making a lot of sense at the moment) but that's another story!

    Let's hope it's quicker on the production server!

    Thanks for everyone's input
    Charleh
  • Charleh Member Posts: 44
    Ok, so after some messing about trying to figure out where it was going wrong, I found a load of old Document Dimension Archive records that didn't have any corresponding dimension values in the Dimension Value table (and therefore no Dimension Set Entry).

    Deleting these got everything going and finally it finished!
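
    For anyone who hits the same error, a check along these lines found the offending lines for me - archive records whose dimension value no longer exists (table and field names as in a standard database; the company prefix is a placeholder):

    Code: Select all

    -- Sketch: archive dimension lines with no matching Dimension Value record
    SELECT dda.*
    FROM [Company Database$Document Dimension Archive] AS dda
    LEFT JOIN [Company Database$Dimension Value] AS dv
           ON dv.[Dimension Code] = dda.[Dimension Code]
          AND dv.[Code] = dda.[Dimension Value Code]
    WHERE dv.[Code] IS NULL;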

    Thanks to anyone who had a look :)
    Charleh
  • Jagadeesh Member Posts: 1
    Hi,

    I am also stuck in the 2nd upgrade toolkit step, where the "Ledger Entry Dimensions" table is preparing to update.

    Please suggest a solution.

    My DB size is 110 GB. The "Ledger Entry Dimensions" table has 35,676,241 records.

    Thanks & Regards,
    Palla Jagadeesh.
  • Slawek_Guzek Member Posts: 1,690
    The code behind creates two temporary tables and a few indexes on them before writing Dimension Set Entry. More memory on the SQL Server may help with that.

    We have 67m entries in T355 and it takes about 35 minutes to convert - the test server I'm using has 4 vCPUs and 80 GB of RAM, and SQL Server on it is limited to 64 GB.
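
    The 64 GB cap is just the usual 'max server memory' setting, e.g. set via T-SQL like this (value is in MB):

    Code: Select all

    -- Limit the SQL Server buffer pool to 64 GB (value is in MB)
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'max server memory (MB)', 65536;
    RECONFIGURE;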
    Slawek Guzek
    Dynamics NAV, MS SQL Server, Wherescape RED;
    PRINCE2 Practitioner - License GR657010572SG
    GDPR Certified Data Protection Officer - PECB License DPCDPO1025070-2018-03
  • Mauritsio Member Posts: 27
    Some years ago I had a problem when running the data conversion for a database (from NAV 2005 to NAV 2009) - it took too much time. At that time I got a tool to disable secondary keys, which decreased the conversion time hugely. After the conversion I re-enabled the secondary keys.
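
    In NAV that is normally done by clearing MaintainSQLIndex on the secondary keys; straight on SQL Server the equivalent is disabling the non-clustered indexes before the conversion and rebuilding them afterwards - the index and table names below are examples only:

    Code: Select all

    -- Example only: disable one secondary (non-clustered) index before the conversion...
    ALTER INDEX [$1] ON [Company Database$Ledger Entry Dimension] DISABLE;

    -- ...and rebuild it once the conversion has finished
    ALTER INDEX [$1] ON [Company Database$Ledger Entry Dimension] REBUILD;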
  • kennie Member Posts: 34
    We have also had reports of partners/customers using the "Buffer pool extension" feature of SQL Server (2014 and later) with success during upgrades.

    See
    https://docs.microsoft.com/en-us/sql/database-engine/configure-windows/buffer-pool-extension
    for more info about the feature
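
    Enabling it is a single statement once there is fast SSD storage to point it at (SQL Server 2014 or later); the file path and size below are placeholders:

    Code: Select all

    -- Example: back the buffer pool with a file on a local SSD
    ALTER SERVER CONFIGURATION
    SET BUFFER POOL EXTENSION ON
        (FILENAME = N'E:\SSDCACHE\BufferPoolExtension.BPE', SIZE = 64 GB);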