I'm currently trying to optimize indexes based on "expensive" queries found using scripts from Lars Lohndorf-Larsen (http://blogs.msdn.com/b/nav/archive/200 ... story.aspx).
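For context, as far as I can tell those scripts are built on the query statistics DMVs; the snippet below is only my own rough sketch of that kind of query (SQL Server 2005 or later), not Lars' actual script:

-- Top statements by logical reads since the plan cache was last cleared
SELECT TOP (20)
    qs.total_logical_reads,
    qs.execution_count,
    qs.total_worker_time,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
             WHEN -1 THEN DATALENGTH(st.text)
             ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_logical_reads DESC;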
The database is rather big, so I can't just create and drop indexes in the live database; I would like to do some testing in a test database first. However, after restoring the live db into a new test db, SQL Server doesn't show the same estimated execution plan in the two databases.
Both the data and the statistics are identical, so how can the estimated execution plans be different?
And how do I fix this, so that I can reproduce the same situation in my test db?
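In case it matters, this is roughly how I'm comparing the two databases (the table, column and index names below are just placeholders for one of the expensive NAV queries):

-- Capture the estimated plan as XML without executing the query;
-- run this in each database and compare the two XML plans
SET SHOWPLAN_XML ON;
GO
SELECT [Entry No_] FROM dbo.[G_L Entry] WHERE [Posting Date] >= '2010-01-01';
GO
SET SHOWPLAN_XML OFF;
GO
-- Compare what the optimizer sees for the index used in the plan
DBCC SHOW_STATISTICS ('dbo.[G_L Entry]', 'MyIndex') WITH STAT_HEADER, HISTOGRAM;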
Thanks in advance.
Comments
The data you need is in RAM, which obviously doesn't come across with a restore. Maybe there is some way, but I don't think so ... though I'd be very interested if there is. What you would need to do is generate artificial load to replicate the circumstances that got the live system to where it is.
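If you want to see the difference for yourself, something like this shows roughly how much of each database is sitting in the buffer cache right now (just a quick sketch, run it on both servers and compare):

-- Pages currently cached in the buffer pool, per database
SELECT DB_NAME(database_id) AS database_name,
       COUNT(*) * 8 / 1024  AS cached_mb
FROM sys.dm_os_buffer_descriptors
GROUP BY database_id
ORDER BY cached_mb DESC;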
Thanks for your answer, but do you know what this information is called? Just so I know what to search for... :?
I guess I could do a long recording with the Profiler and replay it in the test db, but I would like to avoid slowing down the live db more than necessary. I guess I'll have to do my testing on the live db outside working hours if I don't find a way to copy this information.
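If I do end up recording on the live server, I'd probably script a server-side trace to a file rather than run the Profiler GUI, to keep the overhead down. Something along these lines (untested sketch; the file path and the 100 ms duration filter are just examples):

DECLARE @traceid int, @maxsize bigint, @on bit, @minduration bigint;
SET @maxsize = 512;          -- MB per rollover file
SET @on = 1;
SET @minduration = 100000;   -- 100 ms (Duration is reported in microseconds)

-- Option 2 = file rollover; the .trc extension is added automatically
EXEC sp_trace_create @traceid OUTPUT, 2, N'D:\Traces\nav_tuning', @maxsize, NULL;

-- SQL:BatchCompleted (12) and RPC:Completed (10): TextData, Duration, Reads, CPU
EXEC sp_trace_setevent @traceid, 12, 1,  @on;
EXEC sp_trace_setevent @traceid, 12, 13, @on;
EXEC sp_trace_setevent @traceid, 12, 16, @on;
EXEC sp_trace_setevent @traceid, 12, 18, @on;
EXEC sp_trace_setevent @traceid, 10, 1,  @on;
EXEC sp_trace_setevent @traceid, 10, 13, @on;
EXEC sp_trace_setevent @traceid, 10, 16, @on;
EXEC sp_trace_setevent @traceid, 10, 18, @on;

-- Only keep statements that ran at least as long as the threshold
EXEC sp_trace_setfilter @traceid, 13, 0, 4, @minduration;

EXEC sp_trace_setstatus @traceid, 1;   -- start the trace
SELECT @traceid AS traceid;            -- keep this to stop (status 0) and close (status 2) later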
Peter
The only way I have found to correctly tune a database is to do it on the live system. I use the SQL Perform Analysis tool, which sits in the background and gathers all the statistics, but it is of no use in a test db, for exactly the reasons you mention.
And then comes blocking...
And then the channel to your storage array...
I guess I'll have to do it in the live db, if the customer can find a service window I can live with.
Peter
RIS Plus, LLC