Hi, I'm running Navision 3.70. Over the past few months it has been running very, very slowly. I'm seeing more and more locks in SQL Server, and they don't seem to be released as quickly as they should be. A maximum of 81 sessions are able to log on to Navision to access the data. My database information is as follows:
Database Used (KB): 56,598,080 (97%)
Database Size (KB): 58,579,200
Object Cache (KB): 32,000
Server Specs:
Intel Xeon 2.8GHz Dual Quad
4GB RAM
Windows Server 2003 64-Bit SP1
SQL Server 2000
Users are accessing it both via Terminal Server and locally via 100Mbit network.
If I run the "Optimize" function in Navision on some of my larger tables, will this help? I have a very large database.
What about the transaction log files? Do they have enough space so that you will be able to keep working? This could also be a cause of a slower system.
Do you perform any optimization on the SQL side? Do you use DBCC INDEXDEFRAG? DBCC DBREINDEX?
If you never optimize your database, the indexes become more and more fragmented and the database gets slower.
So any kind of optimization is always a good idea :-)
It would also be a good idea to expand the database, since 97% of the space is used (unless it grows automatically - then this is not needed ;-) )
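For example, on SQL 2000 you could check how fragmented the indexes of a large table are and defragment them while users keep working - a minimal sketch, where the database, table and index names are placeholders (take the real names from sp_helpindex or the SHOWCONTIG output):

  -- show fragmentation of all indexes on one NAV table
  DBCC SHOWCONTIG ('Your Company$Item Ledger Entry') WITH ALL_INDEXES
  -- online defragmentation of a single index (slower than a rebuild, but non-blocking)
  DBCC INDEXDEFRAG ('YourNavDb', 'Your Company$Item Ledger Entry', 'Your Company$Item Ledger Entry$0')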
I found that the TLog was huge compared to the database because we forgot to back up the transaction log, so SQL Server kept extending the TLog, causing slow performance.
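If the database is in the Full recovery model, frequent log backups are what keep the TLog from growing - a minimal sketch, with the database name and backup path as placeholders:

  BACKUP LOG [YourNavDb] TO DISK = 'E:\Backup\YourNavDb_log.bak'
  DBCC SQLPERF(LOGSPACE)   -- shows log size and percent used per database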
Fred
As bbrown says, the 97% is a bad sign. Not just in itself, but in my experience, when I come across a server with fundamental mistakes in its configuration, it generally indicates that there are many more to find.
You really need to get a SQL expert in to review your whole system - hardware and setup - and then start looking at Navision, because generally when the hardware is bad, bad software configuration follows close behind.
Well, actually the system specs do not even fulfil the minimum requirements according to the NAV Hardware Sizing Guide: with 80 users and a 56 GB database you should have ...
... 4 CPU
... at least 8GB RAM (better 16)
... Gigabit LAN
... SQL Server 2000 Enterprise Edition
Further, you need a sufficient disk subsystem. And of course, everything needs to be configured properly.
(
Regarding the TS (NAV Requirements only):
- 10 Users per CPU
- 64MB RAM per User
- 500MB DiskSpace
)
If you run SQL 2000 EE you should consider putting in more RAM! If you are running SQL 2000 STD, well then you are limited to just 2GB - which means you are screwed ](*,)
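To verify which edition you are actually on and how much memory SQL Server is configured to use, something like this should tell you (viewing 'max server memory' requires advanced options to be shown):

  SELECT SERVERPROPERTY('Edition') AS Edition,
         SERVERPROPERTY('ProductVersion') AS Version
  EXEC sp_configure 'show advanced options', 1
  RECONFIGURE
  EXEC sp_configure 'max server memory (MB)'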
As a sort of "first aid" you should look into SIFT optimization by reducing the number of SIFT buckets (see http://dynamicsuser.net/blogs/stryk/archive/2007/10/28/included-columns-vs-sift-tables.aspx)
Additionally it is crucial to run periodic system maintenance: defragment indexes (DBCC DBREINDEX), update statistics, and clean up the SIFT tables. The first two can be done via a SQL Maintenance Plan; the SIFT maintenance could be done with this http://www.mibuso.com/dlinfo.asp?FileID=812
(The "Table Optimizer" re-creates the indexes - similar to DBCC DBREINDEX - and cleans up the SIFT, but I'd prefer to run separate automatic jobs for those tasks rather than fiddling with the NAV GUI.)
Maybe you could also consider upgrading - just technically, i.e. the C/SIDE executables - to NAV 4.0 SP3+ (the latest version; see http://dynamicsuser.net/blogs/waldo/archive/2009/05/08/platform-updates-overview-3-70-b-nav2009.aspx)
With this version you would be able to use advanced C/AL commands (and more) to improve the general communication between NAV and SQL (bear in mind that higher versions such as 5.0 etc. are designed for SQL 2005!) ...
But then again: if the platform is too weak, you'll never get good performance, regardless of how much you tune the application ...
Regards,
Jörg
NAV/SQL Performance Optimization & Troubleshooting
STRYK System Improvement
The Blog - The Book - The Tool
We are running SQL Server 2000 Standard Edition, and our transaction log files have a size of 17.6GB.
Our largest table in Navision is the "Attachment" table, with a size of 31,548,952KB (31.5GB) and 423,106 records, so it probably makes up the bulk of the total database size of 60.7GB.
The database is on auto-grow...
Database Used (KB): 52,884,480 87%
Database Size (KB): 60,832,000
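If you want to cross-check those figures on the SQL side, sp_spaceused shows the space per table - the company prefix in the table name below is just an example, use your own:

  EXEC sp_spaceused                                   -- whole database
  EXEC sp_spaceused 'dbo.[Your Company$Attachment]'   -- one NAV table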
We have 6x SCSI hard disks configured in RAID5. I've run the "Optimize" function on some of my larger tables, for example Value Entry (2.9GB) and Item Ledger Entry (429MB), but it doesn't seem to have had much effect.
We've run quite a few queries and reports on our SQL Server, trying to find out which tables lock the most, because recently we've been getting a lot of locks that stop many of our users from posting transactions. The test results ended up being useless because they were collected during our scheduled report runs.
There seem to be a lot of locks on our ledger tables; for some reason it feels like SQL Server fails to release the resources/records in time, or does it too slowly, which results in locks and ultimately in the Navision client crashing. On large tables, when you open a form, the data renders quite slowly across the screen - and this is not via Terminal Services.
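If you want to see who is blocking whom while this happens, a quick and rough check on SQL 2000 is to query sysprocesses - a sketch:

  SELECT spid, blocked, waittime, lastwaittype, hostname, program_name
  FROM master..sysprocesses
  WHERE blocked <> 0    -- sessions currently waiting on a lock; "blocked" is the spid holding it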
We are currently looking into purchasing new hardware and software, including the 64-bit version of SQL Server, and getting much faster disks.
Are you implying that the DB and LOG are on a shared RAID 5 array?
We mounted the same database on SQL Server 2008 64-bit, and certain reports which operate on the ledger tables completed 4x as fast, while some ran twice as slow... (which was odd!).
I'm starting to think that this speed/performance issue is related to certain problem tables and their keys, and that optimization is required for the tables that are causing the locks. I don't believe it is a hardware issue or that SQL Server is broken.
With SQL Server 2000 Standard the server can only use 1.7 GB RAM at maximum!
Hence - for example - if some bad query provokes an index scan on "Attachment" (31GB), this totally screws up the cache, causing huge physical I/O, putting high pressure on the CPU etc. - effectively your whole system grinds to a halt ... ](*,)
And as David mentioned, the rest of the configuration is terribly insufficient, like using RAID5 or not having the transaction log isolated ...
It seems this server platform is simply not able to deal with this database and transaction volume. I dare to conclude that the only real solution is to upgrade the whole server - new hardware, new storage and a new SQL Server version/edition ...
Of course, you could try to fish out these poor queries using SQL Profiler and add optimized indexes for those queries to reduce the number of Reads ...
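If you save the Profiler trace to a file, you can load it into a table and sort by Reads to find the worst statements - a sketch, with the trace path as a placeholder and assuming the Reads/Duration/CPU columns were captured:

  SELECT TOP 20 TextData, Reads, Duration, CPU
  FROM ::fn_trace_gettable('C:\Traces\nav_trace.trc', default)
  ORDER BY Reads DESC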
Our server has 8 CPU cores - it's a dual quad-core Intel Xeon 2.8GHz - and although old, I think it could last another 5 years. Our Navision server has recently been upgraded from 4GB to 16GB RAM, so we just need the 64-bit version of SQL Server to test the performance and its utilisation of the RAM.
The tests were run after hours, however.... :shock:
Do not rely on Auto-Grow to expand data and transaction log files. Auto-Grow should only be considered a safety net. It is there in case an unexpectedly large process suddenly consumes the available file space. Watch the available space and expand manually during low system activity.
Auto-Grow runs when the free space reaches 0% not as it approaches 0%. It does not say “I’m running out of space. Let me create more before it’s all gone”. What it says is “Oh darn! I ran out of space. Let me take over the system and create some more.”. Unfortunately, it does effectively “take over” the system. While the files are being expanded the system (SQL) will not respond to users. To the users, NAV (SQL) will basically stop responding until the expansion is complete. How long depends on the server’s speed and how much expansion must be done. With larger databases it will get longer. Very large databases can effectively stop responding for several minutes. This is why you don’t depend on Auto-Grow. You want to manage the data size so Auto-Grow never runs.
A good rule-of-thumb is to maintain free space of between 20% and 40% of total database size. When the free space drops below 20% expand the data file (*.ndf) to bring the free space to 40%. Repeat the process when it drops below 20%.
Much like the data files, the transaction log size must be managed so that Auto-Grow does not run. The transaction log should be large enough to hold all transactions between backups without expanding. You should keep track of the file size to determine if it is auto-growing. If you find it is auto-growing there are two possible approaches.
• Increase the frequency of the transaction log backups. This helps to reduce the file space required.
• Expand the file using steps similar to above except increase the transaction log file size.
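A sketch of how you might check the free space and expand the files manually during a quiet period - the database name, logical file names and target sizes are placeholders, so check yours with sp_helpfile first:

  EXEC sp_helpfile            -- logical names, current sizes, growth settings
  EXEC sp_spaceused           -- reserved vs. unallocated space in the database
  DBCC SQLPERF(LOGSPACE)      -- log size and percent used for every database

  -- grow the data file to ~80 GB and the log to ~20 GB
  ALTER DATABASE [YourNavDb] MODIFY FILE (NAME = YourNavDb_1_Data, SIZE = 80000MB)
  ALTER DATABASE [YourNavDb] MODIFY FILE (NAME = YourNavDb_Log, SIZE = 20000MB)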
Database information in our Navision client is now reporting 100% usage on re-mount.
Our SQL Server fell over when the partition where our data files were stored ran out of space. The data file (*.ndf) is still set to auto-grow. I have since moved some of our other database files to another partition to free up more space for Navision, which let me re-mount our database.
I checked the size of the data file 2 days ago and it was at 65GB, but it has grown hugely in a very short period....?
It looks like you have serious issues with your database. You need to seriously consider getting a Navision/SQL expert in to look at it. Otherwise you could be heading for a disaster.
It's all well and good to "have a play around" and learn exciting new stuff, but not in a production environment.
It's your system and your call, but IMHO it's time to get out the cheque book on this one.
By what percentage do you have it set to auto-grow? It doesn't happen to be between 25 and 30%, does it?
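You can check the current growth setting without the GUI - a sketch (run it in the NAV database):

  EXEC sp_helpfile    -- the "growth" column shows e.g. '10%' or a fixed size
  SELECT name, size, growth,
         CASE WHEN status & 0x100000 <> 0 THEN 'percent' ELSE '8 KB pages' END AS growth_unit
  FROM sysfiles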