Optimise cache and database size

shogan@mila.ie Member Posts: 113
edited 2004-05-08 in Navision Attain
Hello all,

Over the past year there has been a general slowdown of the Attain clients on all PCs. We use Attain 3.01A.

The clients are a mix of Win98, 98SE and 2000 SP4 machines, 300 - 450MHz and 128MB RAM, with IE6SP1 and all windows updates installed.

Object and DBMS cache = 8000

The server:
IBM NetFinity x230 Series e-server
Windows 2000 SP2
512MB RAM
RAID 1
Swap file: 768 - 1536

Currently the database usage is 5899312 out of 7500000 (79%) - I try to keep it from going over 85% by expanding the database.
DBMS Cache = 332800
Object Cache = 8000

All NICs are 10/100 3Com 3C90X, and the switch is a 3Com 10/100 SuperStack 3300.

The number of connections typically ranges from 16 to 26, with many clients running two sessions of Attain at the same time (as we have two companies).

The database is spread out over three hard disks as follows:
D: 500000
E: 500000
F: 6500000

All drives, including C:, are NTFS formatted. McAfee NetShield 4.5 SP1 and other updates are also installed. Apart from a few utilities (RAID Manager, Atomic TimeSync, Veritas 8.6), nothing else critical runs on this server.

It appears that the database part on drive F: is constantly being expanded while the other files are not grown proportionately, so I believe most of the activity is happening there. I was told that if any of the files goes over 4GB, the database is no longer 'efficient'.

What I want to do is try to optimise the database and caches on both the server and clients. I believe that there is some formula that allows me to select the cache sizes on the server and the clients, based on their hardware and memory configurations.

Anyone have any ideas on how I can do this? How can I shift more of the work onto the other drives by expanding those files? I just want to be informed.
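
To put rough numbers on the above, here is a quick sketch in Python (I'm assuming the sizes reported in the database information window are KB, so the F: part is already well past 4GB):

# Quick back-of-envelope check of the current layout.
# Assumption: sizes above are KB, as shown in File -> Database -> Information.
parts_kb = {"D:": 500_000, "E:": 500_000, "F:": 6_500_000}
used_kb = 5_899_312

total_kb = sum(parts_kb.values())            # 7,500,000 KB
usage = used_kb / total_kb                   # ~0.787, i.e. the 79% above
headroom_kb = 0.85 * total_kb - used_kb      # ~475,688 KB of growth left before the 85% mark

four_gb_kb = 4 * 1024 * 1024                 # 4,194,304 KB
oversized = [drive for drive, kb in parts_kb.items() if kb > four_gb_kb]   # ['F:']

print(f"usage {usage:.1%}, headroom to 85%: {headroom_kb:,.0f} KB, parts over 4GB: {oversized}")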

Regards
Stephen

Comments

  • Savatage Member Posts: 7,142
    Scenarios:

    1. Creating a new database and restoring data.
    2. Expanding existing parts of a database
    3. Expanding using existing and newly created parts of a database
    4. Fixing a 'non optimized' database.

    1. Creating a new database and restoring data

    If you create a new database that is spread over several disks, you must close and reopen the database before restoring data. This will rebuild the 'List of Free Blocks', and the restored data will be spread equally over all disks, ensuring optimal performance.

    To spread the data equally over the new database, you must follow these steps:

    1. Create a new database and expand it with database parts of the same size.
    2. Close and reopen the database (rebuild the 'List of Free Blocks').
    3. Restore the backup.

    2. Expanding existing parts of a database

    If you expand existing parts of a database that is spread over several disks, you must close and reopen the database before using it again. This will update the 'List of Free Blocks' and ensure optimal performance.

    1. Expand all database parts by the same amount.
    2. Close and reopen the database (rebuild the 'List of Free Blocks').

    3. Expanding using existing and newly created parts of a database

    If you expand a database by adding one or more new database parts on new disk(s), you should follow the full procedure described in scenario 4. If you haven't done so, you must at least close and reopen the database so that Navision recognizes the new disk(s) and starts using them. If you don't close/reopen, Navision will continue to use the old 'List of Free Blocks', and it will take some time before the list wraps around and the new disk(s) come into use. The database is then 'non optimized', because data is not spread equally over the disks. Scenario 4 tells you how to fully optimize the database.

    4. Fixing a 'non optimized' database

    To spread the data equally over all parts of a database, follow these steps:

    a. Make a Navision backup
    b. Delete the database
    c. Create a new database with the required parts of the old database
    d. Close and reopen the database (rebuild 'List of Free Blocks')
    e. Restore the backup.

    This procedure is the optimal way of adding a new database part, but it is also the most time consuming.
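
    Applied to the original poster's 7500000 database on D:, E: and F:, the equal-parts rule in scenarios 1 and 4 would work out roughly like this (my arithmetic only, assuming the same three disks are kept):

    # Equal-size target if the 7,500,000 (KB) database is recreated over the same
    # three disks, per scenarios 1 and 4 above. Illustration only, not from the TechKnowledge.
    total_kb = 7_500_000
    disks = ["D:", "E:", "F:"]
    part_kb = total_kb // len(disks)            # 2,500,000 per part
    print({drive: part_kb for drive in disks})  # {'D:': 2500000, 'E:': 2500000, 'F:': 2500000}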
  • Savatage Member Posts: 7,142
    I'm sure this TechKnowledge has been posted before:
    Database files must be the SAME SIZE on all disks. For example, if a 2.1 GB database is placed over 3 disks, use 3 * 700MB parts. If the same database is expanded to 2.4 GB, expand 100MB per partition, making it 3 * 800MB parts.
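
    A quick sketch of that arithmetic (sizes in MB; the helper function is purely for illustration):

    # Splitting and expanding a database evenly over several disks (sizes in MB).
    def part_size(total_mb, disks):
        """Size of each part when total_mb is spread evenly over 'disks' parts."""
        return total_mb // disks

    print(part_size(2100, 3))   # 700 -> 3 * 700MB parts for the 2.1 GB database
    print(part_size(2400, 3))   # 800 -> expanding to 2.4 GB means +100MB per part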

    If you change the number of disks (database parts) you MUST do the following:
    1. Make a backup.
    2. Delete the database.
    3. Create a new database with the same database file parts sizes.
    4. Restore the backup.

    D. Allocate all available memory to the Navision Server cache. Use commitcache to speed up insert transactions.
    1. The installation program allocates only approximately 2/3 of physical memory to the server cache; to allocate more, you must change the server parameter CACHE.
    2. The installation program does not activate the commitcache. You must change the server parameter COMMITCACHE.
    3. If you activate commitcache, make sure you use a UPS to protect against power failures (otherwise you may lose transactions from the commit cache that have not yet been flushed to disk).

    Memory is a way to reduce the hard disk bottleneck.
    1. Use as much RAM as possible. Generally, allow at least 4 - 8 MB of memory per user for cache. Plan for approximately 200MB of cache (or more, since memory is rather inexpensive) for a 30-user system with at least 256MB of system RAM.
    2. The maximum Navision Server cache is 1GB. Therefore, there is no advantage to purchasing more than 2048MB of RAM, leaving 1GB for Windows and 1GB for Navision.
    3. MAKE SURE that the computer is not swapping, for example, after you increase the cache size.
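
    A small sketch of those sizing rules (the 4 - 8 MB per user figure and the 1GB ceiling come from the points above; the 256MB reserved for Windows is my own assumption, and the 26 sessions are the peak from the original post):

    # Server cache sizing per the guidelines above: 4 - 8 MB of cache per user,
    # capped at the 1 GB Navision Server maximum and at what RAM allows without
    # swapping. The 256 MB reserved for Windows is an assumption, not a rule from
    # the TechKnowledge.
    def cache_range_mb(users, ram_mb, os_reserve_mb=256):
        low, high = 4 * users, 8 * users
        ceiling = min(1024, ram_mb - os_reserve_mb)
        return min(low, ceiling), min(high, ceiling)

    print(cache_range_mb(30, 512))   # (120, 240) -> roughly matches the ~200MB for 30 users quoted above
    print(cache_range_mb(26, 512))   # (104, 208) for the 26 concurrent sessions on a 512MB server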

    E. Use a DEDICATED Navision server that is a stand-alone server (not PDC or BDC). If you have a non-dedicated Navision server computer, make sure that the programs are not competing for resources. NEVER run SQL server or Exchange server on the same computer with Navision server.

    F. Use a single processor computer. Allow Windows NT to use processor cache fully.

    G. See TechKnowledge 33353 for information on low bandwidth constraints.

    Note: Allowing low bandwidth connections for some users poses a risk of very bad performance, but if you must do this, do NOT allow those users to modify/insert records. If a low bandwidth client processes data, tables may be locked for a longer period of time, blocking every other user and slowing down the whole system.
  • RobertMo Member Posts: 484
    Note: Allowing low bandwidth connections for some users poses a risk of very bad performance, but if you must do this, do NOT allow those users to modify/insert records. If a low bandwidth client processes data, tables may be locked for a longer period of time, blocking every other user and slowing down the whole system.

    Slow clients (slow connection and/or slow PC) are in many cases the reason for bad performance of the whole system.
    ®obi