Database size problem

patrickT Member Posts: 2
Dear All,

I'm new to IT support for Navision.

At my company, we use Navision 4 on this server:

Windows 2003 Server Enterprise Edition
SQL Server 2000 Enterprise Edition SP3
RAM: 8 GB
Users: 122

The database size is 156 GB. I saw in another topic that the maximum size is 128 or 256 GB. The database grows by about 4 GB a month. I cannot lose detailed information.

1) Is there a way to reduce the database size without losing detailed information?

2) Another problem: when someone runs a multi-project report, Navision becomes very slow, because there is so much data in the database. Do you have a solution?

3) Is it possible to have a live copy of the database on another server? The other server would only run reports.

Sorry for my English.

Thanks.
Patrick Turcotte

Comments

  • kriki Member, Moderator Posts: 9,118
    patrickT wrote:
    1) Is there a way to reduce the database size without losing detailed information?
    Some possibilities:
    - Reprogram some things (especially customizations) so that you can remove some indexes on the BIG tables.
    - Optimize the tables. The downside is that creating new records in them can become slower (e.g. all the Entry tables).
    - Create a new table (with fewer fields, if you can afford to lose some, and with fewer/smaller indexes) and move records from the big tables into it. Your reports can then read these tables instead.
    patrickT wrote:
    2) Another problem: when someone runs a multi-project report, Navision becomes very slow, because there is so much data in the database. Do you have a solution?
    Run those reports when no one is working, e.g. during the night.
    Do those reports write a lot of information? In that case you can add some code to keep the server from being overloaded with writes from the report; see http://www.mibuso.com/forum/viewtopic.php?t=6945
    patrickT wrote:
    3) Is it possible to have a live copy of the database on another server? The other server would only run reports.
    Yes, you can use the HOTCOPY to make a backup.


    Another possibility is to consider SQL Server. But you need a bigger server, and you have to review the indexes on the tables (many of them you don't need to maintain, and many you can replace with new, smaller ones while no longer maintaining the old ones).
    Regards,Alain Krikilion
    No PM,please use the forum. || May the <SOLVED>-attribute be in your title!
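    kriki's third suggestion (moving old records from the big tables into a slimmer table that reports can still read) can be sketched like this. This is a hypothetical Python model only: the field names and record layout are made up, and real C/AL would operate on actual ledger tables rather than dicts.

    ```python
    from datetime import date

    def archive_old_entries(entries, cutoff, keep_fields):
        """Move entries posted before `cutoff` into a slimmer archive list,
        keeping only the fields named in `keep_fields`."""
        archive, remaining = [], []
        for rec in entries:
            if rec["posting_date"] < cutoff:
                # Slim copy: drop fields the reports no longer need.
                archive.append({f: rec[f] for f in keep_fields})
            else:
                remaining.append(rec)
        return remaining, archive

    # Hypothetical ledger rows standing in for a big Entry table.
    entries = [
        {"no": 1, "posting_date": date(2004, 5, 1), "amount": 100.0, "description": "old"},
        {"no": 2, "posting_date": date(2006, 2, 1), "amount": 250.0, "description": "new"},
    ]
    live, old = archive_old_entries(entries, date(2005, 1, 1),
                                    ["no", "posting_date", "amount"])
    ```

    The space win comes from the archive rows carrying fewer fields and needing fewer (or no) indexes.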


  • Kowa Member Posts: 925
    1. Only by disabling those keys which are not absolutely necessary. Whether this is possible depends on your specific demands and customizations, and it requires a long investigation: the code has to be checked for all possible SETCURRENTKEY commands in every object (table, form, report, codeunit etc.) before a key can be disabled. The tables can then be optimized to reduce their size further, but this will have to be repeated regularly to keep the optimization level high.

    2. Faster hardware :), and not only the server; the clients too. Use the fastest client hardware available to create the reports.

    3. You can use Hotcopy (a backup program) to make a copy of a live database onto another server. It is not truly live, but close. Hotcopy uses considerable resources and can still take some hours on large databases, depending on the network, and the reports created from the database copy will of course only show the data up to the point when the backup was started. The fastest way is to stop the Navision server service and copy the database parts directly onto the other server.
    Kai Kowalewski
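    The SETCURRENTKEY check Kowa describes can be partly automated by scanning exported object text for the calls before disabling a key. A sketch, assuming a hypothetical text export; it only illustrates the search step, not a full C/AL parser:

    ```python
    import re

    def find_setcurrentkey_calls(text):
        """Return the field lists used in SETCURRENTKEY(...) calls
        found in exported object text."""
        return re.findall(r'SETCURRENTKEY\(([^)]*)\)', text)

    # Hypothetical fragment of a C/AL text export.
    object_text = '''
    SortingOrder := TRUE;
    Rec.SETCURRENTKEY("Posting Date","Document No.");
    Rec.SETCURRENTKEY("Item No.");
    '''

    calls = find_setcurrentkey_calls(object_text)
    # Any key whose field list never appears here is a candidate for disabling.
    ```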
  • kine Member Posts: 12,562
    Hotcopy - only for Native DB, not MS SQL... ;-)

    Running the OPTIMIZE function on some tables can save space, for example on "work" tables like sales lines, where SIFT tables exist. It deletes buckets with a sum of 0 (and after some time there are a lot of such records).

    Optimizing indexes (keys) is the basis for saving space. You only need a few maintained indexes with good selectivity; the other keys are needed only for sorting in Navision, and you do not need to keep them on SQL. Optimizing SIFT levels saves space too: you only need to keep the levels you are actually using, and each only once (in some cases the same fields appear in a level of another key, only in a different sequence, and you need to keep them only once).

    And of course, dimensions are space-killers...
    Kamil Sacek
    MVP - Dynamics NAV
    My BLOG
    NAVERTICA a.s.
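    The zero-bucket effect kine describes can be illustrated with a toy model. This is a simplified Python stand-in for one SIFT level (the key structure and the OPTIMIZE behavior are assumptions for illustration, not the real storage format):

    ```python
    from collections import defaultdict

    class SiftLevel:
        """Toy SIFT level: pre-aggregated sums keyed by a dimension tuple."""

        def __init__(self):
            self.buckets = defaultdict(float)

        def post(self, key, amount):
            # Every posting updates the pre-aggregated bucket.
            self.buckets[key] += amount

        def optimize(self):
            # Drop buckets whose sum is 0, as the OPTIMIZE function does;
            # the bucket row otherwise lingers even when fully consumed.
            self.buckets = defaultdict(
                float, {k: v for k, v in self.buckets.items() if v != 0}
            )

    level = SiftLevel()
    level.post(("ITEM1", "BLUE"), 10.0)
    level.post(("ITEM1", "BLUE"), -10.0)   # fully consumed -> sums to 0
    level.post(("ITEM2", "RED"), 5.0)
    level.optimize()                       # the zero bucket is pruned
    ```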
  • chen_wan Member Posts: 34
    To reduce the database size, you can shrink the database from SQL Server. Navision tends to grow the log file very large. After shrinking the database, make sure you increase the log file back to a certain size, depending on how heavy your transactions are. This is very important, because if your log file is too small, SQL Server will expand the file every time you hit its size limit, and that is a very slow process. In short: you can shrink the log file to reduce the database size, but you cannot make it too small.

    If you want to make a copy of the database on another server for reporting only, you can do so with SQL Server Replication.

    You can also rebuild indexes in SQL Server to improve performance on very big tables.
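    As a sketch of the shrink-then-regrow sequence chen_wan recommends, this hypothetical Python helper just builds the T-SQL. The database and logical log file names are made up, and on a real server you would review the statements before running them:

    ```python
    def log_resize_script(db_name, log_file, target_mb):
        """Build T-SQL that shrinks a log file and then pre-sizes it again,
        so SQL Server does not autogrow it in many small, slow steps."""
        return (
            f"USE [{db_name}];\n"
            f"DBCC SHRINKFILE ([{log_file}], {target_mb});\n"
            f"ALTER DATABASE [{db_name}] MODIFY FILE "
            f"(NAME = [{log_file}], SIZE = {target_mb}MB);"
        )

    # Hypothetical names; check your own logical file name with sp_helpfile.
    script = log_resize_script("Navision", "Navision_Log", 4096)
    ```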
  • Kowa Member Posts: 925
    patrickT wrote:
    I saw in another topic that the maximum size is 128 or 256 GB.
    After reading this, I was assuming that the Navision native server is being used on this hardware. If you are using Navision on SQL Server, these size limits do not apply.
    Kai Kowalewski
  • Marije_Brummel Member, Moderators Design Patterns Posts: 4,262
    I just heard that the 256 GB limit has also been exceeded. :mrgreen:
  • Marije_Brummel Member, Moderators Design Patterns Posts: 4,262
    Hey! I cannot edit my reply! :shock:

    The largest database I've heard of was more than 400 gigs.
  • Saro Member Posts: 58
    Indexed search time is O(log n); non-indexed search time is O(n).

    This means that if your tables are not indexed (sorted by the searched field),
    a search over a million records takes up to a million units of time,
    whereas if the tables are indexed, a binary search takes about log2(1,000,000) ≈ 20 units of time (and a wide B-tree index needs even fewer page reads).
    Taking this into consideration, you can improve the speed of your reports by adding the necessary keys to the needed tables.

    As for the size, normalization is what you need: check all the tables and delete duplicate and unneeded data. You can use FlowFields, for example, to show the customer name on a certain form instead of having a dedicated field for it; FlowFields do not take up space in the database.

    I hope this helps.
    Saro
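    Saro's O(n) vs O(log n) point can be demonstrated directly. A Python sketch comparing a linear scan with a binary search over sorted (i.e. "indexed") data:

    ```python
    import bisect

    def linear_search(records, target):
        """O(n): examine records one by one; returns the comparison count."""
        for steps, value in enumerate(records, start=1):
            if value == target:
                return steps
        return -1

    def indexed_search(sorted_records, target):
        """O(log n): binary search on sorted data (~log2(n) probes)."""
        i = bisect.bisect_left(sorted_records, target)
        if i < len(sorted_records) and sorted_records[i] == target:
            return i
        return -1

    n = 1_000_000
    records = list(range(n))

    # Worst case for the scan: the target is the last record.
    steps_linear = linear_search(records, n - 1)
    pos = indexed_search(records, n - 1)  # bisect needs only ~20 probes
    ```

    The same asymmetry is why a missing key on a filtered field turns a report into a full table scan.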
  • kine Member Posts: 12,562
    Saro wrote:
    As for the size, normalization is what you need: check all the tables and delete duplicate and unneeded data. You can use FlowFields, for example, to show the customer name on a certain form instead of having a dedicated field for it; FlowFields do not take up space in the database.

    I hope this helps.

    Normalized tables are not optimal for performance. Sometimes it is better to have a value copied directly into another table so that you can filter on it directly. In that case you can also build FlowFields with SUMs that filter on this field; with fully normalized tables you cannot do that, and you need to GET the related record, use conditions, etc.

    You need to find a balance between performance and size (for performance, the best is one table with all related data in one record; for size, many tables without duplication).
    Kamil Sacek
    MVP - Dynamics NAV
    My BLOG
    NAVERTICA a.s.
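    kine's trade-off can be sketched in a few lines. Hypothetical customer/ledger data in Python, comparing a direct filter on a copied (denormalized) field with the per-row lookup that normalized data forces:

    ```python
    # Hypothetical customer table (number -> name).
    customers = {"C001": "Adatum", "C002": "Cronus"}

    # Normalized ledger rows carry only the customer number.
    normalized = [
        {"cust_no": "C001", "amount": 100.0},
        {"cust_no": "C002", "amount": 50.0},
        {"cust_no": "C001", "amount": 25.0},
    ]

    # Denormalized rows copy the name onto every entry (bigger, but filterable).
    denormalized = [dict(r, cust_name=customers[r["cust_no"]]) for r in normalized]

    # Denormalized: filter directly on the copied field, one pass, no lookups.
    total_direct = sum(r["amount"] for r in denormalized
                       if r["cust_name"] == "Adatum")

    # Normalized: every row needs a "GET" into the customer table first.
    total_joined = sum(r["amount"] for r in normalized
                       if customers[r["cust_no"]] == "Adatum")
    ```

    Both totals agree; the denormalized layout trades the extra field's storage for skipping the per-row lookup.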
  • Saro Member Posts: 58
    That would work if we had a small number of records. However, once you hit the 50 million mark (and I'm guessing a 150 GB database has a lot more than that), the trade-off between performance and size changes dramatically: because indexed search grows only logarithmically, it is better to issue a single GET while running through a few million records than to add 10 extra fields to every record just to save that GET. Our database hasn't even a tenth of that number of records, and I was able to reduce its size from 4.8 GB to 2.8 GB, and the performance got even better!
    Saro