Dataport Import Performance

jesterchris Member Posts: 21
I'm trying to import 100K records into the Contact table (5050) and the performance is very poor. No calculations or anything, simply name, address, etc. 40K records takes roughly 4 hours. I have tried increasing cache settings, but no big difference. Is there any way to speed up the load time? I'm thinking I'm just missing some magic setting; it can't possibly take this long for real. I have literally loaded billions of records in that kind of time with Informix.

Comments

  • Alex_Chow Member Posts: 5,063
    The easiest way to speed up the import process is to break up the files into smaller files.

    With any system, importing a large file will take a long time.
  • jesterchris Member Posts: 21
    Thanks for the reply. Do you know how to point the dataport to multiple files? I'm pretty sure I read that someone did that; I think it was Luc Van Dyck. Is it possibly as simple as a delimiter between file names in the dataport's FileName property?
  • Luc_VanDyck Member, Moderator, Administrator Posts: 3,633
    Do I hear my name ;-)

    You are probably referring to this (very old) thread: Import ASCII-file 150.000 lines (50 MB). I'll see if I can find the code in some database.
    No support using PM or e-mail - Please use this forum. BC TechDays 2024: 13 & 14 June 2024, Antwerp (Belgium)
  • ara3n Member Posts: 9,257
    You could also run the client on the server when you import; it will run a lot faster. Also, try not to VALIDATE, and move any needed code into your dataport. Is this SQL or Native? Also, put a COMMIT after every 20K records.
    Ahmed Rashed Amini
    Independent Consultant/Developer


    blog: https://dynamicsuser.net/nav/b/ara3n
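    As a side note, the payoff from batching commits can be sketched outside NAV. This is a hypothetical illustration using Python's sqlite3 module, not NAV code; the table and batch size are made up:

```python
import sqlite3

def import_rows(rows, batch_size=20000):
    """Insert rows, committing once per batch instead of once per row.

    Fewer commits means fewer expensive flushes, which is the same
    reason a COMMIT every 20K records speeds up a long dataport run.
    """
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE contact (name TEXT, address TEXT)")
    for i, row in enumerate(rows, start=1):
        conn.execute("INSERT INTO contact VALUES (?, ?)", row)
        if i % batch_size == 0:
            conn.commit()  # flush a whole batch at once
    conn.commit()          # commit the final partial batch
    return conn
```

    The batch size is a trade-off between speed and how much work is lost if the import dies part-way through.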
  • jesterchris Member Posts: 21
    Thanks Luc!

    Actually I tried on both Native and SQL. In production it will be SQL.
    I don't have any code in the dataport. I'm migrating this data from KMA. I will try running the client on the server too. Still pretty new to Navision, lots to learn. Thanks.
  • zeninoleg Member Posts: 236
    I am very curious how you managed to break the file into smaller files. I am facing an import of 3,000,000 history transactions through the Item Journal. It is going to be fun. :roll:
    The problem we had is that even though we use 50,000-line files, for some reason the dataport inserts fewer records than that (it stops somewhere) and says the import is complete and successful. However, I know that only the first x records got imported... and this "x" amount is different every time! :evil:
    So I have converted this dataport into a report, and it looks like that works. But the ideal situation is to create many small files. Is there a utility to do that?
    Best Regards,
    Oleg
  • ara3n Member Posts: 9,257
    The dataport issue where it exits claiming the import completed, I've experienced that as well. I think it happens if there are special characters in the file. Try to open the file in Notepad and save it again.

    There is HJSplit, which splits based on size. So afterwards you have to check and make sure that it didn't split the last line of a chunk in two. Otherwise it works fine.
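    If you do use a size-based splitter, that boundary check can be automated. A hypothetical Python sketch (the function name and chunk handling are my own, not part of HJSplit) that carries a trailing partial line onto the front of the next chunk:

```python
def repair_split(parts):
    """Make every chunk from a size-based split end on a complete line
    by carrying any trailing partial line into the next chunk."""
    fixed, carry = [], b""
    for chunk in parts:
        chunk = carry + chunk
        head, sep, carry = chunk.rpartition(b"\n")
        fixed.append(head + sep)  # complete lines up to the last newline
    if carry:                     # input didn't end with a newline
        fixed.append(carry)
    return [c for c in fixed if c]

print(repair_split([b"one\ntw", b"o\nthree\n"]))  # [b'one\n', b'two\nthree\n']
```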
  • jesterchris Member Posts: 21
    I have UNIX experience, so I would use split on UNIX. You can get UNIX tools for Windows that seem to work pretty well. I feel your pain regarding weird behavior when dealing with flat files on Windows.
  • lubost Member Posts: 632
    My experiences with importing large files:

    1. Split the large file into a number of smaller ones - true, but sometimes hard to do.
    2. Make a COMMIT after every 1K (10K, ...) records - highly recommended.
    3. Make sure there is appropriate free space in the database.
    4. Invoke only the necessary VALIDATE (OnInsert) triggers.
    5. Import the records into a new, empty table and then move them into the right table.
    6. Make sure that the client and the imported file are on the same computer.
  • ObiWan Member Posts: 21
    Hi, have you considered using DTS in MS SQL Server?
    It should work fine (and fast) if you don't have any SumIndexFields in the table. I don't believe Navision is able to calculate those fields if there are any.


    Best regards ObiWan :mrgreen:
  • Luc_VanDyck Member, Moderator, Administrator Posts: 3,633
    I have found some code snippets:

    1. The most important part is that you split up the large text file into smaller files. But you must be sure to split the file at the end of a line, and not in the middle of a line. Because at the time I couldn't find a program to split files into chunks of e.g. 1,000 lines rather than equal chunks of e.g. 1 MB, I wrote something in Visual Basic 6:
    Option Explicit
    Private Sub Command1_Click()
      Const intNoLines = 10000
      
      Dim File1 As String, File2 As String, FileNamepart1 As String
      Dim FileNamepart2 As String, TextLine As String
      Dim i As Integer, intFileNumber As Integer
      Dim dblCounter As Double, dblCounter2 As Double
      
      File1 = "c:\temp\items.txt"
      File2 = ""
      
      ' Strip the directory: scan backwards for the last backslash
      For i = Len(File1) To 1 Step -1
        If Mid$(File1, i, 1) = "\" Then
          File2 = Mid$(File1, i + 1)
          Exit For
        End If
      Next i
      If File2 = "" Then
        File2 = File1
      End If
      
      FileNamepart1 = ""
      FileNamepart2 = ""
      For i = 1 To Len(File2)
        If Mid$(File2, i, 1) = "." Then
          FileNamepart1 = Mid$(File2, 1, i - 1)
          FileNamepart2 = Mid$(File2, i, 999)
          i = 999
        End If
      Next i
      
      intFileNumber = 0
      dblCounter = 0
      dblCounter2 = 0
      
      Open File1 For Input As #1
      Do While Not EOF(1)
        Line Input #1, TextLine
        dblCounter = dblCounter + 1
        dblCounter2 = dblCounter2 + 1
    
        If (dblCounter2 > intNoLines) Or (dblCounter = 1) Then
          intFileNumber = intFileNumber + 1
          dblCounter2 = 1
          Close #2
          File2 = FileNamepart1 & "_" & intFileNumber & FileNamepart2
          Open File2 For Output As #2
        End If
        Print #2, TextLine
      Loop
      Close #1
      Close #2
    
    End Sub
    
    This VB routine will generate files with names items_1.txt, items_2.txt etc.
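    For anyone without VB6, roughly the same splitter can be written in a few lines of Python (my own sketch, mirroring the routine above: items.txt becomes items_1.txt, items_2.txt, ...):

```python
import os

def split_file(path, lines_per_file=10000):
    """Split a text file into numbered files of complete lines."""
    base, ext = os.path.splitext(path)
    names, out, written, n = [], None, 0, 0
    with open(path) as src:
        for line in src:
            if out is None or written == lines_per_file:
                if out:
                    out.close()
                n += 1
                names.append("%s_%d%s" % (base, n, ext))
                out = open(names[-1], "w")  # start the next chunk
                written = 0
            out.write(line)
            written += 1
    if out:
        out.close()
    return names
```

    Because it reads line by line, a chunk can never end in the middle of a line.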

    2. This dataport will start with the first file items_1.txt, and after the import it will check if another file with the next number exists. If it finds such a file, it will run itself and continue with that one, then the next one, etc.
    OBJECT Dataport 50000 Import Items
    {
      OBJECT-PROPERTIES
      {
        Date=01/08/06;
        Time=[ 9:08:31];
        Modified=Yes;
        Version List=;
      }
      PROPERTIES
      {
        Import=Yes;
        UseReqForm=No;
        OnInitDataport=BEGIN
                         IF txtFileName = '' THEN
                           txtFileName := 'c:\temp\items_1.txt';
                       END;
    
        OnPreDataport=BEGIN
                        CurrDataport.FILENAME := txtFileName;
                      END;
    
        OnPostDataport=VAR
                         ltxtFilepart1@1000000006 : Text[250];
                         ltxtFilepart2@1000000002 : Text[250];
                         lintPos1@1000000007 : Integer;
                         lintPos2@1000000008 : Integer;
                         lintFileNumber@1000000001 : Integer;
                         ldtpImport@1000000000 : Dataport 50000;
                       BEGIN
                         WHILE TRUE DO BEGIN
                           COMMIT;
    
                           lintPos1 := STRPOS(txtFileName,'_');
                           lintPos2 := STRPOS(txtFileName,'.');
                           ltxtFilepart1 := COPYSTR(txtFileName,1,lintPos1);
                           ltxtFilepart2 := COPYSTR(txtFileName,lintPos2);
                           EVALUATE(lintFileNumber,COPYSTR(txtFileName,lintPos1 + 1,lintPos2 - lintPos1 - 1));
                           lintFileNumber := lintFileNumber + 1;
                           txtFileName := ltxtFilepart1 + FORMAT(lintFileNumber) + ltxtFilepart2;
                           IF EXISTS(txtFileName) THEN BEGIN
                             ldtpImport.fctSetFileName(txtFileName);
                             ldtpImport.FILENAME(txtFileName);
                             ldtpImport.RUN;
                           END;
    
                           EXIT;
    
                         END;
                       END;
    
      }
      DATAITEMS
      {
        { PROPERTIES
          {
            DataItemTable=Table27;
          }
          FIELDS
          {
          }
           }
      }
      REQUESTFORM
      {
        PROPERTIES
        {
          Width=9020;
          Height=3410;
        }
        CONTROLS
        {
        }
      }
      CODE
      {
        VAR
          txtFileName@1000000000 : Text[250];
    
        PROCEDURE fctSetFileName@1000000000(ptxtFileName@1000000000 : Text[250]);
        BEGIN
          //fctSetFileName
    
          txtFileName := ptxtFileName;
        END;
    
        BEGIN
        END.
      }
    }
    
    
    This is not a complete dataport, but it shows the overall structure. The important parts are the code inside the OnInitDataport, OnPreDataport and OnPostDataport triggers.
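    The chaining in the OnPostDataport trigger can be sketched in Python as well (a hypothetical illustration of the same idea, not NAV code; `process` stands in for whatever does the actual import):

```python
import os

def process_numbered_files(first_file, process):
    """Start with e.g. items_1.txt and keep going while a file with
    the next number exists, like the dataport calling itself."""
    base, ext = os.path.splitext(first_file)  # ".../items_1", ".txt"
    stem, _, num = base.rpartition("_")       # ".../items", "1"
    n, path, handled = int(num), first_file, []
    while os.path.exists(path):
        process(path)                         # do the actual import
        handled.append(path)
        n += 1
        path = "%s_%d%s" % (stem, n, ext)     # next numbered file
    return handled
```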

    Hope this is somewhat useful.
  • amer Member Posts: 22
    Be careful with DTS when importing ledgers!
    See
    http://www.mibuso.com/forum/viewtopic.php?t=9615
  • kriki Member, Moderator Posts: 9,118
    amer wrote:
    Be careful with DTS when importing ledgers!
    See
    http://www.mibuso.com/forum/viewtopic.php?t=9615
    NEVER import directly into the ledger entry tables; put the records in the journal tables and post them afterwards.

    To import faster: disable all secondary keys and/or FlowFields you don't need!
    After the import, enable them again.
    Regards,Alain Krikilion
    No PM,please use the forum. || May the <SOLVED>-attribute be in your title!


  • amer Member Posts: 22
    I agree absolutely.
    (Except when you need to recover the situation when one of the developers accidentally deletes a couple of GL ledger entries :D )
    Regards,
    Amer
  • Miklos_Hollender Member Posts: 1,598
    I am not so sure splitting up is really needed - I imported about 100,000 Items and Variants together in about 20 minutes. It was on SQL Server, so maybe you could use that; I was using the method described here: http://www.mibuso.com/dlinfo.asp?FileID=579
  • Alex_Chow Member Posts: 5,063
    If you're only importing 2 fields, it'll be okay.

    The performance slows down dramatically if you have additional fields to import.