slow dataport

junkmailtrap Member Posts: 10
I am not particularly familiar with Navision, but I've been pulled into a project where they are having issues with the speed of a dataport.

The file being imported has 19,000 records. Run from a client against the server it takes somewhere around 3-4 hours; when the client is run on the server itself it takes about 30-40 minutes. The vendor who is implementing it assures me that the problem is not with Navision or with the dataport code but rather with the environment (either the OS or some hardware issue). As a test I set up the server and client on two new, clean test machines connected to a dedicated switch, completely isolated from the rest of the network. The speed of the import remained unchanged. However, the vendor again assures me that in his test environment it runs in 40 minutes and he "didn't do anything special" to set it up. At this point the vendor is suggesting we move the database to SQL Server and hope it runs faster there. The plan was to move to SQL Server anyway, so that's not a big deal, but I have my doubts that it'll improve all that much without redesigning the import.

So anyway, is this type of performance normal for Navision and an import of this size?
What can be done to improve the performance?
How much is moving to SQL Server likely to help?
Given that there is no indication of any resource utilization problems, is the problem more likely to be with the server environment, with Navision itself, with the dataport design, or with something else entirely?

Any insight into this would be wonderful.

Thanks,

Oh... It appears that we're on Navision 5.00 so I hope this is the right forum.

Comments

  • Savatage Member Posts: 7,142
    The quickest & easiest solution is to break up your one 19,000-line file into 19 files of 1,000 lines each.

    Or use COMMIT after every 1,000 records.

    The bigger the file, the slower the dataport gets; that's just how it is.
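    The COMMIT-every-1,000-records idea can be sketched in C/AL roughly like this (a minimal sketch; the LineCount variable and the batch size are illustrative):

    ```
    // Dataport - OnAfterImportRecord()
    // LineCount is a global Integer. Committing every 1,000 records
    // keeps the open transaction (and the server's write buffer) from
    // growing across the whole 19,000-record run.
    LineCount := LineCount + 1;
    IF LineCount MOD 1000 = 0 THEN
      COMMIT;
    ```

    Note that intermediate COMMITs mean a failed run leaves the file partially imported, so the dataport needs to be safe to re-run.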
  • FDickschat Member Posts: 380
    Dataports were never known for their lightning speed. As Savatage said, the bigger the file (this also means the longer the records) the slower a dataport gets.

    If a single line in this file will not exceed 1024 characters it could be easily imported using a report and manual programming. If it's longer than 1024 this can also be done but is a little more work. Creating such a report will take ~1/2 - 1 day depending on the number of fields in a line. Such a solution will be up to 20 times faster than a dataport (but is also 10 times more expensive).

    If this Interface is to be used more than once I would always go for the report. Otherwise a dataport is just so much cheaper that I would not care about the time it takes.
    Frank Dickschat
    FD Consulting
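    A rough sketch of the report-based import Frank describes, assuming each line fits in 1024 characters (the file name, variable names, and parsing step are placeholders):

    ```
    // Report - OnPreReport()
    // ImportFile is a variable of type File, Line is Text[1024].
    ImportFile.TEXTMODE(TRUE);
    ImportFile.OPEN('C:\import\deposits.txt');
    WHILE ImportFile.POS < ImportFile.LEN DO BEGIN
      ImportFile.READ(Line);
      // Parse Line into fields and fill/INSERT the target record here.
      // Reads are sequential, so there is no re-scanning of the file
      // as the line count grows.
    END;
    ImportFile.CLOSE;
    ```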
  • David_Singleton Member Posts: 5,479
    I agree with Harry's and Frank's comments, but let me add a few things.

    Server vs. client speed has always been an issue, one that Navision/PC&C/Microsoft never addressed. Dataports will be gone soon anyway, so don't expect any fixes.

    In the early days of Navision one of the biggest issues was that on every line read, the file pointer went back to the start of the file and counted forward to the next line. So as the number of lines grew, the time spent counting grew quadratically: a dataport with 2x the lines was 4x slower, and one with 10x the lines was 100x slower. This issue was supposedly fixed, yet we still see this non-linear slowdown. That is why Harry's solution is often the best: 10 files take 10 times as long as one, but 10 times the LINES in a single file take 100 times as long.

    The thing is that this affects running on the server as well as running on the client, BUT running on the same box is always much faster; again, no idea why. If you look at the network traffic, there just isn't enough in 20,000 lines to justify the difference, yet we see it every time. Of course we expect a posting routine to run faster on the server because of the data volumes, but a dataport shouldn't be faster there, and yet it is.

    In simple terms you are correct: this is a Navision issue, not a hardware issue. Converting to SQL is unlikely to bring any improvement.
    David Singleton
  • junkmailtrap Member Posts: 10
    Thanks everyone!

    In the dataport, how do I call commit after 1000 records? I did some searching but didn't find anything on how to determine how far along I am in the import.
  • David_Singleton Member Posts: 5,479
    Thanks everyone!

    In the dataport, how do I call commit after 1000 records? I did some searching but didn't find anything on how to determine how far along I am in the import.

    This will only help if the issue is in the code that does the posting, or if the server is not powerful enough.

    The fact that the performance difference between server and client is so great means you probably won't get a big performance boost.

    Try splitting the file into 10 smaller ones. Just to see what results you get time wise.

    Remember the issue could be any of 20 different things, you have given us close to zero information, so we are just guessing right now. You have to start analyzing this properly.
    David Singleton
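    The splitting experiment is easy to script (a sketch; the file name and chunk size are illustrative, and `seq` just generates stand-in data):

    ```shell
    # Generate a 19,000-line stand-in file, then split it into
    # 2,000-line pieces named part_aa, part_ab, ... part_aj.
    seq 19000 > deposits.txt
    split -l 2000 deposits.txt part_
    wc -l part_*
    ```

    Timing each piece separately makes the non-linear slowdown David describes easy to see.
    
    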
  • rhpnt Member Posts: 688
    I would boldly guess that this dataport incorporates many fields, validations, and lookups of other records; in other words, tons of code. That is always the main cause of performance issues. 19,000 lines is not a lot, and a more or less code-free dataport will consume that in a few minutes.

    Try to import the data with a new clean dataport into a separate clean table and see what happens.
  • David_Singleton Member Posts: 5,479
    edited 2010-12-15
    rhpnt wrote:
    I would boldly guess that this dataport incorporates many fields, validations, and lookups of other records; in other words, tons of code. That is always the main cause of performance issues. 19,000 lines is not a lot, and a more or less code-free dataport will consume that in a few minutes.


    Not true.
    David Singleton
  • David_Singleton Member Posts: 5,479
    Another important missing factor is the version you are running on.
    David Singleton
  • David_Singleton Member Posts: 5,479
    rhpnt wrote:

    Try to import the data with a new clean dataport into a separate clean table and see what happens.

    This is good advice.
    David Singleton
  • danlindstrom Member Posts: 130
    Another important missing factor is the version you are running on.

    junkmailtrap said that in the first post:
    Oh... It appears that we're on Navision 5.00 so I hope this is the right forum.
    :whistle:
    Regards
    Dan Lindström
    NCSD Navision 2.00 since 1999 (Navision Certified Solution Developer)
    MBSP Developer for Microsoft Dynamics NAV 2009
  • junkmailtrap Member Posts: 10
    I ran the import with 1000 records and with 2000 records and it works out to 258 records/minute as opposed to 95 records/minute when I run it with the 19,000 records.

    Since this import is going to be run daily, splitting the files isn't really a good option.

    The dataport is doing quite a bit of stuff in it, matching records and the like, but unfortunately I don't have a good handle on that.
  • David_Singleton Member Posts: 5,479
    I ran the import with 1000 records and with 2000 records and it works out to 258 records/minute as opposed to 95 records/minute when I run it with the 19,000 records.

    Since this import is going to be run daily, splitting the files isn't really a good option.

    The dataport is doing quite a bit of stuff in it, matching records and the like, but unfortunately I don't have a good handle on that.


    The thing is that you have now identified that the number of records in the file is very significant, i.e. without touching the code it's almost three times faster. So you have narrowed down a core component of the performance issue.

    Anyway, it will really help to know which version you are running.

    Follow these instructions,

    http://wiki.dynamicsbook.com/index.php?title=Build_No

    There are three version numbers that are needed:
    1/ The EXE version number (the number in parentheses)
    2/ The object version number (the number before the parentheses)
    3/ The build number.
    David Singleton
  • David_Singleton Member Posts: 5,479
    Another important missing factor is the version you are running on.

    junkmailtrap said that in the first post:
    Oh... It appears that we're on Navision 5.00 so I hope this is the right forum.
    :whistle:

    Not enough information.
    David Singleton
  • rhpnt Member Posts: 688
    rhpnt wrote:
    I would boldly guess that this dataport incorporates many fields, validations, and lookups of other records; in other words, tons of code. That is always the main cause of performance issues. 19,000 lines is not a lot, and a more or less code-free dataport will consume that in a few minutes.


    Not true.

    Not so long ago I imported approx. 30,000 lines into the item table (one file, 10 fields with validation). True, I didn't measure the time, but I'm sure I didn't sit there for 30 or 40 minutes. And true, it was a SQL Server DB.
  • David_Singleton Member Posts: 5,479
    rhpnt wrote:
    rhpnt wrote:
    I would boldly guess that this dataport incorporates many fields, validations, and lookups of other records; in other words, tons of code. That is always the main cause of performance issues. 19,000 lines is not a lot, and a more or less code-free dataport will consume that in a few minutes.


    Not true.

    Not so long ago I imported approx. 30,000 lines into the item table (one file, 10 fields with validation). True, I didn't measure the time, but I'm sure I didn't sit there for 30 or 40 minutes. And true, it was a SQL Server DB.

    Once upon a time I saw a guy drive his car through a red light; he did not hit another car and was not stopped by the police. Does this mean it's now 100% safe for all drivers to drive through red lights?

    :shock:
    David Singleton
  • junkmailtrap Member Posts: 10
    Version NA Dynamics NAV 5.0 (5.00) Build 24199
  • David_Singleton Member Posts: 5,479
    Version NA Dynamics NAV 5.0 (5.00) Build 24199


    OK so that's probably not an issue then.

    What table are you importing to?
    David Singleton
  • junkmailtrap Member Posts: 10
    What table are you importing to?

    That I'm not sure. I had never even seen a dataport before last week, so I'm still making sense of it all.

    These are deposits if that helps. And from what I can tell in the code, it looks like it is making journal entries.
  • David_Singleton Member Posts: 5,479
    What table are you importing to?

    That I'm not sure. I had never even seen a dataport before last week, so I'm still making sense of it all.

    These are deposits if that helps. And from what I can tell in the code, it looks like it is making journal entries.

    In that case, I think you can say you have done your job, and it's time to hand it back to the partner and have them look at it.

    My guess is that it's a combination of a number of things that they will need to eliminate one at a time.
    David Singleton
  • junkmailtrap Member Posts: 10
    In that case, I think you can say you have done your job, and it's time to hand it back to the partner and have them look at it.

    My guess is that it's a combination of a number of things that they will need to eliminate one at a time.

    Yes, I agree and have been pushing for that since the beginning. The partner is absolutely convinced that the problem does not lie in how he wrote the import script, and of course since he's "not a hardware guy" he can't point me to where else the problem might be. So that leaves me trying to find out as much as I can so I can point him towards a solution.

    Thanks for all the information.
  • Savatage Member Posts: 7,142
    Since this import is going to be run daily.

    Will it be doing 19,000 every day? Or is this large file due to the start of the project, with regular days having smaller files?

    Otherwise you're talking 5-6 million-plus journal entries a year :-k
  • junkmailtrap Member Posts: 10
    19,000 is likely close to the upper limit. It'll vary, but over the next month or so they'll probably average around 10,000 a day, and then for the rest of the year the files will be much smaller.

    As the workaround for now I've got them running it on the server, where it doesn't take as long, but I'd like to get it right so we don't go through this again next year, and also so they'll stop asking me if we just need to buy a bigger server.
  • junkmailtrap Member Posts: 10
    Is there a relatively easy way to see where the dataport is spending all its time?
  • David_Singleton Member Posts: 5,479
    Is there a relatively easy way to see where the dataport is spending all its time?

    This is what the partner is for.
    David Singleton
  • junkmailtrap Member Posts: 10
    HaHaHa!
  • rhpnt Member Posts: 688
    Once upon a time I saw a guy drive his car through a red light; he did not hit another car and was not stopped by the police.
    :shock:

    Wow, this guy sure was an expert driver...
  • FDickschat Member Posts: 380
    Is there a relatively easy way to see where the dataport is spending all its time?

    This is what the partner is for.
    HaHaHa!
    :shock: Junkmail, it sounds like you are not really satisfied with your partner and don't trust them to develop the best possible solution for you. David is absolutely correct: that is the job of your partner. There are tools with which they can find out why it is so slow.

    If this is a daily job and the files usually have around 10,000 lines, I personally would have programmed the import manually, as dataports are so damn slow (as you have seen when splitting the file). But this is all just wild guessing and looking into a crystal ball.

    If you can't get your partner to look into this issue (which they should), you could bring in a freelancer to evaluate the code inside the dataport, check why performance is so slow, and check what other options there are to speed up the import (which will cost additional money, of course, and your partner will probably not like it).
    Frank Dickschat
    FD Consulting
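    If you want a quick first measurement yourself before bringing anyone in, one low-tech option is to instrument the dataport (a sketch; the trigger placement and variable names are assumptions):

    ```
    // Dataport - OnAfterImportRecord()
    // StartTime (DateTime) is set to CURRENTDATETIME in OnPreDataItem;
    // LogFile is a File variable opened in text mode; LineCount is an
    // Integer. Logging the cumulative elapsed time every 1,000 lines
    // shows whether throughput degrades as the file grows or the
    // import is slow from the first line.
    LineCount := LineCount + 1;
    IF LineCount MOD 1000 = 0 THEN
      LogFile.WRITE(STRSUBSTNO('%1 lines after %2', LineCount, CURRENTDATETIME - StartTime));
    ```

    The Classic client also ships a Client Monitor (under Tools) that records per-call database activity, which is one of the tools a partner would typically use here.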