Dataport Very Slow
-matrix-
Member Posts: 103
Hi everybody,
I'm trying to import a very large TXT file (about one million lines) with a dataport. The process is too slow; does anyone know a way or trick to increase performance?
Thanks in advance.
Comments
Are you importing it into a temporary table, or are you doing lookups on other tables while the dataport runs?
Hi stiasta,
It isn't a temporary table; it's a custom table with code in the OnValidate triggers (which I think is what makes it slower).
Alright. Make sure you choose the proper keys when you do modifications and/or lookups (SETCURRENTKEY). If an improper key is chosen, it can dramatically reduce efficiency.
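As a rough C/AL sketch of the idea (the table, fields, and key here are hypothetical; a key on "Document No.,Line No." is assumed to exist on the table):

```
// Hypothetical lookup inside an OnValidate trigger.
// Without SETCURRENTKEY, the server may resolve the filter with a
// poorly matching index and read far more rows than necessary.
ImportLine.RESET;
ImportLine.SETCURRENTKEY("Document No.","Line No.");  // key assumed to exist
ImportLine.SETRANGE("Document No.",DocNo);
IF ImportLine.FINDFIRST THEN
  MESSAGE('Found line %1',ImportLine."Line No.");
```

The point is simply that the active key should match the filters you set, so each lookup during the import stays cheap.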
You first need to determine whether the issue is actually the import (one million lines will take a long time, so you need to define "slow") or whether the issue is your code.
Step one is to create a dummy test of the import with no code validation, time it, and compare it to the normal run.
David Singleton
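One crude way to time such a test (a sketch; StartTime is a Time variable you would declare yourself) is to stamp the dataport's own triggers, running once with validation disabled (e.g. CallFieldValidate set to No on the dataport fields) and once normally:

```
// Dataport - OnPreDataport()
StartTime := TIME;

// Dataport - OnPostDataport()
// Subtracting two Time values yields the elapsed milliseconds.
MESSAGE('Import took %1 ms.',TIME - StartTime);
```

Comparing the two timings tells you whether the dataport itself or the OnValidate code is the bottleneck.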
Dataports on SQL are slow.
Try a SQL bulk insert.
Dataports are slower by a factor of about 20 compared to importing files with a simple File.READ (with the limitation of 1024 characters per line, which can be overcome) or using an InStream.
I use dataports out of the box only for smaller files. Everything else gets coded.
Frank Dickschat
FD Consulting
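A minimal sketch of the File.READ approach, assuming a hypothetical target table MyRec and a fixed-position line layout (variables: ImpFile: File; Line: Text[1024], which matches the 1024-character limit mentioned above):

```
// Plain-text, line-by-line import without a dataport.
ImpFile.TEXTMODE(TRUE);                 // text mode: READ fetches one line
ImpFile.OPEN('C:\import\data.txt');
WHILE ImpFile.READ(Line) > 0 DO BEGIN   // READ returns 0 at end of file
  MyRec.INIT;
  MyRec."No." := COPYSTR(Line,1,20);    // parse fixed positions from the line
  MyRec.INSERT;                         // plain INSERT, no validation code
END;
ImpFile.CLOSE;
```

Because you control the parsing and the INSERT yourself, you decide exactly which validation (if any) runs per line.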
I guess this comes down to "giving a fish compared to teaching to fish". I used a dataport to export a customer list for a client 3 months ago; that was the only dataport I have used in about the last 4 years. I simply don't use them. For me, option 1 is direct SQL, option 2 is XML, option 3 is code. But I think it's better to let the OP try for himself and learn that way; otherwise he will always have that lingering doubt that maybe it was the validation code.
David Singleton
If you are using SQL Server, then you are inserting one million rows and then doing a single commit, unless you are inserting via code and writing your own commits.
Sounds like a bad idea to me.
David Machanick
http://mibuso.com/blogs/davidmachanick/
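If you do control the inserts in code, a common pattern (a sketch; the batch size of 10,000 is an arbitrary assumption) is to commit in batches so a single transaction does not hold locks for a million rows:

```
// Inside the import loop, after each insert:
MyRec.INSERT;
Counter := Counter + 1;
IF Counter MOD 10000 = 0 THEN
  COMMIT;   // flush the transaction every 10,000 rows

// After the loop: any remaining rows are committed when the
// transaction ends, but an explicit COMMIT makes the intent clear.
COMMIT;
```

Smaller transactions also mean that if the import fails midway, you only lose the last uncommitted batch rather than the whole run.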
-matrix- wrote: "The process is too slow; do you know any way or trick to increase performance?"
Divide the large txt file into smaller files and process them one after the other. This will be faster than importing the one large file at once.
No support using PM or e-mail - Please use this forum. BC TechDays 2024: 13 & 14 June 2024, Antwerp (Belgium)