I'm trying to import a very large TXT file (about one million lines) with a dataport.
The process is too slow. Do you know any way or trick to increase the performance?
You first need to determine whether the issue is actually the import (one million lines will take a long time, so you need to define "slow") or whether the issue is your code.
Step one is to create a dummy test of the import with no code validation, time it, and compare it to the normal run.
Dataports are slower by a factor of about 20 compared to importing files with a simple File.READ (with its limitation of 1,024 characters per line, which can be overcome) or using an InStream.
I use dataports out of the box only for smaller files. Everything else is coded.
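As an illustration, here is a minimal C/AL sketch of the File.READ approach. The file path, the table "My Import Table", and its fields are hypothetical, and the line parsing would depend on your file layout.

// Variables: ImportFile : File; Line : Text[1024]; ImportRec : Record "My Import Table"; EntryNo : Integer;
ImportFile.TEXTMODE(TRUE);
ImportFile.OPEN('C:\Import\bigfile.txt');
WHILE ImportFile.POS < ImportFile.LEN DO BEGIN
  ImportFile.READ(Line);                          // reads one line, up to 1,024 characters
  EntryNo := EntryNo + 1;
  ImportRec.INIT;
  ImportRec."Entry No." := EntryNo;
  ImportRec.Description := COPYSTR(Line,1,50);    // split/parse the line into fields as needed
  ImportRec.INSERT;                               // OnInsert trigger and OnValidate code are not run
END;
ImportFile.CLOSE;

For lines longer than 1,024 characters you can use ImportFile.CREATEINSTREAM and InStream.READTEXT instead of File.READ.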
I guess this comes down to "giving a fish versus teaching to fish". I used a dataport to export a customer list for a client three months ago; that was the only dataport I have used in about the last four years. I simply don't use them. For me, option 1 is direct SQL, option 2 is XML, and option 3 is code. But I think it's better to let the OP try for himself and learn that way; otherwise he will always have that lingering doubt that maybe it was the validation code.
If you are using SQL Server, then you are inserting one million rows and then doing a single commit, unless you are inserting via code and writing your own commits.
Sounds like a bad idea to me.
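If you do write the import in code, one option (a sketch only, with a hypothetical batch size) is to commit every so many records so SQL Server never has to hold one enormous open transaction:

// Inside the import loop from the sketch above; Counter : Integer
Counter := Counter + 1;
IF Counter MOD 10000 = 0 THEN
  COMMIT;   // keeps the transaction and lock footprint small
// ... after the loop:
COMMIT;     // commit the remaining records

Whether intermediate commits are acceptable depends on whether a partially imported file is tolerable if the import fails halfway through.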
Comments
It isn't a temporary table; it's a custom table with code in the OnValidate triggers (which I think makes it slower). :-k
If an improper key is chosen, it will dramatically reduce the efficiency.
Try a SQL bulk insert.