```
ImportFile.CREATEINSTREAM(IStream);
WHILE NOT IStream.EOS OR (IText <> '') DO BEGIN
  IF NOT IStream.EOS THEN BEGIN
    FieldValue := IText;
    IStream.READTEXT(IText, 1000 - STRLEN(IText));
    IText := FieldValue + IText;
  END;
  WHILE (STRPOS(IText, FieldSeperator) <> 0) OR ((IText <> '') AND IStream.EOS) DO BEGIN
    FieldValue := '';
    IF STRPOS(IText, FieldSeperator) > 1 THEN
      FieldValue := COPYSTR(IText, 1, STRPOS(IText, FieldSeperator) - 1);
    IText := COPYSTR(IText, STRPOS(IText, FieldSeperator) + STRLEN(FieldSeperator));
    FieldValue := DELCHR(FieldValue, '=', FillChar);
    CASE FieldCounter OF
      1: EVALUATE(Table.Field1, FieldValue);
      2: EVALUATE(Table.Field2, FieldValue);
      // ... and so on, about fifty times
    END;
    FieldCounter += 1;
    IF ((STRPOS(IText, RecordSeperator) < STRPOS(IText, FieldSeperator)) AND (STRPOS(IText, RecordSeperator) <> 0)) OR
       ((STRPOS(IText, FieldSeperator) = 0) AND (STRPOS(IText, RecordSeperator) <> 0))
    THEN BEGIN
      FieldValue := '';
      IF STRPOS(IText, RecordSeperator) > 1 THEN
        FieldValue := COPYSTR(IText, 1, STRPOS(IText, RecordSeperator) - 1);
      IText := COPYSTR(IText, STRPOS(IText, RecordSeperator) + STRLEN(RecordSeperator));
      EVALUATE(Table.LastField, FieldValue);
      FieldCounter := 1;
      Table.INSERT;
    END;
  END;
  FieldValue := IText;
END;
ImportFile.CLOSE;
```
Comments
Find out whether it is the absolute position of the record in the file:
remove the first record from the file and see if it then skips the next record instead of the same one.
But a more normalised way of importing the file would be the better choice.
|To-Increase|
@sog this is a 30MB file in one line (the record separator is something like ~+~, not a newline), so of course it is always over 1000.
@everybody another idea. I have seen many interfaces in my life and if there was either a program error or an error in data we had an error message, never a silent skipping of records. I am suspecting more of a technical error, which is harder to solve, but has to be, very important reports depend on it.
Could it happen in a native database that when importing like 80,000 records some cache like commit cache fills up and skips something? Or memory or something?
If IText exceeds 1000 characters at any point, you'll get odd behaviour.
What kind of behaviour I don't know; I'm not the one who has the codeunit.
Again, my general instinct tells me that if there is a program or data error in an interface, there is usually an error message. Can such a silent skip be the result of something else, like the commit cache filling up, memory filling up and suchlike?
I assume you have some counter for each record, and you know which record is the bad one, so just add code like this:
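(The code snippet did not survive in this post; judging from the description that follows, it was presumably something along these lines, with RecordCounter and a dummy variable a declared locally — a reconstruction, not the original:)

```
// hypothetical reconstruction: RecordCounter is incremented once per
// inserted record elsewhere in the loop; a is a dummy integer
IF RecordCounter = 37999 THEN
  a := a; // dummy statement to hang a breakpoint on
```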
and put a breakpoint on the a := a line; the debugger will stop when RecordCounter = 37999.
{edit typo no -> know}
From my point of view, the only system-related thing I could think of is that there's a special character within that record that leads to different system behaviour. Maybe it isn't even visible.
Do you use any condition that would skip records? E.g. if the record (or at least its PK field values) appears twice in one file?
I would go the debugger way as well. You could do something like this, and from that point go on in single steps.
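(This snippet was also lost; per the next reply it was the same kind of debugger trap, so presumably something like the following, with RecordCounter and a as assumed local variables:)

```
// assumed sketch: break just before the suspect record is processed,
// then step through statement by statement
IF RecordCounter >= 37999 THEN
  a := a; // set the breakpoint here
```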
I was just about to post the exact same thing. :thumbsup:
BTW I found out it is not the commit cache (I have now tried committing every 10,000 records; no change). And no, dataports are not an option: they will not work on NAS even if called through code; variables of a Dataport type are simply not allowed.
To be honest I'm not sure how to resolve it; the algorithm is just too complicated.
Any ideas for rewriting it in a simpler way? Record separators are 3 characters long, so reading byte by byte would not work as it stands. Try using a single-character record separator that hopefully does not occur in the data, like §, and rewrite it to read byte by byte?
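A minimal sketch of that byte-by-byte idea, assuming the export side were changed to emit § as the record separator, and assuming a hypothetical helper ProcessRecord that splits one record on FieldSeperator and does the EVALUATE/INSERT work:

```
// hedged sketch, not the original codeunit: read one character at a
// time and hand off complete records; each record must still fit in
// a Text variable (1024 characters in the classic client)
ImportFile.CREATEINSTREAM(IStream);
RecordText := '';
WHILE NOT IStream.EOS DO BEGIN
  IStream.READTEXT(OneChar, 1);      // read exactly one character
  IF OneChar = '§' THEN BEGIN        // single-character record separator
    ProcessRecord(RecordText);       // hypothetical: split fields, EVALUATE, INSERT
    RecordText := '';
  END ELSE
    RecordText := RecordText + OneChar;
END;
IF RecordText <> '' THEN
  ProcessRecord(RecordText);         // last record has no trailing separator
ImportFile.CLOSE;
```

This sidesteps the chunk-boundary problem entirely, because no separator can ever be split across two reads.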
No, they are very different. In fact my recommendation would be to convert to XMLports, if that is somehow possible.
I was also very curious how you managed to run a dataport with NAS.
BTW my problem just gets messier and messier. Basically, if the 1000-character read ends at a point where there is no record separator in the buffer, the whole logic goes out of the window. I don't know if this logic can be fixed at all; the whole thing is just wrong. What do you do if there is a numeric field €50000 and the 1000-character read happens to end right after €5000? And so on. I think this just should not be done this way, but I have no idea how to explain that to management: this has produced fairly reliable reports for half a year, and they are important ones. It is basically about synching stock etc. from subsidiaries to HQ for reporting.
Does any of you see any way of fixing it?
If not, what would be the safest way to rewrite it? Make sure 1024-character records are enough, DELCHR the CR/LFs on the export side, and read it as a normal text file? Implement a fixed-width format? An XMLport? Theoretically possible, except that I am not sure it fixes issues such as decimal points in one country and decimal commas in another, etc.
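For what it's worth, a hedged sketch of a safer buffered loop: top up the buffer first, then only ever cut complete records, so a 3-character separator split across two reads is reassembled before any parsing happens. ProcessRecord is again a hypothetical helper (split on FieldSeperator, EVALUATE, INSERT), and this assumes no single record is longer than the 1000-character buffer:

```
// sketch only: the unparsed tail stays in Buffer for the next pass,
// so a record or separator straddling a chunk boundary is never cut
ImportFile.CREATEINSTREAM(IStream);
Buffer := '';
WHILE (NOT IStream.EOS) OR (Buffer <> '') DO BEGIN
  // refill before parsing
  IF (NOT IStream.EOS) AND (STRLEN(Buffer) < 1000) THEN BEGIN
    IStream.READTEXT(Chunk, 1000 - STRLEN(Buffer));
    Buffer := Buffer + Chunk;
  END;
  SepPos := STRPOS(Buffer, RecordSeperator);
  IF SepPos > 0 THEN BEGIN
    ProcessRecord(COPYSTR(Buffer, 1, SepPos - 1)); // complete record only
    Buffer := COPYSTR(Buffer, SepPos + STRLEN(RecordSeperator));
  END ELSE
    IF IStream.EOS THEN BEGIN
      ProcessRecord(Buffer);                       // final record, no separator
      Buffer := '';
    END;
END;
ImportFile.CLOSE;
```

If a full buffer ever contains no separator while the stream is not at EOS, a record has exceeded the buffer size and the loop would spin, so a record-length check (with an ERROR rather than a silent skip) would belong in production code.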