Dataport - be careful!

zeninoleg Member Posts: 236
Hi,
I have just found an interesting "undocumented feature" in Navision.
I had a table with the following fields:
Field1,Field2,Field3

I also had a dataport with the same fields for this table, so I made a comma-separated file:
1,f1,f2,
2,f3,
3,f5,f6,
4,,

Here is what ended up in the table after the dataport was run:
Field1  Field2  Field3
1       F1      F2
2       F3      F2
3       F5      F6
4               F6
Interesting....... :-k
So line 2 got the same value for Field3 as line 1! The dataport does not clear the record between lines; it keeps the values from the previous record.
Notice that the same happened to the last record as well (its Field3 value came from line 3). But it looks like this only happens to the last field, because the Field2 value for line 4 was not brought down from line 3!
I did some more testing and discovered that if there is a space in the file after the last comma, the dataport works fine.
I think this can be very dangerous when importing lots of data where some fields at the end of the line may be blank.

I guess I will have to add clearing of the record to all my dataports...
Best Regards,
Oleg

Comments

  • zeninoleg Member Posts: 236
    PS We have Navision 3.70 SQL
    Best Regards,
    Oleg
  • Miklos_Hollender Member Posts: 1,598
    Yes, one of those "features" that go undocumented because "everybody knows them".

    The "traditional" approach is putting an INIT into the OnBeforeImportRecord trigger.
  • kriki Member, Moderator Posts: 9,118
    Miklos_Hollender wrote:
    Yes, one of those "features" that go undocumented because "everybody knows them".
    The "traditional" approach is putting an INIT into the OnBeforeImportRecord trigger.

    Better a CLEAR: that also clears the primary-key fields, which INIT leaves untouched.
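    For illustration, a minimal sketch of that fix in the dataitem trigger (the table name MyTable is just a placeholder):

      MyTable - OnBeforeImportRecord()
      // Wipe every field, including the primary key, before the next
      // line of the file is parsed; otherwise any field the file leaves
      // blank silently keeps the value from the previous record.
      // INIT would reset the non-key fields only.
      CLEAR(MyTable);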
    Regards, Alain Krikilion
    No PM, please use the forum. || May the <SOLVED>-attribute be in your title!


  • zeninoleg Member Posts: 236
    I think I am losing faith in dataports. For complex imports I prefer to write a report with an Integer dataitem and parse each line myself... :D
    Best Regards,
    Oleg
  • DenSter Member Posts: 8,307
    Dataports are great things, you just need to know how to use them. I've learned to import/export everything through the use of variables, and program the data parts myself.
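    A hedged sketch of that pattern (all names invented for illustration): declare the dataport fields as Text variables instead of table fields, then do the mapping and type conversion yourself.

      Item - OnAfterImportRecord()
      // TxtItemNo, TxtDescription and TxtUnitCost are the dataport
      // fields, all declared as Text variables, so nothing reaches the
      // record until this code puts it there.
      CLEAR(Item);  // nothing carries over from the previous line
      Item."No." := TxtItemNo;
      Item.VALIDATE(Description, TxtDescription);
      IF TxtUnitCost <> '' THEN BEGIN
        IF NOT EVALUATE(UnitCost, TxtUnitCost) THEN
          ERROR('%1 is not a valid unit cost.', TxtUnitCost);
        Item.VALIDATE("Unit Cost", UnitCost);
      END;
      // With AutoSave = Yes on the dataitem, the record is inserted
      // automatically after this trigger runs.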
  • Miklos_Hollender Member Posts: 1,598
    I lost that trust long ago, which is why I uploaded this: http://www.mibuso.com/dlinfo.asp?FileID=579

    The basic idea is to import the data into intermediate tables, write code to check for all the possible problems (missing relations etc.), and manually review the damage done by the combined efforts of Excel and Navision (such as Excel turning integers into dates and dates into hell-knows-what).

    One funny story. I had a client who copy-pasted e-mails into comments, so Enter, Tab, semicolon, everything was in there. How does one set field and record separators in that case? This is when Buddhism comes in handy: I invented the SSV file format - Sanskrit Separated Values. PRATITYA-SAMUTPADA for record separators, MADHYAMIKA-KARIKA for field separators. It worked :) My boss only sighed and said "Is that some rule of nature that productive people are always completely crazy?" :):):)
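    The kind of check code this approach relies on, sketched against a hypothetical staging table ImportBuffer with an "Error Text" field (all names assumed):

      // Second pass, after the raw file is already in the staging table:
      // flag every row whose Item is missing instead of stopping at the
      // first problem, so the user can review everything in one go.
      IF ImportBuffer.FIND('-') THEN
        REPEAT
          ImportBuffer."Error Text" := '';
          IF NOT Item.GET(ImportBuffer."Item No.") THEN
            ImportBuffer."Error Text" :=
              STRSUBSTNO('Item %1 does not exist.', ImportBuffer."Item No.");
          ImportBuffer.MODIFY;
        UNTIL ImportBuffer.NEXT = 0;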
  • Miklos_Hollender Member Posts: 1,598
    DenSter,

    that's still not good enough, because if you get e.g. starting inventory for 100,000 Items and 100 Items are missing from the Item master data - which would be a very good ratio indeed, only 0.1% error - then it stops 100 times. This is why I think importing into intermediate tables is better.
  • DenSter Member Posts: 8,307
    Nope, I disagree: it would only stop if you program it to stop. You can program the dataport to skip, to stop, to dump data to a file, to save records in temporary tables, send XML messages, e-mails, whatever you want. I love dataports, I just program most of it myself.
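    For example, the "skip and log" variant (names assumed; LogFile is a File variable created with TEXTMODE(TRUE) in OnPreDataport):

      Item Journal Line - OnAfterImportRecord()
      // Instead of erroring out 100 times on missing items, write the
      // offending line to a log file and move on to the next record.
      IF NOT Item.GET("Item No.") THEN BEGIN
        LogFile.WRITE(STRSUBSTNO('Skipped: Item %1 not found.', "Item No."));
        CurrDataport.SKIP;
      END;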
  • Miklos_Hollender Member Posts: 1,598
    But why not put data in intermediate tables? It simply does not feel natural to do otherwise. In some other systems it's called a "staging database", where you can review what you imported, check it, change it, whatever, before putting it in the place it belongs.
  • DenSter Member Posts: 8,307
    I'm not saying there's anything wrong with importing into staging tables in general, I'm only saying that dataports can be very useful tools, if you know how to use them.

    To the point though, I just happen to think that you don't NEED to import inventory info into staging tables. You import inventory information into an Item journal, and then you check the numbers, and then you run the posting routine. The journal is already an intermediate table, why would you want to import it into a third one? It is redundant.

    Besides, it's not the import process that determines the outcome of data migration, it's the check for balances after the info is posted. If that number is not correct, then it doesn't matter whether you imported using staging tables or directly.
  • Alex_Chow Member Posts: 5,063
    I agree with DenSter; importing into variables instead of directly into the table eliminates a lot of potential problems that might come up.

    Importing into a staging table is okay as well, but to me it seems like an extra step.
  • Alex_Chow Member Posts: 5,063
    Miklos_Hollender wrote:
    But why not put data in intermediate tables? It simply does not feel natural to do otherwise. In some other systems it's called a "staging database", where you can review what you imported, check it, change it, whatever, before putting it in the place it belongs.

    No one way is better than the other. It's whatever you're used to.

    Programming the dataport itself is "empty". Miklos, you should know what I mean.
  • DenSter Member Posts: 8,307
    deadlizard wrote:
    No one way is better than the other. It's whatever you're used to.
    That I do agree with. It's what you're used to and whatever the situation calls for. Come to think of it, it could happen that you don't trust the legacy data files, and you want to prevent accidental posting, so in that case an additional staging table might be necessary.

    I love this job, figuring out what the best way to do something is :mrgreen:
  • Miklos_Hollender Member Posts: 1,598
    Yeah, it can depend on many things. The reason I like intermediate tables is that when I need to import e.g. Item Journals for a starting inventory, there are always

    - missing Items: "Oh, you need those too? They haven't been sold for years!"
    - missing Locations: "Oh, you need those too? They are just scrap locations."

    and so on.

    Another thing is that Excel always messes things up, but that may be just a local problem (Excel just wants to be "too local" in understanding data formats), and when a dataport stops, it is sometimes hard to figure out that 32182 is a date mangled by Excel.
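    For what it's worth, a number like 32182 is typically an Excel date serial (days counted from the end of 1899), so a hedged conversion sketch (TxtDate, Serial and RealDate are assumed variables of type Text, Integer and Date):

      // Excel's 1900 date system wrongly treats 1900 as a leap year, so
      // for serials above 60 the base date 30-12-1899 gives the correct
      // result: 32182 -> 09-02-1988.
      IF EVALUATE(Serial, TxtDate) THEN
        RealDate := DMY2DATE(30,12,1899) + Serial
      ELSE
        EVALUATE(RealDate, TxtDate); // the cell survived as a normal date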
  • Miklos_Hollender Member Posts: 1,598
    Maybe... my custom is to request data from customers in Excel, and maybe your custom is to look at the legacy database directly. I gave up the latter approach when I had to migrate from Scala, which had clever, descriptive table names like "1", "2" and "3", and clever, descriptive field names like... guess what?... "1", "2" and "3" :D So then I just thought f**k it, from now on I will request the data in Excel and leave the extraction to the clients. But that has the drawback that "unnecessary" data sometimes does not get sent :D
  • DenSter Member Posts: 8,307
    Oh, we get data in a variety of ways, Excel sheets being one of the more common ones. Most of the time I don't have access to the legacy system at all.

    The point being that both approaches have their time and place. I am certainly going to take a closer look at your data migration tools next time I need to migrate data.
  • Miklos_Hollender Member Posts: 1,598
    Maybe I'm a little bit offtopic with this post, but I MUST ask it. It seems that all of us in this topic completely agree on something: data migration is a development task. When I started with Navision, in 2002, I was an analyst. (A pretty junior one.) I tried to import data. It did not work. I tried to modify it in Excel and import it. It did not work. We had many sleepless nights. We only found out gradually that it's development work, not analyst work. So later on, when we had learned development and finally accepted the fact that data migration can mean more code than a bunch of new functionality, it went well. We did not know it before, because we thought development is for new features, and we didn't want new features, we just wanted to put the data in.

    Now, I think many people have this problem all over the world. Many recently graduated junior analysts - like I was in 2002 - are sitting right now in front of Excel, trying to massage data to be imported into some ERP system, late at night, without success.

    How could we broadcast the message to them, all over the world: "Hey, forget Excel, it's development work. Not only in Navision, but everywhere: for example, this is what the Unix folks created the Perl language for. So it's clearly a development task. Forget Excel."

    Or is it yet another thing that everybody has to learn from their own mistakes through sleepless nights? Could we somehow help them with it?
  • DenSter Member Posts: 8,307
    It's development work, but very junior type development work that does not require very deep development skills. Basically all it is is converting data types and mapping data into the right fields, and learning how to manipulate new and existing records. This is ideal stuff for starting NAV developers, it's small stuff with relatively quick results.

    Analysts can do this too, as long as they have a good technical understanding at a development detail level, you know. They must be able to unravel a process into all the steps it takes to accomplish a task in programming code.
  • Miklos_Hollender Member Posts: 1,598
    Sometimes it can be a bit harder. Often a legacy database has a wildly different structure. E.g. some systems have only one Partner master table [1] and not separate Customers and Vendors. Therefore, a migration task often looks like "if it had sales, migrate it to Customers; if it had purchases, migrate it to Vendors; if both, then to both; if neither, make a list of it and let the users decide".

    [1] Which is a good idea, I think.
  • DenSter Member Posts: 8,307
    If there is no automated way to determine it, I would give the customer the whole list, and let them tell me which are customers and which are vendors, and either split up the file or add a field that tells my dataport which one it is. You'd have to do this using variables anyway.

    You could also import it into staging tables :mrgreen: and give them an option field, and then run a process to create the actual master data.
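    A sketch of that second step, assuming a staging table PartnerBuffer with an Option field "Create As" (Customer,Vendor,Both,Undecided):

      // Create the real master data from the reviewed staging rows;
      // Undecided rows stay behind for the users to classify.
      PartnerBuffer.SETFILTER("Create As", '<>%1', PartnerBuffer."Create As"::Undecided);
      IF PartnerBuffer.FIND('-') THEN
        REPEAT
          IF PartnerBuffer."Create As" IN
             [PartnerBuffer."Create As"::Customer, PartnerBuffer."Create As"::Both]
          THEN BEGIN
            CLEAR(Customer);
            Customer."No." := PartnerBuffer."No.";
            Customer.VALIDATE(Name, PartnerBuffer.Name);
            Customer.INSERT(TRUE);
          END;
          IF PartnerBuffer."Create As" IN
             [PartnerBuffer."Create As"::Vendor, PartnerBuffer."Create As"::Both]
          THEN BEGIN
            CLEAR(Vendor);
            Vendor."No." := PartnerBuffer."No.";
            Vendor.VALIDATE(Name, PartnerBuffer.Name);
            Vendor.INSERT(TRUE);
          END;
        UNTIL PartnerBuffer.NEXT = 0;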
  • Miklos_Hollender Member Posts: 1,598
    Yes, you are right. I just wanted to imply that migration is not always junior stuff. I once left it to a junior colleague... and we ended up with reversed currency exchange rates... like 1 HUF costing 250 EUR :) OK, running a quick program to repair it in all the entries and documents solved it, but we got a bit scared :)
  • DenSter Member Posts: 8,307
    Which brings us to a totally different matter: people knowing what they can and cannot do :mrgreen: