XML Handling with COM Automation / DotNet

wakestar Member Posts: 207
Let's assume we have the following scenario:

- We want to handle complex XML files (export and import, XMLPorts have limitations... so they are not an option in this case)
- We want to separate business code from the technical code
- We want to implement this both for older NAV clients (Classic, using COM Automation) and for newer NAV clients (using DotNet Interop)

My question is: how do you structure the code so that the business logic is completely separated from the technical XML stuff?

When I look at the Microsoft code I see that the business code and the technical code (XML DOM Automation) are always linked together, because CurrXMLNodes is used as a parameter. I want to avoid that, because I want to be able to make changes in the business code object and import that object into other NAV versions without worrying about COM Automation or DotNet Interop.

Comments

  • scasy Member Posts: 1
    I would be very interested in this topic too... Anybody? :-k
  • ara3n Member Posts: 9,256
    The way to separate the parsing of the XML from the business logic is to write the data into staging tables.

    So the parser writes the data into a staging/temp table,

    and then the business logic code processes that table data.
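
    A minimal sketch of that split in C/AL, using DotNet interop as an example (the same idea works with the classic 'Microsoft XML' automation). The staging table "Sales Import Staging", its fields and the "Sales Import Mgt." codeunit are made-up names for illustration only:

    PROCEDURE ImportSalesOrders(FileName : Text);
    VAR
      TempSalesStaging : TEMPORARY Record "Sales Import Staging";
      SalesImportMgt : Codeunit "Sales Import Mgt.";
      XmlDoc : DotNet "System.Xml.XmlDocument";
      XmlNodeList : DotNet "System.Xml.XmlNodeList";
      XmlNode : DotNet "System.Xml.XmlNode";
      i : Integer;
    BEGIN
      // technical part: load the file and fill the staging/temp table
      XmlDoc := XmlDoc.XmlDocument;
      XmlDoc.Load(FileName);
      XmlNodeList := XmlDoc.SelectNodes('/Orders/Order');
      FOR i := 0 TO XmlNodeList.Count - 1 DO BEGIN
        XmlNode := XmlNodeList.Item(i);
        TempSalesStaging.INIT;
        TempSalesStaging."Entry No." := i + 1;
        TempSalesStaging."Order No." := XmlNode.SelectSingleNode('OrderNo').InnerText;
        TempSalesStaging."Customer No." := XmlNode.SelectSingleNode('CustomerNo').InnerText;
        TempSalesStaging.INSERT;
      END;
      // business part: knows nothing about XML, COM or DotNet
      SalesImportMgt.ProcessStaging(TempSalesStaging);
    END;

    PROCEDURE ProcessStaging(VAR TempSalesStaging : TEMPORARY Record "Sales Import Staging");
    BEGIN
      // pure business logic working on plain record data
      IF TempSalesStaging.FINDSET THEN
        REPEAT
          // ...create/update the real records here...
        UNTIL TempSalesStaging.NEXT = 0;
    END;

    The ProcessStaging function never sees an XML type, so the business object can be moved between NAV versions without touching COM or DotNet code.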
    Ahmed Rashed Amini
    Independent Consultant/Developer


    blog: https://dynamicsuser.net/nav/b/ara3n
  • kriki Member, Moderator Posts: 9,115
    ara3n wrote:
    The way to separate the parsing of the XML from the business logic is to write the data into staging tables.

    So the parser writes the data into a staging/temp table,

    and then the business logic code processes that table data.
    I confirm that this is the best way. I use the same approach myself.

    Some other tricks I use:
    -Have a form/page on the staging tables in which one can enter data manually, to test the business logic without the technical details of the XML.
    -I also create code (in general an XMLport) to EXPORT the data from the table. After exporting the data, I try to import it again to check that it works, so I don't need to create an XML file by hand (see the sketch after this list).
    -If you have web services, you can test them using PowerShell: http://www.mibuso.com/howtoinfo.asp?FileID=24
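
    A minimal sketch of such an export/import round trip, assuming an XMLport exists on the staging table (the XMLport name "Sales Import Staging" and the file handling are placeholders):

    PROCEDURE TestRoundTrip(FileName : Text);
    VAR
      ExportFile : File;
      ImportFile : File;
      OutStr : OutStream;
      InStr : InStream;
    BEGIN
      // export the current staging data to a file...
      ExportFile.CREATE(FileName);
      ExportFile.CREATEOUTSTREAM(OutStr);
      XMLPORT.EXPORT(XMLPORT::"Sales Import Staging", OutStr);
      ExportFile.CLOSE;

      // ...and import the same file again to verify the XMLport works both ways
      ImportFile.OPEN(FileName);
      ImportFile.CREATEINSTREAM(InStr);
      XMLPORT.IMPORT(XMLPORT::"Sales Import Staging", InStr);
      ImportFile.CLOSE;
    END;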
    Regards, Alain Krikilion
    No PM, please use the forum. || May the <SOLVED>-attribute be in your title!


  • Marije_Brummel Member, Moderators Design Patterns Posts: 4,262
    +1.

    Actually I am working on a series of design patterns for interfacing. The first one is now in review with the team.

    This video explains how to work with temporary tables in Dynamics NAV.

    https://www.youtube.com/watch?v=QHn5oEO ... yHpsVN0U_w
  • wakestar Member Posts: 207
    Are you guys using *one* generic staging table, or one for each XML schema / business case / etc.?

    A generic solution would be great... but I think it would be hard to implement and maintain, since some XML structures are complicated as hell...
  • Marije_Brummel Member, Moderators Design Patterns Posts: 4,262
    I create a staging table per XML format.

    If used as temporary, staging tables are for free (licence-wise).

    They are the perfect way to structure your code.

    Actually they are like classes in C#. Makes life easier.

    The databases I work with have dozens of temporary tables. Actually that becomes the hard part after a while: what the F*** did I use this one for? :mrgreen:
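
    To illustrate the "class" analogy, a small sketch: a temporary buffer record plus a couple of functions that act like its methods (the table "Import Buffer" and its fields are invented for the example):

    PROCEDURE AddLine(VAR TempBuffer : TEMPORARY Record "Import Buffer"; ItemNo : Code[20]; Qty : Decimal);
    BEGIN
      // behaves like a method on a collection class: the caller just adds "objects"
      TempBuffer."Entry No." := GetNextEntryNo(TempBuffer);
      TempBuffer."Item No." := ItemNo;
      TempBuffer.Quantity := Qty;
      TempBuffer.INSERT;
    END;

    LOCAL PROCEDURE GetNextEntryNo(VAR TempBuffer : TEMPORARY Record "Import Buffer") : Integer;
    BEGIN
      IF TempBuffer.FINDLAST THEN
        EXIT(TempBuffer."Entry No." + 1);
      EXIT(1);
    END;

    Passing the buffer as a VAR parameter keeps all functions working on the same "instance", much like passing an object reference.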
  • Miklos_Hollender Member Posts: 1,598
    I have a codeunit with helper functions that, for example, read a list of all sales headers from an external source (ADO/SQL, not XML, but the principle is the same) into a temporary Sales Header table, or read all sales lines for a given order number into a temporary Sales Line table. I try to use the fields for the same purposes they serve in the real tables, but sometimes I use them creatively: e.g. the other software has a shipment surcharge per line, which I put into some field such as Unit Cost. It would be cleaner to define new tables for this, but they cost 600 euro per 10 since this is not an add-on, so nope.
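
    A rough sketch of such a helper, reading headers via classic ADO automation into a temporary Sales Header table (the connection string, the query and the column names are placeholders):

    PROCEDURE ReadExternalOrders(VAR TempSalesHeader : TEMPORARY Record "Sales Header");
    VAR
      ADOConnection : Automation "'Microsoft ActiveX Data Objects 2.8 Library'.Connection";
      ADORecordset : Automation "'Microsoft ActiveX Data Objects 2.8 Library'.Recordset";
    BEGIN
      CREATE(ADOConnection);
      ADOConnection.Open('Provider=SQLOLEDB;Data Source=MYSERVER;Initial Catalog=ExtDB;Integrated Security=SSPI');
      CREATE(ADORecordset);
      ADORecordset.Open('SELECT OrderNo, CustomerNo FROM Orders', ADOConnection);
      WHILE NOT ADORecordset.EOF DO BEGIN
        // map the external columns onto the fields of the temporary Sales Header
        TempSalesHeader.INIT;
        TempSalesHeader."No." := FORMAT(ADORecordset.Fields.Item('OrderNo').Value);
        TempSalesHeader."Sell-to Customer No." := FORMAT(ADORecordset.Fields.Item('CustomerNo').Value);
        TempSalesHeader.INSERT;
        ADORecordset.MoveNext;
      END;
      ADORecordset.Close;
      ADOConnection.Close;
      CLEAR(ADORecordset);
      CLEAR(ADOConnection);
    END;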

    Then I have a different codeunit with business logic functions that simply take these temp tables and process them, creating real records. For example it turns a TempSalesLine into a Sales Line more or less one-to-one, and it takes the shipment surcharge I parked in the Unit Cost field and creates another Sales Line for that.
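
    The business-logic half could look roughly like this; the surcharge G/L account comes from a hypothetical GetSurchargeGLAccount setup lookup:

    PROCEDURE CreateSalesLines(VAR TempSalesLine : TEMPORARY Record "Sales Line"; SalesHeader : Record "Sales Header");
    VAR
      SalesLine : Record "Sales Line";
      LineNo : Integer;
    BEGIN
      IF TempSalesLine.FINDSET THEN
        REPEAT
          LineNo += 10000;
          SalesLine.INIT;
          SalesLine."Document Type" := SalesHeader."Document Type";
          SalesLine."Document No." := SalesHeader."No.";
          SalesLine."Line No." := LineNo;
          SalesLine.VALIDATE(Type, SalesLine.Type::Item);
          SalesLine.VALIDATE("No.", TempSalesLine."No.");
          SalesLine.VALIDATE(Quantity, TempSalesLine.Quantity);
          SalesLine.VALIDATE("Unit Price", TempSalesLine."Unit Price");
          SalesLine.INSERT(TRUE);

          // the shipment surcharge was parked in "Unit Cost" by the import helper
          IF TempSalesLine."Unit Cost" <> 0 THEN BEGIN
            LineNo += 10000;
            SalesLine.INIT;
            SalesLine."Document Type" := SalesHeader."Document Type";
            SalesLine."Document No." := SalesHeader."No.";
            SalesLine."Line No." := LineNo;
            SalesLine.VALIDATE(Type, SalesLine.Type::"G/L Account");
            SalesLine.VALIDATE("No.", GetSurchargeGLAccount);  // hypothetical setup lookup
            SalesLine.VALIDATE(Quantity, 1);
            SalesLine.VALIDATE("Unit Price", TempSalesLine."Unit Cost");
            SalesLine.INSERT(TRUE);
          END;
        UNTIL TempSalesLine.NEXT = 0;
    END;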
  • Miklos_Hollender Member Posts: 1,598

    If used as temporary, staging tables are for free (licence-wise).

    It seems to me this no longer works in 2013 R2. I had a ClosedXML Excel Buffer outside my licence range; as it is only a temp table, I thought it would be okay. Nope, I got the "no permission" error message. I think they closed this "loophole".

    On second thought, maybe it was about calling functions on it, not about putting data into it...
  • rmv_RU Member Posts: 119
    I think using a strict flat XML format and a serialization function RecRef2XML is the best option in this case.
    You can define RecRef2XML to represent any record as
    <r t="@table_name">
        <f n="@field_name" value="123"/>
        .....
        <f..../>
    </r>
    
    Then you can convert the flat XML from/to any complex XML format using other tools, BizTalk for example.
    RecRef2XML can work without any setup tables, but using a mapping table has more benefits:
    1. Export only the necessary fields; as a result you reduce handling time and XML size.
    2. Define FK-PK and header-lines relations between tables and use recursive export; as a result you export the whole integral set of data, no matter which table the data is located in. For example, exporting a sales header recursively exports the mapped master data, such as the customer record, and all linked lines, such as the sales lines; exporting any sales line causes the item record to be exported, and so on...
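
    A rough sketch of such a RecRef2XML function over DotNet interop (System.Xml), producing the flat format above; skipping FlowFields and using FORMAT with the XML format are just one way to do it:

    PROCEDURE RecRef2XML(RecRef : RecordRef; VAR XmlDoc : DotNet "System.Xml.XmlDocument"; VAR ParentNode : DotNet "System.Xml.XmlNode");
    VAR
      FldRef : FieldRef;
      RecNode : DotNet "System.Xml.XmlElement";
      FldNode : DotNet "System.Xml.XmlElement";
      i : Integer;
    BEGIN
      RecNode := XmlDoc.CreateElement('r');
      RecNode.SetAttribute('t', RecRef.NAME);
      FOR i := 1 TO RecRef.FIELDCOUNT DO BEGIN
        FldRef := RecRef.FIELDINDEX(i);
        IF FORMAT(FldRef.CLASS) = 'Normal' THEN BEGIN  // skip FlowFields/FlowFilters
          FldNode := XmlDoc.CreateElement('f');
          FldNode.SetAttribute('n', FldRef.NAME);
          FldNode.SetAttribute('value', FORMAT(FldRef.VALUE, 0, 9));  // format 9 = XML format
          RecNode.AppendChild(FldNode);
        END;
      END;
      ParentNode.AppendChild(RecNode);
    END;

    A mapping table would then replace the FIELDCOUNT loop with just the mapped fields, and could call RecRef2XML recursively for the related tables.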

    However, XML handling (mapping, serialization, deserialization) usually takes more time than ADO.
    That's why we use custom mapping functionality and ADO Recordsets to integrate NAV databases with various corporate non-NAV SQL databases and with a master-data NAV database, and for document exchange between different NAV databases (for example, a sales header from a central database can be imported as a purchase order in a branch database).
    A typical synchronization scenario can include queries to several databases: for example, you can start a purchase header synchronization, then recursively import sales lines as purchase lines from one NAV database, and during every line import synchronize the item of that line with the master-data NAV database.
    Looking for part-time work.
    Nav, T-SQL.
  • rosietesmen Member Posts: 6
    <?xml version="1.0" encoding="utf-8" ?>
    <suite>

      <testcase id="001" kind="bvt">
        <inputs>
          <arg1>4</arg1>
          <arg2>7</arg2>
        </inputs>
        <expected>11.00</expected>
      </testcase>

      <testcase id="002" kind="drt">
        <inputs>
          <arg1>9</arg1>
          <arg2>6</arg2>
        </inputs>
        <expected>15.00</expected>
      </testcase>

      <testcase id="003" kind="bvt">
        <inputs>
          <arg1>5</arg1>
          <arg2>8</arg2>
        </inputs>
        <expected>13.00</expected>
      </testcase>

    </suite>


    This will help you.