XML Handling with COM Automation / DotNet

wakestar
Member Posts: 207
Let's assume we have the following scenario:
- We want to handle complex XML files (export and import, XMLPorts have limitations... so they are not an option in this case)
- We want to separate business code from the technical code
- We want to implement this both for older NAV clients (Classic, COM Automation) and for newer NAV clients (DotNet Interop)
My question is: How do you structure the code so that business logic is separated from the technical xml stuff completely?
When I look at the Microsoft code I see that the business code and the technical code (XML DOM Automation) are always linked together,
because the CurrXMLNodes is used as a parameter. -> I want to avoid that, because I want to be able to make changes in the business code
object and import the business object into the other NAV versions without worrying about COM Automation or DotNet Interop.
Comments
-
I would be very interested in this topic too... Anybody? :-k
-
The way to separate the parsing of the XML from the business logic is to write the data into staging tables:
the parser writes the data into a staging/temp table,
then the business logic code processes that table data.
-
ara3n wrote:
To separate the parsing of the xml from business logic is to write the data into staging tables.
so the parser will write that data into staging/temp table.
then the business logic code will process that table data.
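That split might look like this in C/AL. A hedged sketch only: the "Order Import Buffer" table, its fields and the XPath expressions are made-up illustrations, not objects from this thread.

```
// Technical codeunit: only this object knows about the DOM.
PROCEDURE ImportFile(FileName : Text; VAR TmpOrder : Record "Order Import Buffer" TEMPORARY);
VAR
  XmlDoc : DotNet "System.Xml.XmlDocument";
  NodeList : DotNet "System.Xml.XmlNodeList";
  i : Integer;
BEGIN
  XmlDoc := XmlDoc.XmlDocument;
  XmlDoc.Load(FileName);
  NodeList := XmlDoc.SelectNodes('/Orders/Order');
  FOR i := 0 TO NodeList.Count - 1 DO BEGIN
    TmpOrder.INIT;
    TmpOrder."Entry No." := i + 1;
    TmpOrder."Order No." := NodeList.Item(i).SelectSingleNode('No').InnerText;
    TmpOrder.INSERT;
  END;
END;

// Business codeunit: only sees the staging table, never the DOM, so it
// imports unchanged into Classic (COM) and RTC (DotNet) versions.
PROCEDURE ProcessBuffer(VAR TmpOrder : Record "Order Import Buffer" TEMPORARY);
BEGIN
  IF TmpOrder.FINDSET THEN
    REPEAT
      // create the real records here
    UNTIL TmpOrder.NEXT = 0;
END;
```

For the Classic client you would swap the DotNet variables for 'Microsoft XML' Automation variables in ImportFile; ProcessBuffer stays identical.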
Some other tricks I use:
-Having a form/page on the staging tables in which one can write data manually to test the business logic without the technical details of the XML.
-I also create code (in general an XML-port) to EXPORT the data from the table. After I exported the data, I try to import it again to see if it works and I don't need to create an XML-file manually.
-If you have webservices, you can test them using powershell: http://www.mibuso.com/howtoinfo.asp?FileID=24
Regards, Alain Krikilion
No PM, please use the forum. || May the <SOLVED>-attribute be in your title!
-
+1.
Actually I am working on a series of design patterns for interfacing. The first one is now in review with the team.
This video explains how to work with temporary tables in Dynamics NAV.
https://www.youtube.com/watch?v=QHn5oEO ... yHpsVN0U_w
-
are you guys using *one* generic staging table or one for each xml schema / business case / etc.?
A generic solution would be great... but I think it is hard to implement/maintain, since some XML structures are complicated as hell...
-
I create a staging table per XML.
If used as temporary tables, staging tables are free.
They are the perfect way to structure your code.
Actually they are like classes in C#. Makes life easier.
The databases I work with have dozens of temporary tables. Actually that becomes the hard part after a while: what the F*** did I use this for?
-
I have a codeunit with helper functions that, for example, read a list of all sales headers from an external source (ADO/SQL, not XML, but the principle is the same) into a temporary Sales Header table, or read all sales lines for a given order number into a temporary Sales Line table. I try to use the fields for the same purposes they serve in the real tables, but sometimes I use them creatively: e.g. the other software has a shipment surcharge per line, which I put into some field such as Unit Cost. It would be cleaner to define new tables for this, but they cost 600 euro per 10, as it is no add-on, so nope.
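A hedged C/AL sketch of such a helper; the ADO plumbing is elided to a comment, and the loop body shows only the temp-table side (all values shown are placeholders):

```
PROCEDURE LoadSalesLines(OrderNo : Code[20]; VAR TmpSalesLine : Record "Sales Line" TEMPORARY);
VAR
  NextLineNo : Integer;
BEGIN
  TmpSalesLine.RESET;
  TmpSalesLine.DELETEALL;
  // open the ADO recordset filtered on OrderNo here; then, per external row:
  NextLineNo += 10000;
  TmpSalesLine.INIT;
  TmpSalesLine."Document Type" := TmpSalesLine."Document Type"::Order;
  TmpSalesLine."Document No." := OrderNo;
  TmpSalesLine."Line No." := NextLineNo;
  // creative field reuse: the external per-line shipment surcharge is
  // parked in "Unit Cost" instead of paying for a new table object
  TmpSalesLine."Unit Cost" := 0; // surcharge value from the recordset goes here
  TmpSalesLine.INSERT;
END;
```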
Then I have a different codeunit with business logic functions that simply take these temp tables and process them, creating real records. For example it will turn a TempSalesLine into a Sales Line more or less the same way, and it will take the shipment surcharge I have put into the Unit Cost field and make another Sales Line for that.
-
Mark Brummel wrote:
If used temporary staging tables are for free.
It seems to me it no longer works in 2013R2. I had a ClosedXML Excel Buffer outside my licence range; as it is only a temp table, I thought it was okay. Nope, I got the no-permission error message. I think they closed this "loophole".
On second thought, maybe it was about calling functions on it, not putting data into it...
-
I think using a strict flat XML format and a serialization function RecRef2XML is the best option in this case.
You can define RecRef2XML to represent any record as
<r t="@table_name">
  <f n="@field_name" value="123"/>
  ...
  <f .../>
</r>
and then convert the flat XML from/to any complex XML format using other tools, BizTalk for example.
RecRef2XML can work without any setup tables, but using a mapping table has more benefits:
1. Export only the necessary fields; as a result, you reduce handling time and XML size.
2. Define FK-PK and header-lines relations between tables and use recursive export; as a result, you export the whole integral part of the data, no matter in which table the data is located. For example, exporting a sales header will recursively export mapped directories such as the customer record and all linked lines such as sales lines; exporting any sales line will cause the item record to be exported, and so on...
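A minimal C/AL sketch of such a RecRef2XML function over RecordRef/FieldRef (DotNet flavour; the mapping-table filtering and the recursive export are left out, and all parameter names are illustrative):

```
PROCEDURE RecRef2XML(RecRef : RecordRef; XmlDoc : DotNet "System.Xml.XmlDocument"; Parent : DotNet "System.Xml.XmlNode");
VAR
  FldRef : FieldRef;
  RecElem : DotNet "System.Xml.XmlElement";
  FldElem : DotNet "System.Xml.XmlElement";
  i : Integer;
BEGIN
  // <r t="Table Name"> wrapper for the record
  RecElem := XmlDoc.CreateElement('r');
  RecElem.SetAttribute('t', RecRef.NAME);
  Parent.AppendChild(RecElem);
  // one <f n="Field Name" value="..."/> per field
  FOR i := 1 TO RecRef.FIELDCOUNT DO BEGIN
    FldRef := RecRef.FIELDINDEX(i);
    FldElem := XmlDoc.CreateElement('f');
    FldElem.SetAttribute('n', FldRef.NAME);
    FldElem.SetAttribute('value', FORMAT(FldRef.VALUE));
    RecElem.AppendChild(FldElem);
  END;
END;
```

With a mapping table you would filter the field loop to the mapped fields only and, per FK-PK relation, call RecRef2XML again on the related record.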
However, XML handling (mapping-serialization-deserialization) usually takes more time than ADO.
That's why we use custom mapping functionality and ADO Recordsets to integrate Nav databases with various corporate non-Nav SQL databases and a master-data Nav database, and for document exchange between different Nav databases (for example, a sales header from a central database can be imported as a purchase order in a branch database).
A typical synchronization scenario can include queries to different databases; for example, you can start a Purchase Header synchronization, then recursively import sales lines as purchase lines from one Nav database, and during every line import synchronize the item of that line with the master-data Nav database.
Looking for part-time work. Nav, T-SQL.
-
<?xml version="1.0" encoding="utf-8" ?>
<suite>
<testcase id="001" kind="bvt">
<inputs>
<arg1>4</arg1>
<arg2>7</arg2>
</inputs>
<expected>11.00</expected>
</testcase>
<testcase id="002" kind="drt">
<inputs>
<arg1>9</arg1>
<arg2>6</arg2>
</inputs>
<expected>15.00</expected>
</testcase>
<testcase id="003" kind="bvt">
<inputs>
<arg1>5</arg1>
<arg2>8</arg2>
</inputs>
<expected>13.00</expected>
</testcase>
</suite>
This will help you.
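Hedged sketch (DotNet Interop) of reading that suite; the element names come from the sample file above, while the file name and variable names are assumptions:

```
VAR
  XmlDoc : DotNet "System.Xml.XmlDocument";
  CaseList : DotNet "System.Xml.XmlNodeList";
  CaseNode : DotNet "System.Xml.XmlNode";
  i : Integer;
  Arg1 : Integer;
  Arg2 : Integer;
  Expected : Decimal;
BEGIN
  XmlDoc := XmlDoc.XmlDocument;
  XmlDoc.Load('suite.xml'); // assumed file name
  CaseList := XmlDoc.SelectNodes('/suite/testcase');
  FOR i := 0 TO CaseList.Count - 1 DO BEGIN
    CaseNode := CaseList.Item(i);
    EVALUATE(Arg1, CaseNode.SelectSingleNode('inputs/arg1').InnerText);
    EVALUATE(Arg2, CaseNode.SelectSingleNode('inputs/arg2').InnerText);
    EVALUATE(Expected, CaseNode.SelectSingleNode('expected').InnerText);
    // write into a staging record here instead of processing directly
  END;
END;
```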