Let's assume we have the following scenario:
- We want to handle complex XML files (export and import; XMLports have limitations, so they are not an option in this case)
- We want to separate the business code from the technical code
- We want to implement this both for older NAV clients (Classic COM Automation) and for newer ones (DotNet Interop)
My question is: how do you structure the code so that the business logic is completely separated from the technical XML stuff?
When I look at the Microsoft code, I see that the business code and the technical code (XML DOM Automation) are always linked together, because CurrXMLNodes is used as a parameter. I want to avoid that, because I want to be able to make changes in the business code object and then import that object into other NAV versions without worrying about COM Automation or DotNet Interop.
Comments
So the parser will write that data into a staging/temp table. Then the business logic code will process that table data.
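A minimal sketch of that split, assuming a made-up "XML Import Staging" table and made-up codeunit functions (the Classic/COM flavor is shown here; for the RTC you would only swap this parser codeunit over to DotNet System.Xml, the business codeunit stays untouched):

// Globals in the parser codeunit:
// XmlDoc   : Automation 'Microsoft XML, v6.0'.DOMDocument60
// NodeList : Automation 'Microsoft XML, v6.0'.IXMLDOMNodeList
// Node     : Automation 'Microsoft XML, v6.0'.IXMLDOMNode
PROCEDURE ImportFile(FileName : Text);
VAR
  StagingLine : Record "XML Import Staging";
  i : Integer;
BEGIN
  IF ISCLEAR(XmlDoc) THEN
    CREATE(XmlDoc);
  XmlDoc.async := FALSE;
  XmlDoc.load(FileName);
  NodeList := XmlDoc.selectNodes('/Order/Line');
  FOR i := 0 TO NodeList.length - 1 DO BEGIN
    Node := NodeList.item(i);
    StagingLine.INIT;
    StagingLine."Entry No." := i + 1;
    StagingLine."Item No." := Node.selectSingleNode('ItemNo').text;
    EVALUATE(StagingLine.Quantity,Node.selectSingleNode('Qty').text);
    StagingLine.INSERT;
  END;
END;

// In a separate business codeunit - no XML, no Automation, no DotNet at all:
PROCEDURE ProcessStaging();
VAR
  StagingLine : Record "XML Import Staging";
BEGIN
  IF StagingLine.FINDSET THEN
    REPEAT
      // validate and turn the staging lines into real documents here
    UNTIL StagingLine.NEXT = 0;
END;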
Independent Consultant/Developer
blog: https://dynamicsuser.net/nav/b/ara3n
Some other tricks I use:
-Having a form/page on the staging tables in which you can enter data manually, to test the business logic without the technical details of the XML.
-I also create code (in general an XMLport) to EXPORT the data from the table. After exporting the data, I try to import it again to check that it works, so I don't need to create an XML file manually (see the sketch after this list).
-If you have webservices, you can test them using powershell : http://www.mibuso.com/howtoinfo.asp?FileID=24
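That round trip could be as small as this (a sketch only; 50000 and the file name are placeholders for your own staging XMLport and path):

PROCEDURE RoundTripTest();
VAR
  ExportFile : File;
  ImportFile : File;
  OutStr : OutStream;
  InStr : InStream;
BEGIN
  // export the current staging data with the XMLport...
  ExportFile.CREATE('C:\Temp\staging.xml');
  ExportFile.CREATEOUTSTREAM(OutStr);
  XMLPORT.EXPORT(50000,OutStr);
  ExportFile.CLOSE;

  // ...then read the same file straight back in; if this works, export and import agree on the format
  ImportFile.OPEN('C:\Temp\staging.xml');
  ImportFile.CREATEINSTREAM(InStr);
  XMLPORT.IMPORT(50000,InStr);
  ImportFile.CLOSE;
END;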
No PM, please use the forum. || May the <SOLVED>-attribute be in your title!
Actually I am working on a series of design patterns for interfacing. The first one is now in review with the team.
This video explains how to work with temporary tables in Dynamics NAV.
https://www.youtube.com/watch?v=QHn5oEO ... yHpsVN0U_w
A generic solution would be great... but I think it is hard to implement and maintain, since some XML structures are complicated as hell...
If they are only used as temporary, staging tables are free (they don't count against your license).
They are a perfect way to structure your code.
Actually they are like classes in C#. Makes life easier.
The databases I work with have dozens of temporary tables. Actually, that becomes the hard part after a while: what the F*** did I use this one for?
Then I have a different codeunit with business logic functions that simply take these temp tables and process them, creating real records. For example, it will turn a TempSalesLine into a Sales Line more or less the same way, and it will take the shipment surcharge I have put into the Unit Cost field and make another Sales Line for that. A sketch of that kind of function is below.
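Roughly like this, as a sketch only (the G/L account number for the surcharge and the copied fields are placeholders; a real version would transfer far more fields):

PROCEDURE CreateSalesLines(VAR TempSalesLine : TEMPORARY Record "Sales Line";SalesHeader : Record "Sales Header");
VAR
  SalesLine : Record "Sales Line";
  LineNo : Integer;
BEGIN
  IF TempSalesLine.FINDSET THEN
    REPEAT
      LineNo += 10000;
      SalesLine.INIT;
      SalesLine."Document Type" := SalesHeader."Document Type";
      SalesLine."Document No." := SalesHeader."No.";
      SalesLine."Line No." := LineNo;
      SalesLine.VALIDATE(Type,SalesLine.Type::Item);
      SalesLine.VALIDATE("No.",TempSalesLine."No.");
      SalesLine.VALIDATE(Quantity,TempSalesLine.Quantity);
      SalesLine.INSERT(TRUE);

      // the shipment surcharge was parked in the Unit Cost field of the temp line
      IF TempSalesLine."Unit Cost" <> 0 THEN BEGIN
        LineNo += 10000;
        SalesLine.INIT;
        SalesLine."Document Type" := SalesHeader."Document Type";
        SalesLine."Document No." := SalesHeader."No.";
        SalesLine."Line No." := LineNo;
        SalesLine.VALIDATE(Type,SalesLine.Type::"G/L Account");
        SalesLine.VALIDATE("No.",'6810');  // placeholder surcharge account
        SalesLine.VALIDATE(Quantity,1);
        SalesLine.VALIDATE("Unit Price",TempSalesLine."Unit Cost");
        SalesLine.INSERT(TRUE);
      END;
    UNTIL TempSalesLine.NEXT = 0;
END;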
It seems to me this no longer works in 2013 R2. I had a ClosedXML Excel Buffer outside my licence range; as it is only a temp table, I thought it was okay. Nope, I got the "no permission" error message. I think they closed this "loophole".
On second thought, maybe it was about calling functions on it, not about putting data into it...
You can define RecRef2XML to represent any record as flat XML, and then convert that flat XML from/to any complex XML format using other tools, BizTalk for example.
RecRef2XML can work without any setup tables, but using a mapping table has more benefits (a minimal sketch of the RecRef-to-XML part follows the list):
1. Export only the necessary fields; as a result, you reduce handling time and XML size.
2. Define FK-PK and header-lines relations between tables and use recursive export; as a result, you export the whole integral set of data, no matter which table it is located in. For example, exporting a sales header will recursively export mapped master data such as the customer record and all linked lines such as the sales lines; exporting any sales line will cause the item record to be exported, and so on...
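RecRef2XML is not a standard function, but its core could look like this (DotNet flavor, no mapping table, every field exported; field numbers are used as element names to keep the XML valid):

PROCEDURE RecRef2XML(RecRef : RecordRef;VAR XmlDoc : DotNet "System.Xml.XmlDocument") : DotNet "System.Xml.XmlNode";
VAR
  FldRef : FieldRef;
  RecordNode : DotNet "System.Xml.XmlNode";
  FieldNode : DotNet "System.Xml.XmlNode";
  i : Integer;
BEGIN
  RecordNode := XmlDoc.CreateElement(DELCHR(RecRef.NAME,'=',' '));
  FOR i := 1 TO RecRef.FIELDCOUNT DO BEGIN
    FldRef := RecRef.FIELDINDEX(i);
    FieldNode := XmlDoc.CreateElement(STRSUBSTNO('Field%1',FldRef.NUMBER));
    FieldNode.InnerText := FORMAT(FldRef.VALUE,0,9);  // format 9 gives XML-friendly values
    RecordNode.AppendChild(FieldNode);
  END;
  EXIT(RecordNode);
END;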
However, XML handling (mapping, serialization, deserialization) usually takes more time than ADO.
That's why we use custom mapping functionality and ADO Recordsets to integrate NAV databases with various corporate non-NAV SQL databases and with a master-data NAV database, and for document exchange between different NAV databases (for example, a sales header from a central database can be imported as a purchase order in a branch database).
A typical synchronization scenario can include queries to different databases; for example, you can start a purchase header synchronization, then recursively import the sales lines as purchase lines from another NAV database, and during every line import synchronize that line's item with the master-data NAV database. An ADO read sketch is below.
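For reference, reading from another SQL database over ADO looks roughly like this (a sketch only; the connection string, the query and the field name are placeholders, and the mapping into a staging table is left out):

// Globals:
// ADOConnection : Automation 'Microsoft ActiveX Data Objects 2.8 Library'.Connection
// ADORecordset  : Automation 'Microsoft ActiveX Data Objects 2.8 Library'.Recordset
PROCEDURE ReadRemoteSalesLines();
VAR
  SQLText : Text;
  ItemNo : Text;
BEGIN
  IF ISCLEAR(ADOConnection) THEN
    CREATE(ADOConnection);
  ADOConnection.Open('Provider=SQLOLEDB;Data Source=CENTRALSRV;Initial Catalog=CentralNav;Integrated Security=SSPI');
  IF ISCLEAR(ADORecordset) THEN
    CREATE(ADORecordset);
  SQLText := 'SELECT [No_],[Quantity] FROM [CRONUS$Sales Line] WHERE [Document No_]=''ORD-1001''';
  ADORecordset.Open(SQLText,ADOConnection);
  WHILE NOT ADORecordset.EOF DO BEGIN
    ItemNo := FORMAT(ADORecordset.Fields.Item('No_').Value);
    // ...map the values into the purchase line staging table here...
    ADORecordset.MoveNext;
  END;
  ADORecordset.Close;
  ADOConnection.Close;
END;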
Nav, T-SQL.
<suite>
<testcase id="001" kind="bvt">
<inputs>
<arg1>4</arg1>
<arg2>7</arg2>
</inputs>
<expected>11.00</expected>
</testcase>
<testcase id="002" kind="drt">
<inputs>
<arg1>9</arg1>
<arg2>6</arg2>
</inputs>
<expected>15.00</expected>
</testcase>
<testcase id="003" kind="bvt">
<inputs>
<arg1>5</arg1>
<arg2>8</arg2>
</inputs>
<expected>13.00</expected>
</testcase>
</suite>
This will help you.
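If you want to load that sample with the same parser/staging split discussed above, the technical codeunit could look something like this (the "Test Case Staging" table and its fields are invented for the example; DotNet flavor shown):

PROCEDURE ImportSuite(FileName : Text);
VAR
  XmlDoc : DotNet "System.Xml.XmlDocument";
  TestCaseNodes : DotNet "System.Xml.XmlNodeList";
  TestCaseNode : DotNet "System.Xml.XmlNode";
  Staging : Record "Test Case Staging";
  i : Integer;
BEGIN
  XmlDoc := XmlDoc.XmlDocument;
  XmlDoc.Load(FileName);
  TestCaseNodes := XmlDoc.SelectNodes('/suite/testcase');
  FOR i := 0 TO TestCaseNodes.Count - 1 DO BEGIN
    TestCaseNode := TestCaseNodes.Item(i);
    Staging.INIT;
    Staging."Case ID" := TestCaseNode.Attributes.GetNamedItem('id').Value;
    Staging.Kind := TestCaseNode.Attributes.GetNamedItem('kind').Value;
    EVALUATE(Staging."Arg 1",TestCaseNode.SelectSingleNode('inputs/arg1').InnerText);
    EVALUATE(Staging."Arg 2",TestCaseNode.SelectSingleNode('inputs/arg2').InnerText);
    EVALUATE(Staging.Expected,TestCaseNode.SelectSingleNode('expected').InnerText);
    Staging.INSERT;
  END;
  // a separate business codeunit can then read "Test Case Staging" and run/verify the cases
END;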