
T-SQL to process a high volume of data?

Jut Member Posts: 72
edited 2007-12-14 in SQL General
At first, I sent this text to one well-known :) Mibuso user who had written in the thread "Replacing CAL with SQL" about his experience of using T-SQL in combination with NAV to process massive amounts of data.

In fact, I am currently working on a similar project. The customer has a relatively high posting load (around 150,000 posted orders per week). Because a lot of information is needed on every customer (e.g. the sum of all paid invoices, the ratio of payments to invoices, etc.), we created a sort of "recording function" that calculates all this information in a nightly batch and writes the sums and ratios directly into the customer table. Unfortunately our time frame is very limited, and the batch currently takes too much time. We have already implemented many code optimizations in NAV, e.g. only recalculating the values that changed or were newly created since the last execution of the batch. Unfortunately the customer also needs all this information limited to transactions within one year (for example, all payments made during the last 12 months).

After having read a lot about performing table reads and writes directly in SQL, I thought about using a stored procedure (or something similar) to do the calculation and update of the changed customers. Does anyone have experience with this sort of programming, and what were the results? As far as I can tell, I would need cursors iterating over my source table (Detailed Cust. Ledg. Entry, in order to avoid using FlowFields like Amount) and over a customer-specific table with about 3 million records originating from the old system. Sadly, I see no way to process this data in a more set-based way, which is why I could only think of the cursor. Maybe I'm wrong?
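
To make it a bit more concrete, here is a rough sketch of the cursor approach I have in mind (sketch only: the table names follow the default NAV-on-SQL naming with a made-up company prefix CRONUS$, and the target field SumPaidInvoices on the customer table is hypothetical):

    DECLARE @CustNo VARCHAR(20), @Sum DECIMAL(38, 20)

    DECLARE CustCursor CURSOR FAST_FORWARD FOR
        SELECT [No_] FROM [CRONUS$Customer]

    OPEN CustCursor
    FETCH NEXT FROM CustCursor INTO @CustNo

    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- sum this customer's detailed entries of the last 12 months
        SELECT @Sum = SUM([Amount])
        FROM [CRONUS$Detailed Cust_ Ledg_ Entry]
        WHERE [Customer No_] = @CustNo
          AND [Posting Date] >= DATEADD(month, -12, GETDATE())

        -- write the result directly into the customer table
        UPDATE [CRONUS$Customer]
        SET [SumPaidInvoices] = ISNULL(@Sum, 0)
        WHERE [No_] = @CustNo

        FETCH NEXT FROM CustCursor INTO @CustNo
    END

    CLOSE CustCursor
    DEALLOCATE CustCursor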

I would be very glad to benefit a little from your experiences!

Thanks a lot in advance!

Jut from Kiel in Germany

Comments

  • Marije_Brummel Member, Moderators Design Patterns Posts: 4,262
    If you are going to use cursors in your stored procedure, then I am not sure you will save a lot of time.

    In T-SQL you can do things like generating temp tables on the fly and working with covering indexes.
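
    For example, instead of one lookup per customer, you could aggregate everything in a single set-based pass into a temp table and join it back. Just a sketch, reusing the hypothetical table and field names from the cursor example above:

        -- one set-based pass over the detailed entries
        SELECT [Customer No_] AS CustNo,
               SUM([Amount])  AS SumAmount
        INTO #CustSums
        FROM [CRONUS$Detailed Cust_ Ledg_ Entry]
        WHERE [Posting Date] >= DATEADD(month, -12, GETDATE())
        GROUP BY [Customer No_]

        -- a covering index on the source table lets the pass above
        -- be answered from the index alone, e.g.:
        -- CREATE INDEX IX_DCLE_Covering
        --     ON [CRONUS$Detailed Cust_ Ledg_ Entry]
        --        ([Posting Date], [Customer No_], [Amount])

        -- one UPDATE for all customers instead of one per customer
        UPDATE c
        SET c.[SumPaidInvoices] = s.SumAmount
        FROM [CRONUS$Customer] c
        JOIN #CustSums s ON s.CustNo = c.[No_]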

    You might even consider creating smaller tables that contain only the data to be calculated and replicating them using SSIS.

    I have worked with a highly experienced T-SQL engineer who did miracles. If you have a NAV engineer manage these guys, you should be able to deliver a good project.

    Good luck!
  • cCode Member Posts: 6
    If I understood you right, you are doing all the summing in NAV.

    Have you considered exporting all the needed data into a native SQL Server environment, doing the summing there, and then importing the results back into NAV? Maybe using a different server, or even a different database such as Sybase IQ, which is optimized for this kind of job.

    My experience with huge data warehouses (billions of rows) and data marts is that summing millions of rows is not a big issue, depending of course on resources. Using the bcp utility (against a database with the simple recovery model) to load the NAV export file should get the job done relatively quickly. After summing, use bcp again to export the sums, and finally a dataport to import the sums back into NAV.
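
    Roughly like this from the command line (just a sketch: server, database, table and file names are made up, and -T assumes a trusted connection):

        rem load the NAV export file into a work database
        rem (the simple recovery model keeps the bulk load fast)
        bcp "WorkDB.dbo.Entries" in nav_export.txt -c -T -S CALCSERVER

        rem after summing inside WorkDB, export the results ...
        bcp "WorkDB.dbo.CustSums" out custsums.txt -c -T -S CALCSERVER

        rem ... and import custsums.txt back into NAV with a dataport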

    I think you should gain overall, despite the overhead of moving data back and forth between the native SQL Server and NAV environments. Doing all the summing in NAV always carries some overhead, and with a bigger data mass that cost becomes considerable.