Copying data between NAV tables using SQL

radek.bb Member Posts: 49
edited 2013-11-29 in SQL General
Hi guys,

I want to copy data between two copies of the same object (create replica of the table) using SQL statement.
Something like:
insert into [Test Table 02] select * from [Test Table 01]
It gives an error:
Cannot insert an explicit value into a timestamp column. Use INSERT with a column list to exclude the timestamp column, or insert a DEFAULT into the timestamp column.

As suggested by the error message I can change this code into
insert into [Test Table 02] (field1, field2,...,field100) select field1, field2,...,field100 from [Test Table 01]
or
insert into [Test Table 02] select DEFAULT,field1, field2,...,field100 from [Test Table 01]

But I don't like those... I need a solution without listing the columns.
According to the help, for example, when you use TRANSFERFIELDS(..., FALSE) in NAV the timestamp value will not change. How do they achieve this?

Is there any setting which forces SQL to accept value in timestamp field?

Thanks

Answers

  • ara3n Member Posts: 9,256
    You need to specify all the fields in SQL. Why are you doing this through SQL?
    Ahmed Rashed Amini
    Independent Consultant/Developer


    blog: https://dynamicsuser.net/nav/b/ara3n
  • radek.bb Member Posts: 49
    I expect it to be faster than using RecRef in C/AL.

    I just prepare SQL_Statement in NAV code and then execute it as
    ADO.Execute(SQL_Statement)

    But the problem is that ADO.Execute fails when the statement is longer than 2,000 characters - and there are tables where a single column listing (you need to list the columns twice per statement) is longer than 5,000 characters... so it sometimes happens that the whole statement is about 12,000 characters...

    That's why I need a solution without column names - a generic one.
  • kriki Member, Moderator Posts: 9,112
    It is faster, but you can't use SELECT *.

    So the best (but slower) way is using recordref.
    Regards,Alain Krikilion
    No PM,please use the forum. || May the <SOLVED>-attribute be in your title!


  • radek.bb Member Posts: 49
    This is why I am asking here - in search for a clever trick :)

    I got for example:
    SELECT * INTO #temp_1 FROM [Test Table 01] 
    ALTER TABLE #temp_1 DROP COLUMN [timestamp]
    INSERT INTO [Test Table 02] (column1,column2,...,column100) SELECT * FROM #temp_1
    
    As you can see, by using a temporary table I managed to reduce the statement so that only one column listing is required (for the second one I can now use SELECT *).
    But the first column listing (column1,column2,...,column100) is still required - I need something clever to remove it as well...
    :-k
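    For what it's worth, the remaining column list could itself be generated from the temp table's metadata with dynamic SQL. A minimal sketch, assuming SQL Server 2008 or later (table names taken from this thread, untested):

    ```sql
    -- Build the column list of #temp_1 from tempdb metadata, so no column
    -- names are hard-coded. #temp_1 has already had its timestamp column
    -- dropped, as in the statements above.
    DECLARE @cols nvarchar(max) = N'';

    SELECT @cols = @cols + QUOTENAME(c.name) + N','
    FROM tempdb.sys.columns c
    WHERE c.object_id = OBJECT_ID(N'tempdb..#temp_1');

    SET @cols = LEFT(@cols, LEN(@cols) - 1);  -- strip the trailing comma

    DECLARE @sql nvarchar(max) =
        N'INSERT INTO [Test Table 02] (' + @cols + N') SELECT ' + @cols + N' FROM #temp_1';
    EXEC sp_executesql @sql;
    ```

    Since the statement is assembled server-side in an nvarchar(max) variable, its length is no longer constrained by the 2,000-character ADO limit mentioned earlier.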
  • lynge Member Posts: 85
    Can you drop the destination table before each copy?

    I believe a "SELECT * INTO x2 FROM x1..." will keep timestamps in T-SQL if the x2 table does not exist. It will be auto-created with the same table structure as x1.

    Another possible way (not tested) is to change the data type of the timestamp column in x2 to binary(8), which should be how a timestamp is stored anyway.
    But I have no clue how NAV will react to such a table, and you will probably get into problems if you modify rows later (the binary(8) will not be updated).

    Regards
    Gert Lynge
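    A minimal sketch of the first idea, with the table names from this thread (untested; the comments note what SELECT ... INTO does not copy):

    ```sql
    -- Drop and recreate [Test Table 02] as a structural copy of [Test Table 01].
    -- SELECT ... INTO copies the columns and data, but not primary keys,
    -- indexes, or constraints - which NAV normally expects on its tables.
    IF OBJECT_ID(N'[Test Table 02]', N'U') IS NOT NULL
        DROP TABLE [Test Table 02];

    SELECT *
    INTO [Test Table 02]
    FROM [Test Table 01];
    ```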
  • ndbcs Member Posts: 33
    Create a stored proc and pass the source and destination tables as parameters.
    Inside the SP you can build the SQL statement in a varchar(max) variable and run it with exec or sp_executesql.
    So the length of your statement doesn't matter.
    You can call the SP from NAV via ADO.

    I've also seen solutions inside NAV using a 'Microsoft Script Control 1.0' ScriptControl automation.
    You can build a VBScript that uses ADO in an instance of the ScriptControl variable.
    With the function ScriptControl.AddCode you can add as much code as you need.

    I prefer the stored proc.

    regards
    ndbcs
  • bbrown Member Posts: 3,268
    Are you sure you really want to do this? You might want to consider Kriki's advice.

    Remember that database updates made by external calls are not under NAV's transaction control. That means if a NAV error occurs after you have run your external code, the changes made by NAV code are rolled back, but not those by the external code. This can leave your DB in an unexpected condition. That's not to say direct SQL can't be useful.
    There are no bugs - only undocumented features.
  • radek.bb Member Posts: 49
    lynge wrote:
    Can you drop the destination table before each copy?
    I believe a "SELECT * INTO x2 FROM x1..." will keep timestamps in T-SQL if the x2 table does not exist. It will be auto-created with the same table structure as x1.
    This is worth exploring...
    A table created this way loses some part of its structure and NAV no longer likes it - but maybe we can do something about this...
    lynge wrote:
    Another possible way (not tested) is to change the data type of the timestamp column in x2 to binary(8), which should be how a timestamp is stored anyway.
    But I have no clue how NAV will react to such a table, and you will probably get into problems if you modify rows later (the binary(8) will not be updated).
    SQL does not like this idea:
    Cannot alter column 'timestamp' because it is 'timestamp'.
    ndbcs wrote:
    Create a stored proc and parse source and destination table as a parameter.
    Inside of the SP you can build the sql statement in a varchar(max) variable and run it with exec or sp_executesql.
    So the length of your statement doesn't matter.
    You can call the SP from NAV via ADO.
    Yes... it is probably the safest option. I would prefer pure NAV code, but this could be an acceptable compromise... :)
    ndbcs wrote:
    I've also seen solutions inside NAV using 'Microsoft Script Control 1.0'.ScriptControl-Automation.
    You can build a vbscript using ADO in an instance of the ScriptControl variable.
    With the function ScriptControl.AddCode you can add as many code as you need.
    Maybe I will play with this idea sometime... but I also prefer stored proc ;)
    bbrown wrote:
    Are you sure you really want to do this? You might want to consider Kriki's advice....
    Yes :) It is a migration exercise - not a solution for daily operations/transactions. I am not worried about transaction control here.

    Thanks everyone! :)
  • ara3n Member Posts: 9,256
    Have you tried BULK INSERT?

    http://blog.sqlauthority.com/2008/02/06 ... ql-server/


    you don't need to specify the fields:
    BULK INSERT CSVTest
    FROM 'c:\csvtest.txt'
    WITH
    (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n'
    )
    GO
    
    Ahmed Rashed Amini
    Independent Consultant/Developer


    blog: https://dynamicsuser.net/nav/b/ara3n
  • rmv_RU Member Posts: 119
    I have written a SQL procedure and use it like TRANSFERFIELDS.
    ALTER proc [dbo].[system_TransferTable] 
    @from_table varchar(100)
    , @to_table varchar(100)
    
    as begin
    -- Transfers data from one table to another; columns are mapped by name.
    -- Only inserts are supported, and only from temporary source tables
    -- (as protection against incorrect calls).
    	if left(@from_table, 1) <> '#' begin
    		raiserror('system_TransferTable works only for temporary tables', 16, 1)
    		return
    	end
    	declare @sql varchar(4000)
    	declare @fields_list varchar(4000)
    	set @fields_list = ''
    	-- Collect the columns that exist in both tables; xtype 189 = timestamp,
    	-- which is excluded on both sides.
    	select @fields_list = @fields_list + '[' + t1.[name] + '],' from tempdb.dbo.syscolumns t1 
    	inner join syscolumns t2 on t1.name = t2.name
    	where t1.id = object_id('tempdb.dbo.' + @from_table) and t1.xtype <> 189 
    		and t2.id = object_id(@to_table) and t2.xtype <> 189 
    		
    	set @fields_list = left(@fields_list, len(@fields_list) - 1)
    	set @sql = 'insert into ' + @to_table + '(' + @fields_list + ') 
    	select ' + @fields_list + ' from ' + @from_table 
    	
    	print @sql
    	exec (@sql)
    
    end
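    A hypothetical call of the procedure above, using the table names from this thread (the procedure itself skips timestamp columns on both sides):

    ```sql
    -- Stage the source rows in a temp table first - the procedure only
    -- accepts temporary source tables - then let it map columns by name.
    SELECT * INTO #src FROM [Test Table 01];
    EXEC dbo.system_TransferTable '#src', '[Test Table 02]';
    DROP TABLE #src;
    ```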
    
    bbrown wrote:
    Are you sure you really want to do this? You might want to consider Kriki's advice.
    Remember that database updates made by external calls are not under NAV's transaction control. That means if a NAV error occurs after you have run your external code, the changes made by NAV code are rolled back, but not those by the external code. This can leave your DB in an unexpected condition. That's not to say direct SQL can't be useful.
    It's true, and I hope MS will create an interface to the current SQL connection in future versions. For now I have to use this trick to make changes via SQL in the same transaction as NAV.
    Looking for part-time work.
    Nav, T-SQL.