I am getting the above error when importing a text object file. This is a 4.0 SP3 native database. I searched here and PartnerSource and found no reference to this error.
It's the codeunit created by the SQL Migrate utility. I've been working on some changes. It's likely my changes that are causing the error but it would be helpful to know something about the error to track down the cause.
Internal Error 1291 in Module 19, :: 19-1291 #Err_DB_EvalDestBufferTooSmall DB_Err(1291)
This does seem to be related to some sort of buffer size limit. If I reduce my text file size (by removing code) the file imports. Is there a file size limit on importing a text file? My file is just shy of 3 MB. The file fails to import at 2762 KB but if I reduce it to 2761 KB it imports.
No, there is no limit. You can import a text file containing all objects at once without a problem (around 50 MB). It seems there is another problem. Did you split the file, or just remove something from it? I assume you split it. Could you import both parts without a problem?
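If splitting the export is the workaround, it doesn't have to be done by hand. A hedged sketch (not from this thread) that splits a C/SIDE text export into one file per object, assuming the standard export layout where each object's section starts with a line beginning `OBJECT `:

```python
# Illustrative only: split a NAV object text export into per-object files
# so an oversized export can be imported piecewise.
# Assumption: each object starts with a line beginning "OBJECT ", as in
# standard C/SIDE text exports.
import os

def split_nav_export(export_path, out_dir):
    os.makedirs(out_dir, exist_ok=True)
    current_lines, written = [], []

    def flush():
        if not current_lines:
            return
        # Name the chunk after its header line,
        # e.g. "OBJECT_Codeunit_50000_Migrate.txt"
        header = current_lines[0].strip().replace(" ", "_")
        path = os.path.join(out_dir, header[:80] + ".txt")
        with open(path, "w") as f:
            f.writelines(current_lines)
        written.append(path)
        current_lines.clear()

    with open(export_path) as f:
        for line in f:
            if line.startswith("OBJECT "):
                flush()  # previous object's section is complete
            current_lines.append(line)
        flush()  # write the final object
    return written
```

Importing the resulting chunks one at a time also helps isolate which object actually triggers the error.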
"You have reached the allowed number of permissions"
It appears you can only assign 80 permissions to an object. In my code I was running through table objects and assigning permissions to the codeunit. When I reduced the number it worked: it would work at 80 but not 81. When I tried to add an 81st permission manually (after importing and compiling) I got the above error.
It's looking like this will be a "you can't do that".
I'm curious why you are doing the mod, considering that it only needs to run once?
I'm working toward a conversion where the customer does not have enough downtime to run the process in one pass. I'm also expecting a number of similar conversions in the near future. I'm modifying the process for two things: 1) the ability to run it one company at a time, and 2) the ability to log where it ended on the ledger tables, so the next run can process just the new data. This works only for tables that use an increasing integer as the primary key, but since those account for a large amount of the data it should help.
My last goal was to allow the user to run with their license. But it looks like I will need to skip that idea.
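The resume-where-it-ended idea above can be sketched in a few lines. This is an illustrative model only (the names `checkpoints`, `process_new_entries`, and `handle` are hypothetical, not from the migration codeunit): persist the last processed entry number per company and table, and skip everything at or below it on the next run. It relies on the stated assumption that the ledger table's "Entry No." primary key only ever increases.

```python
# Hypothetical sketch of checkpoint-based incremental processing.
# In NAV this state would live in a setup table; a dict stands in here.
checkpoints = {}  # (company, table) -> last "Entry No." processed

def process_new_entries(company, table, entries, handle):
    """Process only entries added since the last run, then advance the checkpoint.

    entries: mapping of "Entry No." -> record; assumed append-only with
    strictly increasing integer keys, as with NAV ledger tables.
    """
    last = checkpoints.get((company, table), 0)
    for entry_no, record in sorted(entries.items()):
        if entry_no <= last:
            continue  # already handled in an earlier run
        handle(record)
        last = entry_no
    checkpoints[(company, table)] = last
```

Running the function a second time after new entries appear touches only the new ones, which is what makes a multi-pass conversion feasible within short downtime windows.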
You can run the process on a copy of production and identify the records, then fix those in production. You can have a separate routine that looks only at the new data added after the copy of production was made. That way you don't have to run it twice for all entries, which would eliminate a lot of time.
Ahmed Rashed Amini
Independent Consultant/Developer
I have used that approach before, but usually this process only results in finding a few bad records, maybe a few hundred. A test run I did a few months ago found over 10,000 issues in one company. A bit much to update manually based on a run in a copy.
Comments
is it from same version of NAV?
Independent Consultant/Developer
blog: https://dynamicsuser.net/nav/b/ara3n
http://wiki.dynamicsbook.com/index.php? ... r_messages
MVP - Dynamics NAV
My BLOG
NAVERTICA a.s.