Issue with Importing very large BLOBs

igor.chladil Member Posts: 28
Hello,

I have a NAV 5 database on SQL Server and I am trying to import a 260 MB XML file into a BLOB field.
The import crashes with a Navision error I have never seen before:
There is not enough memory to execute this function.
If you work in a single-user installation, you can try reducing the value of the 'cache' program property.
You can find information about how to optimize the operating system in the documentation for your operating system.

I have reproduced this error on 2 independent machines.

Has anybody seen this error before and is there any solution to it?

Thanks
Igor

Comments

  • ara3n Member Posts: 9,258
    There is a limit on how much data you can store on temporary records, and I'm guessing this is related.

    There is no workaround that I know of. You can contact MS and see if they have any solution.
    Ahmed Rashed Amini
    Independent Consultant/Developer


    blog: https://dynamicsuser.net/nav/b/ara3n
  • David_Singleton Member Posts: 5,479
    But the limit is 2 GB, so this isn't really close.

    That error message, by the way, is very common, and it normally means recursion, i.e. a function calling itself to infinity. Maybe look at your code; it may in fact not be the BLOB causing the issue. Also, code like:
    WHILE MyCondition < 100 DO
      MyCondition := 0;  // the condition never changes, so the loop never terminates
    

    will also cause an infinite loop.

    Let's say you were using a WHILE loop to look for files on a drive in your import.
    David Singleton
  • ara3n Member Posts: 9,258
    David, try to load a 250 MB file into Navision.

    For example, open the Company Information card and click Functions -> Import Picture. Select any file of that size (it doesn't need to be a picture) and load it; see the sketch at the end of this post.

    You'll get the error, and NAV will crash.
    Ahmed Rashed Amini
    Independent Consultant/Developer


    blog: https://dynamicsuser.net/nav/b/ara3n
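
    A minimal C/AL sketch of that repro (the file path is hypothetical); as later replies suggest, the classic client appears to buffer the whole file in memory before the record is ever written:

    // CompanyInfo : Record "Company Information";  (assumed local variable)
    CompanyInfo.GET;
    CompanyInfo.Picture.IMPORT('C:\temp\largefile.xml');  // BLOB.IMPORT reads the entire file into client memory
    CompanyInfo.MODIFY;                                    // the BLOB is only sent to SQL here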
  • igor.chladil Member Posts: 28
    Thanks for your replies. I will bypass this issue by not importing such large files for now.

    I have actually tried it several times, and quite strangely, on one of the attempts I was able to import a 700 MB file. However, I was not able to delete it from the table manually afterwards; I got the same error message as during the import.

    Regards,
    Igor
  • David_Singleton Member Posts: 5,479
    ara3n wrote:
    David, try to load a 250 MB file into Navision.

    For example, open the Company Information card and click Functions -> Import Picture. Select any file of that size (it doesn't need to be a picture) and load it.

    You'll get the error, and NAV will crash.

    :oops: You know, I don't think I ever tried it. #-o I am sure that you are right.

    Surely then this is a bug, since the manuals have always stated 2 GB as the limit.
    David Singleton
  • jlandeen Member Posts: 524
    Are you using the import/export functions of the BLOB, or are you using an InStream object and the CREATEINSTREAM function of the BLOB field? I haven't worked with such a large file in a BLOB before, but I would think it would work better to read through the file manually in binary form and stream it into the BLOB (see the sketch below).
    Jeff Landeen - Sr. Consultant
    Epimatic Corp.

    http://www.epimatic.com
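
    A minimal sketch of that streaming approach; the record Rec, the BLOB field "XML Data" and the FileName parameter are assumptions for illustration:

    // Assumed local variables: ImportFile : File; InStr : InStream; OutStr : OutStream;
    ImportFile.OPEN(FileName);
    ImportFile.CREATEINSTREAM(InStr);
    Rec."XML Data".CREATEOUTSTREAM(OutStr);
    COPYSTREAM(OutStr, InStr);   // copies the file contents into the BLOB field
    Rec.INSERT;                  // the data only reaches SQL on INSERT/MODIFY
    ImportFile.CLOSE;

    As the later replies note, this still seems to hit the same memory limit, because the BLOB is buffered in client memory until the record is written.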
  • jlandeen Member Posts: 524
    One other thing... there may be some NAV functionality applying a different constraint (e.g. the Import/Export function). While the NAV field itself may support 2 GB of data, it's another thing to assume that all of the components/functions along the way have the same limits.

    I find quite often, when working with long data strings in XML and other parts of NAV, that the environment has problems with lines of data that exceed 1024 characters, and I have to manually use streams and other data types to access all the data - this may be something similar (see the chunked-read sketch below).
    Jeff Landeen - Sr. Consultant
    Epimatic Corp.

    http://www.epimatic.com
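
    As an illustration of that workaround, a sketch of reading a BLOB in chunks so no single text variable has to exceed 1024 characters (the record and the "XML Data" field are assumptions):

    // Assumed local variables: InStr : InStream; Buffer : Text[1024]; BytesRead : Integer;
    Rec.CALCFIELDS("XML Data");               // BLOB fields must be loaded before reading
    Rec."XML Data".CREATEINSTREAM(InStr);
    WHILE NOT InStr.EOS DO BEGIN
      BytesRead := InStr.READ(Buffer, 1024);  // read at most 1024 characters per call
      // process Buffer here
    END;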
  • igor.chladil Member Posts: 28
    Hi Jeff,

    I've actually tried both the IMPORT and COPYSTREAM functionality. But as Rashed has suggested, the issue is that the BLOB seems to be held in NAV memory first and only stored in SQL when INSERT/MODIFY runs. Therefore, if there are any temporary memory limits in NAV - and there seem to be some - they are applied here.

    BTW, these memory limits don't seem to be related to physical/virtual memory size.

    Regards, Igor
  • jlandeen Member Posts: 524
    Ah, OK, I think I understand now - it's the MODIFY statement where NAV tries to work with the complete record in memory, but it can't.

    Sadly, I can't say I'm completely surprised; these things do crop up in NAV from time to time. Hopefully MS will resolve this or make a change in a future version (I'm guessing this is caused at a fairly low level of the executable).

    So that really only leaves the back-door option of reading/writing the field directly in SQL, which may still cause corruption problems when the record is loaded into memory... yikes.
    Jeff Landeen - Sr. Consultant
    Epimatic Corp.

    http://www.epimatic.com