Error when using ADO to insert blob into SQL Server linked table

harmankardon

Registered User.
Local time
Today, 06:03
Joined
Aug 8, 2011
Messages
71
Hi all,

I am trying to insert a blob into a SQL Server linked table field, and with files over roughly 150MB I get the following error:

"not enough storage is available to complete this operation."

The SQL Server field is varbinary(max) datatype so it should be able to handle up to 2GB.

So far I've tried ADO AppendChunk using an ADO Recordset, and I've also tried doing INSERT INTO using ADO Command and parameter of type adVarBinary (using ADO Stream.Read to load binary data into parameter).

Both methods work great with smaller files, but anything around 150MB or larger fails with the error mentioned above.

I'm convinced this is some internal issue with ADO (tried both 2.8 and 6.1 libraries), but I'm not sure. I'm using a powerhouse of a computer and the server has plenty of resources so I doubt it's a hardware issue.

Has anyone been able to insert a blob larger than 150MB from an Access front-end into SQL Server?

EDIT: I mistakenly put "linked table" in the title. I am not using a linked table for this, I am creating a separate ADO connection directly to the SQL Server database from the MS Access front-end.
 

Minty

AWF VIP
Local time
Today, 09:03
Joined
Jul 26, 2013
Messages
10,355
What is your SQL Server compatibility level set to?
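(For anyone following along: the current level can be checked from any query window on SQL Server 2005 or later. The database name below is a placeholder; substitute your own.)

```sql
-- Check a database's compatibility level (TestDatabase is a placeholder name)
SELECT name, compatibility_level
FROM sys.databases
WHERE name = 'TestDatabase';
-- 80 = SQL Server 2000, 90 = 2005, 100 = 2008
```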
 

harmankardon

Registered User.
Local time
Today, 06:03
Joined
Aug 8, 2011
Messages
71
Hi Minty, it is currently set to "SQL Server 2000 (80)"
 

arnelgp

..forever waiting... waiting for jellybean!
Local time
Today, 17:03
Joined
May 7, 2009
Messages
19,170
this was from a discussion i copied on the net; i hope it sheds some light on what to do and what field type to use in the SQL back end.
Code:
'Verified Answer
'by Nicobo, posted 2011-04-18, ID: 35414889
'
'I created a table in SQL Server with columns ID, Description (varchar), FS (filestream).
'Then I created this module in an Access database. (Set a reference to 'Microsoft ActiveX
'Data Objects 2.8' for this code to work):

Sub AddFile(strDescription As String, strFilename As String)
    Dim con As New ADODB.Connection
    Dim rs As New ADODB.Recordset
    Dim objStream As New ADODB.Stream
    
    'create the stream
    objStream.Type = adTypeBinary
    objStream.Open
    objStream.LoadFromFile strFilename
    
    con.Open "PROVIDER=SQLOLEDB.1;PERSIST SECURITY INFO=FALSE;INITIAL CATALOG=FS;DATA SOURCE=(local)\sqlexpress2008r2;INTEGRATED SECURITY=SSPI"
    rs.Open "Select * from FSTest", con, adOpenDynamic, adLockOptimistic
    rs.AddNew
    rs!Description = strDescription
    rs!FS = objStream.Read
    rs.Update
    rs.Close
    objStream.Close
    con.Close
End Sub

Sub ReadFile(strDescription As String, strFilename As String)
    Dim con As New ADODB.Connection
    Dim rs As New ADODB.Recordset
    Dim objStream As New ADODB.Stream
    
    'create the stream
    objStream.Type = adTypeBinary
    objStream.Open
    
    con.Open "PROVIDER=SQLOLEDB.1;PERSIST SECURITY INFO=FALSE;INITIAL CATALOG=FS;DATA SOURCE=(local)\sqlexpress2008r2;INTEGRATED SECURITY=SSPI"
    rs.Open "Select * from FSTest where Description = '" & strDescription & "'", con, adOpenDynamic, adLockOptimistic
    If Not rs.EOF Then
        objStream.Write rs!FS
        objStream.SaveToFile strFilename, adSaveCreateOverWrite
        objStream.Close
    End If
    rs.Close
    con.Close
End Sub


'Now I can use this statement to store a file  in the table:
'AddFile "Test1","G:\Data\EE\Test.zip"
'And this statement to retrieve the file from the table:
'ReadFile "Test1","G:\Data\EE\Test2.zip"
 

harmankardon

Registered User.
Local time
Today, 06:03
Joined
Aug 8, 2011
Messages
71
Hey arnelgp, thanks for sharing. I already have code that works great for files under 150MB. One of my test functions looks pretty much exactly like yours, the only notable difference is that in my connection string I have:

Code:
Driver=SQL Server

and you have:

Code:
PROVIDER=SQLOLEDB.1

Could you tell me what size files you've uploaded with your code?
 

arnelgp

..forever waiting... waiting for jellybean!
Local time
Today, 17:03
Joined
May 7, 2009
Messages
19,170
I don't know, I haven't tried anything else yet; I just shared the post.
 

Minty

AWF VIP
Local time
Today, 09:03
Joined
Jul 26, 2013
Messages
10,355
Hi Minty, it is currently set to "SQL Server 2000 (80)"

Try changing it to 120; the current setting will restrict the max size to either 4000 or 8000 bytes. The varbinary(max) 2 GB limit was introduced in SQL Server 2005.

2000 doesn't have the same compatibility.
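The change itself is a one-liner; a sketch assuming a database named TestDatabase (on a 2008 instance the ceiling is 100 rather than 120):

```sql
-- Raise the compatibility level (TestDatabase is a placeholder name).
ALTER DATABASE TestDatabase SET COMPATIBILITY_LEVEL = 100;
```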
 

harmankardon

Registered User.
Local time
Today, 06:03
Joined
Aug 8, 2011
Messages
71
I tried changing the compatibility to SQL Server 2008 (100) as this was the highest I could go (I'm running SQL Server 2008). It did not seem to make a difference unfortunately.

I was finally able to have some success by changing some client-side settings, but I don't really understand why it works now.

Here's a snippet of the code that is working:

Code:
Dim cn As New ADODB.Connection
Dim rs As New ADODB.Recordset

cn.ConnectionString = "Provider=sqloledb.1;Server=SQL2008\TESTENVIRO;Database=TestDatabase;Integrated Security=SSPI;"
cn.Open

'Loop through each file selected (this snippet runs inside a With
'block on a FileDialog, hence the bare .SelectedItems)
For Each varFile In .SelectedItems

    'Hour glass
    Screen.MousePointer = 11

    strFilePath = varFile
    strFileName = GetFilenameFromPath(strFilePath)

    With rs
        .ActiveConnection = cn
        .Source = "SELECT * FROM [MDL_UNCTRLD_BLOB_VIEW] WHERE Blob_ID = 0"
        .CursorType = adOpenForwardOnly
        .LockType = adLockOptimistic
        .CursorLocation = adUseClient
        .Open
    End With

    'Add new record with our values
    rs.AddNew
        rs.Fields(strParentIdField).Value = lngParentID
        rs.Fields("FileName").Value = strFileName
        rs.Fields("FileDateTime").Value = Now()

        Call FileToBlob(rs.Fields(strBlobFieldName), strFilePath)

        If Not IsMissing(blobType) Then
            rs.Fields("Blob_Type") = blobType
        End If

    rs.Update

    'Get new Blob ID in case this is a file replacement
    newReplaceID = rs.Fields("Blob_ID")
    rs.Close
    DoEvents
Next

And here is the FileToBlob function:

Code:
Public Sub FileToBlob(fld As ADODB.Field, strFilePath As String, Optional ChunkSize As Long = 1048576)
    Dim fnum As Integer, bytesLeft As Long, bytes As Long
    Dim tmp() As Byte
    Dim RetVal As Variant
    Dim lngFileLen As Long
    
    Screen.MousePointer = 11
    
    ' Raise an error if the field doesn't support AppendChunk.
    If (fld.Attributes And adFldLong) = 0 Then
        Err.Raise 1001, , "Field doesn't support the AppendChunk method."
    End If
    
    ' Open the file; raise an error if the file doesn't exist.
    If Dir$(strFilePath) = "" Then Err.Raise 53, , "File not found"
    
    fnum = FreeFile
    Open strFilePath For Binary As #fnum
    On Error GoTo CloseFile
    
    ' Read the file in chunks, and append data to the field.
    ' Note: this overrides the optional ChunkSize argument so the
    ' progress meter advances in roughly 1% steps.
    lngFileLen = FileLen(strFilePath)
    ChunkSize = lngFileLen / 100
    
    RetVal = SysCmd(acSysCmdInitMeter, _
        "Uploading file: " & strFilePath, lngFileLen)
    
    bytesLeft = LOF(fnum)
    
    Do While bytesLeft > 0
        bytes = bytesLeft
        If bytes > ChunkSize Then bytes = ChunkSize
        ReDim tmp(1 To bytes) As Byte
        Get #fnum, , tmp
        fld.AppendChunk tmp
        RetVal = SysCmd(acSysCmdUpdateMeter, lngFileLen - bytesLeft)
        DoEvents
        bytesLeft = bytesLeft - bytes
    Loop
    
    RetVal = SysCmd(acSysCmdRemoveMeter)
    
ExitProcedure:
    Screen.MousePointer = 0
    Close #fnum
    Exit Sub
    
CloseFile:
    MsgBox "Could not finish file to blob.", vbOKOnly + vbExclamation, "Test Database"
    Resume ExitProcedure
    
End Sub

What I changed to make this work was the following:

Originally the connection string was
Code:
"Driver=SQL Server;Server=SQL2008\TESTENVIRO;Database=TestDatabase;Trusted_Connection=True;"

then I changed it to this:

Code:
"Provider=sqloledb.1;Server=SQL2008\TESTENVIRO;Database=TestDatabase;Integrated Security=SSPI;"

And I also changed the ADODB Recordset cursor location from
Code:
.CursorLocation = adUseServer
to
Code:
.CursorLocation = adUseClient

This was interesting because adUseClient seemed to fail anytime I tried it with "Driver=SQL Server" in the connection string, but it seems to work very well if I replace that part of the connection string with "Provider=sqloledb.1" (thanks to arnelgp for the inspiration even though I have no idea why it works!).

If anyone is interested, I can post the methods that failed (using the ADODB Stream object, using the ADODB Command object with parameters, etc.).

I really don't like using adUseClient (especially because it works so horribly with any of the SQL Server ODBC drivers), but I can't afford to be picky.

I haven't taken the time to really clean up this code yet, so I apologize for the mess. I originally wrote this code as a student back in 2011 (I actually found posts from myself on this exact same issue back then) and have been working as an embedded C developer since, but this code is still in use daily by quite a few people in the company.
 

Minty

AWF VIP
Local time
Today, 09:03
Joined
Jul 26, 2013
Messages
10,355
Interesting. I suspect the ODBC SQL driver is the issue.
I think I recently found that certain ADO functions didn't work or were "flaky" with the original {SQL Server} driver, and that installing and switching to

DRIVER={SQL Server Native Client 11.0};

seemed to work better.
 

harmankardon

Registered User.
Local time
Today, 06:03
Joined
Aug 8, 2011
Messages
71
Well unfortunately I ran into an issue. Although this code can handle larger files, it seems that using the sqloledb provider instead of the SQL Server driver causes issues with tables vs views in SQL Server.

I use views to implement permissions in SQL Server. The underlying table will be locked out for regular users, but I have views of that table that only return a subset of records and that view will have edit/add permissions to allow users to edit only those records or add new ones.
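In T-SQL terms that model amounts to something like the following sketch (the base-table name and the AppUsers role are made up for illustration; only the view name appears elsewhere in the thread):

```sql
-- Regular users are locked out of the base table...
DENY SELECT, INSERT, UPDATE, DELETE ON dbo.MDL_UNCTRLD_BLOB TO AppUsers;

-- ...and instead get edit/add rights on a view exposing a subset of rows.
GRANT SELECT, INSERT, UPDATE ON dbo.MDL_UNCTRLD_BLOB_VIEW TO AppUsers;
```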

This code of mine is supposed to write the blob to the view rather than the underlying table, which works fine with the SQL Server (native) driver; but when I switch to the sqloledb provider, it tries to write directly to the underlying table even though the recordset source is set to the view (confirmed via debugging).

Arg.
 

harmankardon

Registered User.
Local time
Today, 06:03
Joined
Aug 8, 2011
Messages
71
This problem is still haunting me, so I'm going to bump it in the hope someone else has had a similar experience.

The problem: either I can't upload blobs over 150MB from the Access front-end to SQL Server via ADO but at least the code handles views vs. tables correctly (preserving my permissions model), or I can upload blobs over 150MB but the method doesn't understand views and tries to write to the underlying table (thus breaking my permissions model).
 

Minty

AWF VIP
Local time
Today, 09:03
Joined
Jul 26, 2013
Messages
10,355
Total shot in the dark but have you tried using different permission levels ?
 

harmankardon

Registered User.
Local time
Today, 06:03
Joined
Aug 8, 2011
Messages
71
Total shot in the dark but have you tried using different permission levels ?

Poor Minty, I feel like I've been bugging you about this for a long time lol.

I have tried different permission levels, no luck. I'm starting to think there's really no hope unless I A) Stop using MS Access and build my own app or B) Try using a different version of SQL Server.

Neither of those are viable options unfortunately.

I think I can work around the table/view issue with SQLOLEDB.1 for now, but it is frustrating that everything about this feels like a hack. I'm an embedded C developer by trade and dealing with all this Microsoft stuff reminds me why I made that career choice :)
 

Minty

AWF VIP
Local time
Today, 09:03
Joined
Jul 26, 2013
Messages
10,355
No problem. I wouldn't call myself a SQL expert by any stretch, but I do use it daily, and the versioning has bitten me on the backside a few times.

This is someone else's SQL Server, I guess, so you can't upgrade using a later Express version?

What about a total hack, like writing the blob to a temp SQL table where the permissions issue isn't a problem, then moving it from that temp table to the correct place?
(Apologies for the "hacky" nature of this suggestion - I've just had to bodge something similar to get round a silly permission issue....)
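A sketch of that staging-table idea in T-SQL (all object and column names here are hypothetical): users insert into a table they have rights on, and a stored procedure they can execute moves the row into the locked-down table via ownership chaining.

```sql
-- Staging table the Access front-end may insert into directly.
CREATE TABLE dbo.Blob_Staging (
    Staging_ID int IDENTITY PRIMARY KEY,
    FileName   varchar(255),
    Blob_Data  varbinary(max)
);
GO

-- Users get EXECUTE on this procedure; ownership chaining lets it
-- write to the base table they cannot touch directly.
CREATE PROCEDURE dbo.Promote_Blob @StagingID int
AS
BEGIN
    INSERT INTO dbo.MDL_UNCTRLD_BLOB (FileName, Blob_Data)
    SELECT FileName, Blob_Data
    FROM dbo.Blob_Staging
    WHERE Staging_ID = @StagingID;

    DELETE FROM dbo.Blob_Staging
    WHERE Staging_ID = @StagingID;
END;
```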
 

harmankardon

Registered User.
Local time
Today, 06:03
Joined
Aug 8, 2011
Messages
71
No problem, I wouldn't call myself a SQL expert by any stretch, but do use it daily, and the versoning has bitten me on the backside a few times.

Is this someone else's SQL Server I guess, you can't upgrade using a later express version ?

What about a total hack, like writing the blob to a temp SQL Table that the permissions issue isn't a problem, then moving from that temp table to the correct place?
(Apologies for the "hacky" nature of this suggestion - I've just had to bodge something similar to get round a silly permission issue....)

Yeah, a shared server unfortunately.

No need to apologize for the hack, I am definitely not above hacking! That is actually a pretty good idea, going to look into that, thanks for the inspiration.

Someday I dream of getting away from storing blobs in the database entirely and just keeping links to the file system, or maybe upgrading SQL Server and trying that new FileStream/FileTable stuff I've read about.
 

Minty

AWF VIP
Local time
Today, 09:03
Joined
Jul 26, 2013
Messages
10,355
Good Luck.
Filestream is pretty cool, although I've only played with it briefly for a signature-capture thing I was working on.
 

Mark_

Longboard on the internet
Local time
Today, 02:03
Joined
Sep 12, 2017
Messages
2,111
Very odd query.

What is the blob you are trying to save? Is it possible to save it to disk and keep a file path instead?
 

sonic8

AWF VIP
Local time
Today, 10:03
Joined
Oct 27, 2015
Messages
998
... but the method does not seem to understand views and tries to write to the underlying table (thus breaking my permissions model).
Use the WITH VIEW_METADATA option when creating the view. This should fix that part of the issue.
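For reference, a sketch of what that option looks like, reusing the view name from earlier in the thread (the column list and the blob column name are guesses, and the subset predicate is omitted):

```sql
-- VIEW_METADATA makes SQL Server return metadata describing the view
-- (not the base table) to client APIs such as ADO/OLE DB.
ALTER VIEW dbo.MDL_UNCTRLD_BLOB_VIEW
WITH VIEW_METADATA
AS
    SELECT Blob_ID, FileName, FileDateTime, Blob_Type, Blob_Data
    FROM dbo.MDL_UNCTRLD_BLOB;
    -- plus the per-user subset predicate from the original view
```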
 

harmankardon

Registered User.
Local time
Today, 06:03
Joined
Aug 8, 2011
Messages
71
Very odd query,

What is the blob you are trying to save? Is it possible to save to disk and keep a file path instead?

Most of them are large design files from a 3D CAD system. If I had the time I would definitely switch to the method you mentioned.
 

harmankardon

Registered User.
Local time
Today, 06:03
Joined
Aug 8, 2011
Messages
71
Use the WITH VIEW_METADATA option when creating the view. This should fix that part of the issue.

Hmm, this looks very promising and seems to explain exactly the trouble I'm having when I use the SQLOLEDB driver. Thanks for this tip, going to give this a shot today hopefully.
 
