OpenQuery and TransferSpreadsheet?

Gasman

Hi all,
Trying to help a member on another forum, who is complaining about slowness when creating 10 sheets in an Excel file for over 300 customers.
First he opens the query, then runs TransferSpreadsheet.

I do not know if the Transfer takes into account that the query supporting it is already open, or whether it opens a new copy?

Is anyone able to advise to that please?
 
I open an Excel object, use CopyFromRecordset, then use Excel automation to format the worksheet rows and columns and add functionality such as filtering. This repeats for 3 further worksheets, then saves. The largest worksheet has around 5,000 rows.
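For anyone following along, here is a minimal sketch of that approach. The query name, file path and formatting calls are illustrative assumptions, not the actual app:

```vba
' Sketch of the CopyFromRecordset approach described above.
' "qryCustomers" and the output path are hypothetical examples.
Private Sub ExportWithCopyFromRecordset()
    Dim xl As Object, wb As Object, ws As Object
    Dim rs As DAO.Recordset
    Dim i As Integer

    Set xl = CreateObject("Excel.Application")
    Set wb = xl.Workbooks.Add
    Set ws = wb.Worksheets(1)

    Set rs = CurrentDb.OpenRecordset("qryCustomers", dbOpenSnapshot)

    ' Write the field names as a header row
    For i = 0 To rs.Fields.Count - 1
        ws.Cells(1, i + 1).Value = rs.Fields(i).Name
    Next i

    ' Dump the whole recordset in one call - far faster than cell-by-cell
    ws.Range("A2").CopyFromRecordset rs

    ' Example formatting/functionality, as mentioned above
    ws.Rows(1).AutoFilter
    ws.Columns.AutoFit

    wb.SaveAs CurrentProject.Path & "\Customers.xlsx", 51   ' 51 = xlOpenXMLWorkbook
    wb.Close False
    xl.Quit

    rs.Close
    Set rs = Nothing
    Set ws = Nothing: Set wb = Nothing: Set xl = Nothing
End Sub
```

Repeat the CopyFromRecordset/format section per worksheet before the SaveAs for a multi-sheet workbook.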

Takes less than 10 seconds from pressing the button to saving the file.
 
Why are they opening query?
How many records for 300 customers?

Does Transfer pull from open object? I don't think so.
 
Why are they opening query?
How many records for 300 customers?

Does Transfer pull from open object? I don't think so.
No idea TBH. I have told them that the Transfer will do that, and then mentioned the above subject, but said I am not sure about it, which is why I asked here.
 
First he is opening the query then TransferSpreadsheet
No need to open the query - TransferSpreadsheet does that anyway.

Not clear if the requirement is for 300 x 10-sheet workbooks or one workbook with 10 sheets x 300 rows. If the former, my method would probably take around 10 minutes; the latter, around 15 seconds.
 
Currently it takes approx 1 minute per file to generate. Very acceptable!! :)

Unfortunately, after around 500-ish files are generated, the process tends to start lagging, taking 2, 3, 4, sometimes 5 minutes a file to generate. Before I just replace the workstation with a possibly faster processor and try that, I am wondering if the current commands I am using are still "up to date", or if there is a better / more efficient way to do this.

I am basically using DoCmd.OpenQuery and DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel8 commands to accomplish this.

I do have Sleep commands between each OpenQuery command. So after running the Customer Address query, capturing that data and exporting to the template, Sleep (2000), and then I start running the Customer Purchases query, for example. I believe I added this after getting a "customer id template.xls file already in use"-type message in the past.
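Since TransferSpreadsheet reads the query's data itself, a sketch of that export with the OpenQuery and Sleep calls simply dropped might look like this. The query and file names are made up, and this also swaps the old acSpreadsheetTypeExcel8 format for acSpreadsheetTypeExcel12Xml (.xlsx):

```vba
' Sketch: export two queries to one workbook (one sheet per query)
' without opening the queries first. Names are illustrative.
Private Sub ExportCustomerFile(ByVal strFile As String)
    DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel12Xml, _
        "qryCustomerAddress", strFile, True
    DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel12Xml, _
        "qryCustomerPurchases", strFile, True
End Sub
```

Each TransferSpreadsheet call adds (or replaces) a sheet named after its query, so no Sleep between calls should be needed.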
 
All relatively slow methods, plus Excel8? And a 2-second Sleep?

Slowing down after 500-odd reports? Might be due to poor coding, an old VBA version, or hardware issues. I wouldn't like to speculate beyond this: the added Sleep simply masks the issue.
 
I've experienced a slowdown with repeated calls to Excel, despite lots of effort to force it to "tidy up".
I can't honestly say I've tried to create 300 workbooks in one hit though.

I do produce time sheets for up to about 100 staff in one app, and that gradually slows down to a snails pace if you try and do more than about 50 of them in one go.

There is a definite improvement in performance with a freshly rebooted PC, so there are obviously resources not being freed up somewhere. I've had similar experiences with PowerPoint file creation and manipulation.
 
I've created about 30 spreadsheets in one export to produce a separate commission list for each sales person. Each export had about 40 rows. It probably took a minute or so to run.

I'm going to guess that 300 could be slow using TransferSpreadsheet, since each time you run the TransferSpreadsheet you are opening an instance of Excel. That is what is taking the time - opening and closing Excel. For 300, I might try opening Excel once, then use a loop that creates a new workbook, runs the CopyFromRecordset and closes the workbook. This should take less time for each workbook than also opening/closing Excel. Please let us know the result.
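A rough sketch of that loop, under the assumption of a Customers table and a data query filterable by CustomerID (all names here are hypothetical, and the header row is omitted for brevity):

```vba
' Sketch of the suggestion above: start Excel ONCE, then loop,
' creating and closing one workbook per customer.
Private Sub ExportAllWorkbooks()
    Dim xl As Object, wb As Object
    Dim rsCust As DAO.Recordset, rsData As DAO.Recordset

    Set xl = CreateObject("Excel.Application")
    Set rsCust = CurrentDb.OpenRecordset( _
        "SELECT CustomerID FROM Customers", dbOpenSnapshot)

    Do While Not rsCust.EOF
        Set rsData = CurrentDb.OpenRecordset( _
            "SELECT * FROM qryCustomerData WHERE CustomerID = " & _
            rsCust!CustomerID, dbOpenSnapshot)

        Set wb = xl.Workbooks.Add
        wb.Worksheets(1).Range("A2").CopyFromRecordset rsData
        wb.SaveAs CurrentProject.Path & "\Cust_" & rsCust!CustomerID & ".xlsx", 51
        wb.Close False

        rsData.Close
        rsCust.MoveNext
    Loop

    xl.Quit
    rsCust.Close
    Set wb = Nothing: Set xl = Nothing
End Sub
```

The one-time CreateObject/Quit pair is the point: only the cheap Workbooks.Add/Close happens per file.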
 
Thank you Pat. I have suggested CopyFromRecordset. The OP seems reluctant to do so, as it involves Excel automation.
I will post your comments and see if I can persuade them.
 
Perhaps the variant via ADODB runs faster.

Tested with Northwind 2: 29 tables ... export time: ~1 sec
Code:
Private Sub ExportTablesToExcelWithADODB()

    Dim ExcelFile As String
 
    Dim ConnectionString As String
    Dim cnn As ADODB.Connection
    Dim InsertIntoSql As String
 
    ExcelFile = CurrentProject.Path & "\TestFile" & Format(Now(), "_yymmddhhnnss") & ".xlsx"
    ConnectionString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" & ExcelFile & ";Extended Properties=""Excel 12.0 Xml;HDR=YES"";"

    Set cnn = New ADODB.Connection
    cnn.Open ConnectionString
 
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef
 
    Set db = CurrentDb
    For Each tdf In db.TableDefs
        If (tdf.Attributes And dbSystemObject) = 0 Then
            InsertIntoSql = "Select * INTO [" & tdf.Name & "] FROM [MS Access;DATABASE=" & CurrentProject.FullName & "].[" & tdf.Name & "]"
            cnn.Execute InsertIntoSql
        End If
    Next

    cnn.Close

End Sub

/edit: loop 300x => export time: 2 min 20 sec ;)
 
Thank you @Josef P. I will pass it on.
Not sure if anyone here is on that forum, but here is the link regardless. Saves me being the middleman? :)

Not sure if that is going to give a sheet for each query though?
 
Quick update. The queries are make-table queries for some reason? No idea why; trying to find out.
So the transfers were not from queries but from tables with the same names as the queries. :(
 
It is quite likely that the OP you are helping doesn't know that queries and tables are interchangeable in most cases. This being an important one. Not only is he incurring the overhead of the open/close of Excel but of the make table as well, not to mention all the bloat.
 
Not clear if the requirement is for 300 x 10 sheet workbooks or one workbook with 10 sheets x 300 rows.

The former.

So... creating 300 WORKBOOKS - each of which is a distinct file. They are created in a loop, so one file at a time. Are the files created locally or on a shared drive? For each file you do these things in order:

1. Check the destination folder for permissions to diddle with the folder and its contents.
2. Check the destination folder for name "collision" (even though I'm sure there won't be.)
3. Tentatively create a temporary file with the 10 worksheets. Prepare it. Finalize it.
4. Rename the temp file to its final name.
5. Wash, rinse, repeat.

The problem in #3 is that you have to allocate a file handle in order for the file system to create a file. Excel has to ask Windows for a new file handle. At this point, if you have opened and closed enough files, Windows garbage collection kicks in because you CAN re-use a file handle - but Windows tries to give you a new handle until you run out of the pre-allocated set (pool) of handles.

Here's the ugly part... if the file is local to the machine running Excel, file management runs at memory-bus speeds. If the file is on a shared folder, the machine hosting the shared folder is the manager, and all file management runs at NETWORK speeds. Even with Gigabit Ethernet, it is still a serial communications operation, whereas local memory uses a parallel bus. So even if you had a 1 GHz CPU and 1 GHz Ethernet, the network is a minimum of 8 times slower, and might be even worse. And each of my steps 1, 2, 3, and 4 would be affected by network vs. local speed issues.
 