Linking Excel File and Editable

Weekleyba

Registered User.
Local time
Today, 17:25
Joined
Oct 10, 2013
Messages
586
I thought I had this figured out but... nope.
I have a small database that is split and used by multiple users for reports only. No input into forms.
The only input is from a single Excel file that is generated from another online program.
I want a way to import the new Excel file into the Access db and be able to replace it when a new Excel file is generated.

What is the best way to set this up?

My thought was to create a link to an Excel csv or xlsx file, using Import Excel or Import Text File.
Then save this import specification and run it via VBA with the click of a button on the main form.

The problem with the Excel csv file is that some of the columns have commas embedded in the cell values. This throws the parsing off and the data is useless.
The problem with the Excel xlsx file is that Access assigns the Number format to some fields that actually contain alphanumeric data when it imports them.
The problem with the Text File import is that I was not able to delete the records in the text file when I imported the new file.

What are some good ways of setting this up?
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 17:25
Joined
Feb 28, 2001
Messages
27,189
First, if I read this right, you are doing something that does a direct import to overlay a previously existing table. There is another path, though. You can define the table and import into an existing table for which the correct field & format information is already defined. That might handle at least forcing the right interpretation of your fields.

Second, that bit about importing a text file but being unable to delete the records makes me wonder if you tried to MAP the Excel text file rather than actually import it, because the "cannot delete" behavior is characteristic of an external Excel table.

Depending on the number of fields involved (or columns in the spreadsheet), but NOT depending on the number of rows, it might be better to go ahead and import things, but then have the REAL table as an actual Access table. Then write a delete query of the real contents. Then write an INSERT INTO query selecting the data in the appropriate formats - which means you do one Excel conversion per new file plus two more queries. Then you can manipulate the "raw" table by unmapping it.
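A minimal sketch of that delete-then-append pattern, with hypothetical table and field names (assuming the raw spreadsheet data has already been imported or linked as tblRawImport):

```vba
Private Sub RefreshRealTable()
    ' Sketch only: "tblAllowance" is the real Access table with the
    ' correct field types already defined; "tblRawImport" holds the
    ' freshly imported/linked spreadsheet data. Both names are made up.

    ' 1. Delete the real table's old contents.
    CurrentDb.Execute "DELETE * FROM tblAllowance", dbFailOnError

    ' 2. Re-fill it, forcing formats in the SELECT list as needed.
    CurrentDb.Execute _
        "INSERT INTO tblAllowance (ProjectID, Location, BAP) " & _
        "SELECT ProjectID, Location, CStr(BAP) FROM tblRawImport", _
        dbFailOnError
End Sub
```

The CStr() call is one way to force an alphanumeric interpretation of a column that Access would otherwise guess to be a Number.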

If this makes sense to you and you want to pursue it, you can ask for more advice, but I won't bore you if this doesn't make sense or sounds like something you don't want to do.
 

June7

AWF VIP
Local time
Today, 14:25
Joined
Mar 9, 2014
Messages
5,474
Is this Excel file holding cumulative data or only new records that you need to add into local table?

Do you want to link or import?
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 18:25
Joined
Feb 19, 2002
Messages
43,289
At some level it also matters whether one person will be responsible for importing from Excel. It is a lot more complicated when you are building something that will be handled by a non-technical person.

The simplest solution, assuming that you will always import the data, is to create a template BE. This database will contain only table definitions. Then, when you are ready to replace the BE, copy the template to your local drive and open it. Import the Excel data into the table. Then make sure that no one has the BE open. Delete it (or archive it). Copy the updated version to the server.

If the import is more complicated, you should probably start with linking the spreadsheet and then running append queries.
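As a rough sketch of that link-then-append approach (the object names and file name here are placeholders, not from the actual database):

```vba
' Link the spreadsheet as a table, append into the real table,
' then drop the link. Deleting a linked table removes only the
' link, not the Excel file itself.
DoCmd.TransferSpreadsheet acLink, acSpreadsheetTypeExcel12Xml, _
    "tblXlLink", "W:\DFM_Databases\SomeExport.xlsx", True

CurrentDb.Execute _
    "INSERT INTO tblAllowance SELECT * FROM tblXlLink", dbFailOnError

DoCmd.DeleteObject acTable, "tblXlLink"
```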

You can build a form that will automate all this and you probably should, just to help yourself.
 

Weekleyba

Registered User.
Local time
Today, 17:25
Joined
Oct 10, 2013
Messages
586
Thank you guys for all your input. I've been working diligently on this problem and I'm close.
Here is what I've come up with.

I download the data from the other program into an Excel xlsx file.
I then import that xlsx file into the back end database from within the back end database.
Of course I'd need to do this when no one is using the front end, which shouldn't be a problem for me.

So the back end looks like this with the following code on the "Import New Data" button.

[attached screenshot: 1662754410275.png]



Code:
Private Sub cmdImport_Click()
    Dim Msg1, Msg2, Msg3, Response1, Response2, Style1, Style2, Title1, Title2, Title3, Style3, Response3, fName As String
    
    fName = "W:\DFM_Databases\IHS - Allowance Status by Project and Location.xlsx"
    
    Msg1 = "Are you sure you want to IMPORT the Allowance Status Report?"
    Style1 = vbYesNo + vbCritical + vbDefaultButton2
    Title1 = "IMPORT"
    
    Msg2 = "Worksheet imported!"
    Style2 = vbOKOnly + vbInformation
    Title2 = "EXCEL FILE IMPORTED"
    
    Msg3 = "Cannot find the file:" & vbCrLf & "'IHS - Allowance Status by Project and Location.xlsx'" & vbCrLf & " " & vbCrLf & "Please make sure that the Excel xlsx file is named correctly and in the correct location."
    Style3 = vbOKOnly + vbCritical
    Title3 = "ERROR - NO FILE FOUND"
    
    Response1 = MsgBox(Msg1, Style1, Title1)
    If Response1 = vbNo Then
        Exit Sub
    ElseIf Response1 = vbYes Then
    
       'On Error GoTo SubError
    
       'This checks if the file exists.
       'FileExists is a public function shown below
          'Public Function FileExists(ByVal path_ As String) As Boolean
              'FileExists = (Len(Dir(path_)) > 0)
          'End Function
       Const workFolder As String = "W:\DFM_Databases\IHS - Allowance Status by Project and Location.xlsx"
       If FileExists(workFolder) Then
          Else
             Response3 = MsgBox(Msg3, Style3, Title3)
          Exit Sub
       End If
    
       ' This runs the saved Import under the External Data tab where I changed the data type for the BAP column to short text.
       DoCmd.RunSavedImportExport "Import-IHS - Allowance Status by Project and Location"
                  
       Response2 = MsgBox(Msg2, Style2, Title1)
    
   End If
End Sub

The front end is then just linked to the back end table that was imported. Namely, "IHS - Allowance Status by Project and Location".

It was working nicely and then.... when clicking the "Import New Data" button, it appears to close Access and brings up the screen below.
After 3 or 4 minutes, it reopens the database as if you had opened it from the file. Then when you click on either button, it opens a menu to save a backup of the database. See below.
If you save a backup, it then takes another 3 or 4 minutes to get back to the main form.

I've managed to compact and repair a few times but maybe there's something else going on that I'm not aware of.
Any advice?


[attached screenshot: 1662755366068.png]


[attached screenshot: 1662755517840.png]
 

Weekleyba

Registered User.
Local time
Today, 17:25
Joined
Oct 10, 2013
Messages
586
I should mention that the back end is on a separate server and not on my PC.
The speed is definitely slower on the server but I have other databases that work fine on it.
That said, it may be part of the issue. Not sure.
 

Gasman

Enthusiastic Amateur
Local time
Today, 23:25
Joined
Sep 21, 2011
Messages
14,310
That is Access crashing. I have that happening now on my Test DB when I try to create a form for example. :(
 

CarlettoFed

Member
Local time
Tomorrow, 00:25
Joined
Jun 10, 2020
Messages
119
Meanwhile, you can try to simulate the import by working with all the files on your computer.
If everything works correctly in that case, the problem is with the server.
However, if you provide the sample files we could also check what happens.
 

Gasman

Enthusiastic Amateur
Local time
Today, 23:25
Joined
Sep 21, 2011
Messages
14,310
BTW you are not dimming your variables correctly. :(
Only the last is actually a string type variable?
 

Weekleyba

Registered User.
Local time
Today, 17:25
Joined
Oct 10, 2013
Messages
586
BTW you are not dimming your variables correctly. :(
Only the last is actually a string type variable?
If I simply remove the "As String" at the end, does VBA consider them variables?
Would that cause any issues? It doesn't seem to in this case.
 

Weekleyba

Registered User.
Local time
Today, 17:25
Joined
Oct 10, 2013
Messages
586
There's some funny business going on here.
I copied the back end database from the server and placed it on my desktop.
It ran fine. Multiple attempts, no problem.
I then deleted the one on the server and copy pasted the one from the desktop to server.
Opening the one on the server now, it seems to work fine also. Not sure why?

One issue I do have is the database bloats.
After compacting, it's 24MB. Click the Import button, and now the database is 48MB. Click again and it's 60MB. Again and it's 87MB.
Compact it and it's back to 24MB.
Do I just live with that, or am I doing something wrong here?
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 17:25
Joined
Feb 28, 2001
Messages
27,189
If I simply remove the "As String" at the end, does VBA consider them variables?

Anything DIM'd is a variable. (Unless it is an array, in which case it is many variables.) Even a datatype of Object creates a variable - a pointer to the named object type. If you leave off the "As <datatype>" sub-clause, then it is a VARIANT data type, which is faceless. That means if you stored an incorrect data type, the variant won't complain - because it can take anything, even the wrong thing. I.e. no "Type Mismatch" errors - which is actually a bad condition because such error messages help with debugging.

EVERY variable name has to have an "As <datatype>" sub-clause individually. E.g. DIM X As LONG,Y,Z As String causes Y to be a VARIANT.
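To illustrate with the example above (a sketch, reusing the Msg names from post #5):

```vba
' X is a Long and Z is a String, but Y is an untyped Variant:
Dim X As Long, Y, Z As String

' Every name needs its own As clause to get an explicit type:
Dim Msg1 As String, Msg2 As String, Msg3 As String
```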
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 17:25
Joined
Feb 28, 2001
Messages
27,189
There's some funny business going on here.
I copied the back end database from the server and placed it on my desktop.
It ran fine. Multiple attempts, no problem.
I then deleted the one on the server and copy pasted the one from the desktop to server.
Opening the one on the server now it seems to work fine also. Not sure why???

One issue I do have is the database bloats.
After compacting, it's 24MB. Click the Import button, and now the database is 48MB. Click again and it's 60MB. Again and it's 87MB.
Compact it and it's back to 24MB.
Do I just live with that, or am I doing something wrong here?

You are doing something wrong, of sorts. It has to do with the operational design of your DB.

"Bloat" occurs in a database file when you delete something or update something. There is a complex explanation for it, having to do with "garbage collection" and the fact that scratchpad memory pools can be randomly allocated leading to fragmented memory. Also related to the idea that you can build an update transaction and then have to roll it back, which means that for a moment, you have two copies of some things, only one of which should remain when done. So you delete a whole table and replace it. The space of that section of the table you deleted cannot be reclaimed and reused without a manual Compact & Repair. In your case, every import eats 24 MB worth of table.

Take a really good look at Pat Hartman's post #4. Her idea is simple but I'll break it down for you. First, create a database with table structures, field definitions, etc. - but no data. Everything is structured but empty. It has also been C&R'd so no bloat. This is the "template" she mentioned.

Design things so that the data coming from the spreadsheet will go to its own special Access BE file. It IS possible to have more than one back-end, so you toss the imported spreadsheet into its own little private BE file. If you have persistent data that survives an import, you put that in a more permanent BE file that will retain its content.

When you have to do an import, delete the special BE file, do a file copy of the template file to create a new special BE file - because for native Access, mapping is done via file name. Now dynamically map the tables. Run that import to load the predefined table structure(s). Because you did not delete prior content, you didn't leave behind a fragmented memory. It counts as the "initial load" of that data. If you delete, regenerate by copy, and then load the (new) copy, you never have bloat.
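That delete / copy / reload cycle could be sketched like this (every path and name below is hypothetical):

```vba
' Hypothetical paths - adjust to the real template and BE locations.
Const TEMPLATE_BE As String = "W:\DFM_Databases\Template_BE.accdb"
Const IMPORT_BE As String = "W:\DFM_Databases\Import_BE.accdb"

Private Sub RebuildImportBE()
    Dim fso As Object
    Set fso = CreateObject("Scripting.FileSystemObject")

    ' 1. Discard the old BE and start fresh from the empty template.
    If fso.FileExists(IMPORT_BE) Then fso.DeleteFile IMPORT_BE
    fso.CopyFile TEMPLATE_BE, IMPORT_BE

    ' 2. Load the spreadsheet into the predefined, empty table.
    '    Because native Access maps its links by file name, the
    '    front end's existing links now resolve to the fresh copy,
    '    and the import appends through the link.
    DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel12Xml, _
        "tblAllowance", "W:\DFM_Databases\NewData.xlsx", True
End Sub
```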

The only complication is the actual import operation. You just don't want your users to accidentally trigger the erase/rebuild cycle every time they open the DB for their use.
 

Weekleyba

Registered User.
Local time
Today, 17:25
Joined
Oct 10, 2013
Messages
586
Anything DIM'd is a variable. (Unless it is an array, in which case it is many variables.) Even a datatype of Object creates a variable - a pointer to the named object type. If you leave off the "As <datatype>" sub-clause, then it is a VARIANT data type, which is faceless. That means if you stored an incorrect data type, the variant won't complain - because it can take anything, even the wrong thing. I.e. no "Type Mismatch" errors - which is actually a bad condition because such error messages help with debugging.

EVERY variable name has to have an "As <datatype>" sub-clause individually. E.g. DIM X As LONG,Y,Z As String causes Y to be a VARIANT.
Sorry. I meant to write variant. Thanks for the explanation. It stinks that you need to write the sub-clause individually.
 

Weekleyba

Registered User.
Local time
Today, 17:25
Joined
Oct 10, 2013
Messages
586
You are doing something wrong, of sorts. It has to do with the operational design of your DB.

"Bloat" occurs in a database file when you delete something or update something. There is a complex explanation for it, having to do with "garbage collection" and the fact that scratchpad memory pools can be randomly allocated leading to fragmented memory. Also related to the idea that you can build an update transaction and then have to roll it back, which means that for a moment, you have two copies of some things, only one of which should remain when done. So you delete a whole table and replace it. The space of that section of the table you deleted cannot be reclaimed and reused without a manual Compact & Repair. In your case, every import eats 24 MB worth of table.

Take a really good look at Pat Hartman's post #4. Her idea is simple but I'll break it down for you. First, create a database with table structures, field definitions, etc. - but no data. Everything is structured but empty. It has also been C&R'd so no bloat. This is the "template" she mentioned.

Design things so that the data coming from the spreadsheet will go to its own special Access BE file. It IS possible to have more than one back-end, so you toss the imported spreadsheet into its own little private BE file. If you have persistent data that survives an import, you put that in a more permanent BE file that will retain its content.

When you have to do an import, delete the special BE file, do a file copy of the template file to create a new special BE file - because for native Access, mapping is done via file name. Now dynamically map the tables. Run that import to load the predefined table structure(s). Because you did not delete prior content, you didn't leave behind a fragmented memory. It counts as the "initial load" of that data. If you delete, regenerate by copy, and then load the (new) copy, you never have bloat.

The only complication is the actual import operation. You just don't want your users to accidentally trigger the erase/rebuild cycle every time they open the DB for their use.
The import only goes to one table and that table, after multiple imports, remains at the same number of records. So, it's probably as you said, memory pool.
If in my code when I click the "Import New Data" button I check the file is there, and then delete the table with a delete query prior to creating the table, would that work?
I haven't tried that yet but will.
I'm guessing not, since both you and Pat recommend a template. I can totally do that but I'm stubbornly looking for a better long-term solution.
 

Weekleyba

Registered User.
Local time
Today, 17:25
Joined
Oct 10, 2013
Messages
586
First, if I read this right, you are doing something that does a direct import to overlay a previously existing table. There is another path, though. You can define the table and import into an existing table for which the correct field & format information is already defined. That might handle at least forcing the right interpretation of your fields.

Second, that bit about importing a text file but being unable to delete the records makes me wonder if you tried to MAP the Excel text file rather than actually import it, because the "cannot delete" behavior is characteristic of an external Excel table.

Depending on the number of fields involved (or columns in the spreadsheet), but NOT depending on the number of rows, it might be better to go ahead and import things, but then have the REAL table as an actual Access table. Then write a delete query of the real contents. Then write an INSERT INTO query selecting the data in the appropriate formats - which means you do one Excel conversion per new file plus two more queries. Then you can manipulate the "raw" table by unmapping it.

If this makes sense to you and you want to pursue it, you can ask for more advice, but I won't bore you if this doesn't make sense or sounds like something you don't want to do.
I think I'll try something like you suggest here. Couldn't I simply run the delete query prior to the DoCmd.RunSavedImportExport "Import-IHS - Allowance Status by Project and Location"? I need to try that.
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 17:25
Joined
Feb 28, 2001
Messages
27,189
Running a delete query is the direct cause of the bloat. SQL does not garbage-collect its memory space because to do so, it would have to run the equivalent of a C&R to find all of those deleted records. That "dynamic" C&R operation requires exclusive access to the DB. It is usually NOT the import operation that causes bloat - it is the preceding step to get old obsolete data out of the way while the DB is still running. If you had instead chosen to update your table from the new data by building VBA to read the Excel file line-by-line, you would ALSO get serious bloat.

Notice that earlier I didn't suggest your table structure was bad. I said you had a bad OPERATIONAL structure. The way you are dealing with that external data source, the way you have to operate on the data, is what I meant by that. By replacing the whole table, you make the entire prior contents of that table obsolete. The "straight-through" method of deleting what you don't want then importing what you do want is your problem. You need the subtlety of the "template" approach to not cause bloat. Since your discussion suggests that you are the only person who would do this, just doing it by hand would work. However, if you program it you would have more reliability.

Here is a starting point. Look up and carefully explore this topic: FileSystemObject.


That link gets you to the Microsoft "Docs" site. Using the tree-like structure on the left of that page, you can drill down to explore topics including how to do file copy operations and file delete operations programmatically. VBA and the "FSO" work together very well and in fact, for what you need, VERY simply.
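As a small, concrete example, the FileExists helper from post #5 could be rewritten on top of FSO (a sketch using late binding, so no library reference needs to be set):

```vba
' FSO-based replacement for the Dir()-based FileExists helper.
Public Function FileExists(ByVal path_ As String) As Boolean
    Dim fso As Object
    Set fso = CreateObject("Scripting.FileSystemObject")
    FileExists = fso.FileExists(path_)
End Function
```

The same object also exposes CopyFile and DeleteFile, which covers the file operations the template approach needs.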

I'm stubbornly looking for a better long term solution.

Actually I understand that, and have been known to be a "bull in a china shop" looking for a better way - and breaking a bunch of things along that path to the better way. Pat and I (as well as many others in the forum) have LOTS of experience with bloat. The problem with bloat isn't fixing it. It is fixing it in a way that doesn't disrupt others and/or in a way that doesn't tie up the database for a very long time trying to unravel everything and/or in a way that doesn't leave a mess behind you.

If you are a student of ancient history, you will have heard of Alexander the Great and his solution for untying the Gordian Knot. His solution was to take out his sword and cut it apart. Well, the C&R method is analogous. You don't untangle the blocks of deleted data. Instead, you make a new empty file and then extract everything you wanted to keep into that new file, then delete the old file in its entirety. (That IS the way a Compact & Repair operates!) The "Template" method is simply a way to selectively delete the old pile of obsolete data, create a new file, and re-tie the knot.
 

Weekleyba

Registered User.
Local time
Today, 17:25
Joined
Oct 10, 2013
Messages
586
Running a delete query is the direct cause of the bloat. SQL does not garbage-collect its memory space because to do so, it would have to run the equivalent of a C&R to find all of those deleted records. That "dynamic" C&R operation requires exclusive access to the DB. It is usually NOT the import operation that causes bloat - it is the preceding step to get old obsolete data out of the way while the DB is still running. If you had instead chosen to update your table from the new data by building VBA to read the Excel file line-by-line, you would ALSO get serious bloat.

Notice that earlier I didn't suggest your table structure was bad. I said you had a bad OPERATIONAL structure. The way you are dealing with that external data source, the way you have to operate on the data, is what I meant by that. By replacing the whole table, you make the entire prior contents of that table obsolete. The "straight-through" method of deleting what you don't want then importing what you do want is your problem. You need the subtlety of the "template" approach to not cause bloat. Since your discussion suggests that you are the only person who would do this, just doing it by hand would work. However, if you program it you would have more reliability.

Here is a starting point. Look up and carefully explore this topic: FileSystemObject.


That link gets you to the Microsoft "Docs" site. Using the tree-like structure on the left of that page, you can drill down to explore topics including how to do file copy operations and file delete operations programmatically. VBA and the "FSO" work together very well and in fact, for what you need, VERY simply.



Actually I understand that, and have been known to be a "bull in a china shop" looking for a better way - and breaking a bunch of things along that path to the better way. Pat and I (as well as many others in the forum) have LOTS of experience with bloat. The problem with bloat isn't fixing it. It is fixing it in a way that doesn't disrupt others and/or in a way that doesn't tie up the database for a very long time trying to unravel everything and/or in a way that doesn't leave a mess behind you.

If you are a student of ancient history, you will have heard of Alexander the Great and his solution for untying the Gordian Knot. His solution was to take out his sword and cut it apart. Well, the C&R method is analogous. You don't untangle the blocks of deleted data. Instead, you make a new empty file and then extract everything you wanted to keep into that new file, then delete the old file in its entirety. (That IS the way a Compact & Repair operates!) The "Template" method is simply a way to selectively delete the old pile of obsolete data, create a new file, and re-tie the knot.
Thanks for the good explanation. I will dig into the FileSystemObject link.
I'm just surprised that Excel and Access don't have a better solution for this.
 

Gasman

Enthusiastic Amateur
Local time
Today, 23:25
Joined
Sep 21, 2011
Messages
14,310
Nothing to do with Excel?
The data could just as easily come from AS400 output, for instance.
That is just the way Access works. Best get used to it. :)
 

CJ_London

Super Moderator
Staff member
Local time
Today, 23:25
Joined
Feb 19, 2013
Messages
16,616
Just a small word from me - you will get fewer issues importing a text file rather than an Excel file. And a text file opened and then saved in Excel can create even more problems.

Not clear whether the download is text or Excel.
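If the download can be saved as a delimited text file, a saved import specification pins each column's data type so Access does not guess; run from VBA it looks roughly like this (the spec, table, and file names are invented):

```vba
' Import a delimited text file using a saved import specification.
DoCmd.TransferText acImportDelim, "AllowanceImportSpec", _
    "tblAllowance", "W:\DFM_Databases\NewData.txt", True
```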
 
