Currently, I have an Excel sheet that I import using the code below.
This Excel sheet has 16 columns and about 52,000 rows, and it does have a unique ID. About 10–20 rows are added five days a week, and 20–50 existing rows can have between 1 and 12 columns updated daily. (The data comes from a separate system, and I currently don’t have a way to make an ODBC connection to that system.) This table is the primary table of the database.
Screenshot (Fake Data)
When we update the Excel sheet, it will contain the original 52,000 rows plus any new rows and the rows that have updates.
Currently, I am deleting the table and then importing a new Excel file. This is working; I am just wondering if it is the best way to handle it, or whether I should be doing an append and update instead. I have 11 other tables that use the RowID as a foreign key, but I noticed that when I delete the table “all_matters”, it deletes the relationships. I am also worried that once I have multiple users, this could cause problems. I was thinking about importing the Excel file into a table “all_matters1” and then doing an update and append to “all_matters”. Honestly, I am not sure of the best way to handle this, and I would appreciate any advice.
Also, while we are talking about design: I am thinking about splitting my database into a front end and a back end, but I have a question. If I split it, is there a way to make a copy that can be put on a laptop for offsite use, with the data then merged back into the main database? (This would need to be easy, because it would be done quite frequently.) If so, how would I do this?
Code:
' Re-import the whole sheet into all_matters (True = first row contains field names).
' Note: acSpreadsheetTypeExcel12 targets .xlsx files; for a .xls file, acSpreadsheetTypeExcel9 matches the format.
DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel12, "all_matters", "R:\Close Files\accessimport.xls", True
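One way to sketch the staging-table idea from the post (import into "all_matters1", then update and append into "all_matters") is below. This is only a sketch under stated assumptions: it assumes the key column is named RowID, that the staging table all_matters1 has the same layout as all_matters, and the column names Field1/Field2 are placeholders for the real 16 imported columns — they do not come from the original post.

```vba
' Sketch: load the spreadsheet into a staging table, then upsert into the main table.
' Because all_matters itself is never deleted, its relationships on RowID survive.
Sub ImportAndUpsert()
    ' 1. Clear and reload the staging table from the spreadsheet.
    CurrentDb.Execute "DELETE FROM all_matters1", dbFailOnError
    DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel12, _
        "all_matters1", "R:\Close Files\accessimport.xls", True

    ' 2. Update rows that already exist, matched on RowID.
    '    (Field1, Field2 are placeholders -- list every imported column here.)
    CurrentDb.Execute _
        "UPDATE all_matters AS m INNER JOIN all_matters1 AS s ON m.RowID = s.RowID " & _
        "SET m.Field1 = s.Field1, m.Field2 = s.Field2", dbFailOnError

    ' 3. Append rows whose RowID is not yet in the main table.
    CurrentDb.Execute _
        "INSERT INTO all_matters SELECT s.* FROM all_matters1 AS s " & _
        "LEFT JOIN all_matters AS m ON s.RowID = m.RowID " & _
        "WHERE m.RowID IS NULL", dbFailOnError
End Sub
```

With this approach the main table is only briefly locked during the UPDATE and INSERT, which should behave better with multiple users than dropping and recreating the table on every import.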