VBA Code to run when a new record is inserted into a Table

kmanickam

I want to create a VBA program that monitors Table 1 and, when a new record is added to it, runs a query or VBA code that takes the newly inserted record from Table 1 and inserts it into Table 2.

I want this VBA program to run constantly, so that it keeps checking Table 1 and, whenever a new record is added, runs the VBA code to insert the newly added record from Table 1 into Table 2.

Any code example on how to do this would be greatly appreciated.
 
Although this could easily be done, I am compelled to ask why?
 
Because we are currently working in a sub-optimal environment with poor network performance, and we are building a potential workaround: instead of having users download data from a 'network' table, they would work with local data, thereby avoiding server/network traffic.

Any help in implementing the solution I outlined in my post would be greatly appreciated.
 
Not trying to be awkward, but the problem with this approach is having multiple users each using their own copy of this table and updating the records.
How are you going to ensure the 'correct' data is saved back to the network?
 
With the current setup there is one source database, and when a user logs on to our other database it reads their computer username and their task. It then downloads the data associated with their task from the source database into a table located in the other database. While this occurs, other users are prevented from using the database because it is locked. After a user completes all their updates, the updated data is transferred from the local table back to the source database.

We've got this part worked out. The only part we don't have worked out is VBA code that checks for updates to the source table and runs an update query (or VBA code) to insert the most recently updated data from the source table into the local table for user editing/updating.
 
Ridders' post is spot on about what can go wrong with this approach. I had a similar situation with people who traveled and had to synchronize when they returned.

It was not that difficult because we had our BE on SharePoint. There were a few conflicts that I had to resolve, but it was not that daunting a task.

Give this link a read; there may be something in there you could use...

https://support.office.com/en-us/ar...database-03822632-da43-4d8f-ba2a-68da245a0446
 
I have to say that your proposed solution of caching stuff locally for updates and then trying to re-synchronize with the back end is even more sub-optimal than having network issues. You are essentially bypassing those features of Access that would protect you and would eliminate your need to lock the table for all users except one. I have worked with a miserably slow back end that was across a reliable but very slow network and I know I would FAR prefer that case to having an unstable network.

Therefore, one part of my question to you is about "poor network performance" and what exactly you mean by it.

Is it that (a) the network drops connections frequently, (b) the network is slower than tapping tree sap for molasses in a Wisconsin February, or (c) something else?

For case (a), your staging solution is no better. A drop is a drop and will kill the back-end database whether you are directly updating or doing a two-stage update. If the network is crap for a direct update, it will be crap for a two-staged update. In the final analysis, Access must do the work in the host, and that means a ton of SMB protocol traffic between front end and back end, probably with the same amount of data. Even worse, if you pull down EVERYTHING the user COULD have touched and then upload it all again (rather than only what the user actually touched), you expose yourself to a greater chance of disruption.

For case (b), the arguments in (a) apply except that you would face less danger of having a fatal network drop-out. Your real solution is to assure the users that they need to be patient and WAIT for the network to catch up to you.

For case (c) which is neither instability nor speed, describe your situation, please.

If you go to a more direct update method and assure that your tables are properly indexed to support your chosen selectivity, then Access will be smart enough to selectively fetch and update records for you without you having to go through the headache of after-the-fact determining what was updated.

Then, of course, there IS the odd chance that if enough people tell the boss that his iceberg-slow network is costing him productivity, he MIGHT do something about it.
 
I was just in the process of writing a less eloquent version of what the doc man has just written. Agree with every word he wrote.

Whether it's what you want to hear or not, the necessary solution is to fix the network issues. As there have now been three replies all questioning this strategy and deliberately refraining from explaining how to implement it, I suggest you step back from this plan and describe the network setup and the precise issues with it.

Where are your backend tables stored? SQL Server, Access, SharePoint, or...?
Wired or wireless? LAN, WAN, or cloud? Etc...
 
The network is slower than molasses. It is taking 3-4 minutes to populate lookup tables, forms, etc., and in some cases the app is non-responsive. We didn't have this issue before, but we have been told that the network will not be upgraded and that we should figure out a workaround.

So the new process is to create linked tables and then populate local tables in the database, so that all lookups, forms, etc. can be processed in seconds instead of minutes (or no response at all).

We have an alternative solution working, but we want to further improve the performance because it is still slow (1-3 minutes).

So we are looking to refine the alternative. Instead of populating the local tables when a user logs in, which still takes a couple of minutes, we are trying to implement something so that, once there is an update to the source tables, it will 'trigger' our VBA code to run and download the most recent data to the local tables. Then, when a user signs in, it will just filter/query the local tables and the user is ready to add, edit, or delete information.
 
If we had the option to improve the network we obviously would have done so. It's just not an option; we have to develop a workaround.
 
I'll try asking these questions again

Where are your backend tables stored? SQL Server, Access, SharePoint, or...?
Wired or wireless? LAN, WAN, or cloud? Etc...

Please give details of your network.
Also please explain what steps have been taken to optimise the tables, e.g. indexing, etc.

What exactly do you mean by lookup tables? Tables used by comboboxes?
 
We didn't have this issue before, but we have been told that the network will not be upgraded and that we should figure out a workaround.

Typical boss-speak for "I'm too cheap to consider this." If the boss knew how many person-hours were wasted waiting for your network, s/he would not be so sure that the network won't be upgraded. If lost labor were coming directly out of the boss's pocket, the answer would be different.

It's just not an option; we have to develop a workaround.

OK, let's work with what you have now. But this will go only so far before it becomes intolerable. Your boss might have to yell a lot but the answer isn't going to change past a certain point. Be prepared for that.

Your best solution is to abandon that multi-staged approach. The fewer touches you have to make to your tables, the better off you are. Learn how to make your back end (BE) shared optimally. Having a "local" lookup table works only for relatively static tables. If the tables have to be re-synchronized often enough, they get in the way.

One strategy is to assure that all SELECT queries be tuned to NO LOCKS and all action queries (insert, update, delete, make table) be tuned to OPTIMISTIC LOCKS. NEVER EVER use pessimistic locks. NEVER try to lock things yourself. Let Access manage all locking for you and don't give it too many locks to manage.
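
If you do open recordsets in VBA, the same principle can be applied there. A minimal sketch, in which the table name tblTasks is made up for illustration:

Dim rst As DAO.Recordset
Set rst = CurrentDb.OpenRecordset("tblTasks", dbOpenDynaset)
rst.LockEdits = False   ' False = optimistic: the lock is held only during .Update
' ... make edits here ...
rst.Close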

Always use queries that do any lookups via JOINs to translation tables (as opposed to having some kind of DLookup or equivalent in the query).
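
For example (table and field names invented for illustration), prefer the JOIN form:

SELECT T.TaskID, T.StatusID, S.StatusName
FROM tblTasks AS T
INNER JOIN tblStatus AS S ON T.StatusID = S.StatusID;

over the per-row lookup form:

SELECT T.TaskID, DLookUp("StatusName", "tblStatus", "StatusID=" & [StatusID]) AS StatusName
FROM tblTasks AS T;

The DLookup version fires a separate lookup for every row, which multiplies the network traffic.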

Always use pre-defined / named queries, with parameters where necessary, rather than building ALL queries on-the-fly. The queries that appear by name in the AllQueries list have been pre-analyzed and pre-optimized; the on-the-fly stuff has not.
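
A minimal sketch of calling a saved parameter query from VBA; the query name qryMyTaskData and the parameter prmUserID are hypothetical:

Dim qdf As DAO.QueryDef
Dim rst As DAO.Recordset
Set qdf = CurrentDb.QueryDefs("qryMyTaskData")   ' saved, pre-optimized query
qdf.Parameters("prmUserID") = strUserID          ' strUserID assumed set elsewhere
Set rst = qdf.OpenRecordset(dbOpenSnapshot)      ' read-only snapshot, no lock overhead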

Assure that whatever you are using as the selector variable for each user (maybe their login ID?) is indexed in ANY table where it is relevant, even as a foreign key (FK) and even if marked DUPS ALLOWED.
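
If such an index is missing, you can add it in the table designer or from code; a sketch with placeholder names (tblTasks, UserID):

CurrentDb.Execute "CREATE INDEX idxUserID ON tblTasks (UserID)", dbFailOnError
' duplicates are allowed by default, which is what you want for an FK-style index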

If your record IDs are short enough, a natural key is OK. If you have a key that is much longer than four bytes, consider whether a synthetic key (such as an autonumber) would be better for the primary key (PK). (Long keys take longer to process on a slow network because more data needs to be fetched.)
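
In Access DDL terms, a synthetic four-byte key looks like this (table and field names are invented):

CurrentDb.Execute _
    "CREATE TABLE tblOrders " & _
    "(OrderID AUTOINCREMENT PRIMARY KEY, " & _
    "OrderRef TEXT(20), OrderDate DATETIME)", dbFailOnError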

Remember that all parent-child relationships require the PK of the parent to be indexed with NO DUPS. (That's why it's called a PRIMARY key.)

It IS possible to have too many indexes - the Access limit is 32 per table, counting whatever makes up the PK. But having one index for every field on which you commonly do searches or selections or JOINs is not unreasonable.

Do AS MUCH WORK AS POSSIBLE in queries and AS LITTLE WORK AS POSSIBLE in recordset operations and VBA, because with extremely few exceptions VBA is slower than SQL. VBA is implemented as an emulator for a virtual machine that doesn't exist, whereas SQL IS compiled and the setup for the query is quick. With recordsets, your code has to be emulated for every record; with SQL, once the query gets rolling, pure compiled code is running the show.
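
To make that concrete, a sketch with made-up names: this row-at-a-time loop

Dim rst As DAO.Recordset
Set rst = CurrentDb.OpenRecordset( _
    "SELECT * FROM tblTasks WHERE Status = 'Open'", dbOpenDynaset)
Do Until rst.EOF
    rst.Edit
    rst!Status = "Closed"
    rst.Update
    rst.MoveNext
Loop
rst.Close

does the same job as the single set-based statement

CurrentDb.Execute _
    "UPDATE tblTasks SET Status = 'Closed' WHERE Status = 'Open'", dbFailOnError

and the second form lets the compiled engine do all the work.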

To safeguard yourself, look up the DAO .Execute method online and learn to use it for all SQL action queries if you are not already doing so. DoCmd.RunSQL works but can be slower. The downside to .Execute is that you must take a bit more care in setting up the SQL string you want to execute. The upside is that .Execute, when used with the dbFailOnError option, will AUTOMATICALLY roll back the action query on error (as though you had used a one-query BEGIN TRANSACTION/COMMIT pair), AND you can check the DAO object's .RecordsAffected property to see how many records were updated, inserted, or deleted. If there is an error under .Execute with dbFailOnError, the whole action sequence is rolled back and you can predict where you are. Also, the error that gets signalled is usually at least as good as, and often BETTER than, the errors you get in .RunSQL cases.

The reason .Execute is faster than .RunSQL is that .Execute goes directly to the DB Engine whether we are talking Jet (old versions) or Ace (new versions), whereas .RunSQL is run from inside the Access environment. There IS a small difference in execution speed, but this small speed difference is magnified by having a slow network. TRUST me on this fine point - I have been there and done that. No fun, I assure you. But .Execute IS faster on a slow network.
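
Putting that together, a minimal sketch (table and field names are illustrative only):

Dim db As DAO.Database
Set db = CurrentDb
On Error GoTo Whoops
db.Execute "UPDATE tblTasks SET Completed = True " & _
           "WHERE TaskID = " & lngTaskID, dbFailOnError   ' lngTaskID assumed set earlier
Debug.Print db.RecordsAffected & " record(s) updated."
Exit Sub     ' this sketch assumes the code lives inside a Sub
Whoops:
MsgBox "Update failed and was rolled back: " & Err.Description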

Using temporary tables can be workable, but only if you take great care in the data re-synchronization phase of what you are doing. Design your temp tables to use the MINIMUM number of records needed to do what must be done.
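
For example, pulling down only the current user's records into the local working table; the UNC path and all names below are placeholders:

CurrentDb.Execute _
    "INSERT INTO tblLocalWork " & _
    "SELECT * FROM tblSourceTasks IN '\\server\share\backend.accdb' " & _
    "WHERE UserID = '" & strUserID & "'", dbFailOnError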

This should get you started in re-thinking what you have been doing. If you have more specific questions, post them. We are literally world-wide. I'm in USA Central Time Zone which at the moment is 5 hours behind GMT.
 
You should also consider converting to SQL Server (the Express version is free and will work for small groups). To make the most of the conversion, you will need to make some changes to your forms: add search criteria so that your edit forms return only a single main-form record. The point of using queries with criteria is to minimize network traffic. Users can't deal with thousands of records anyway, so let them retrieve records one at a time. Depending on how the app is structured, this may be easy or difficult, but when the network is your problem, you need to minimize the traffic as best you can. Also, since the relationship between SQL Server and the Access FE is asynchronous, a flaky network is much less of a problem because Access doesn't expect to be permanently connected to the BE as it does with Jet and ACE.
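
One common pattern is to open the edit form already filtered down to one record; a sketch in which the form and control names are invented:

DoCmd.OpenForm "frmTaskEdit", acNormal, , "TaskID = " & Me.lstTasks
' The form's RecordSource should itself carry a matching criterion,
' so that only the selected record ever crosses the network.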
 
You could use a data macro; that will definitely solve your problem.
 
I was going to eventually suggest a data macro (assuming it is Access 2010 or later), but with the network issue the OP described I was unsure it would be a solution, so I kept silent. Didn't want my hand slapped again ;)!
 
Yes, NG, a data macro would work, but I think it is easier with its VBA counterpart. Just add code to the AfterInsert event of the form:

Private Sub Form_AfterInsert()
    ' Copy the record just inserted through this form into Table2.
    ' Remember to properly delimit your textbox values:
    '   double/single quotes for text, # for dates (US format),
    '   nothing at all for numbers.
    ' Here Field1 is assumed to be text, Field2 numeric, Field3 a date.
    CurrentDb.Execute _
        "INSERT INTO Table2 (Field1, Field2, Field3) VALUES ('" & _
        Me.Field1 & "', " & Me.Field2 & ", #" & _
        Format(Me.Field3, "mm/dd/yyyy") & "#)", dbFailOnError
End Sub
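
One caveat: Form_AfterInsert fires only for records added through that form. Records inserted by queries or by other front ends will not be caught, which is where a table-level data macro has the edge.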
 
