Data Macro - Strange Performance Issue

mellamokb · New member · Joined Jan 8, 2025 · Messages: 4
Hi All-

We are exploring the use of Data Macros to add audit tracking to our core product, but they have strange effects on performance that make them an unreliable solution. Maybe someone can help diagnose the following.

Here is the simplest way to repro:
1. Build a blank app with a table like "tblEmployee" with a bunch of fields, including one Date/Time field called "LastUpdated".
2. Add a numeric column to the table called "ListOrder".
3. Write a VBA routine that loops through a recordset on the table, setting ListOrder to 1, 2, 3, ... according to Last/First name order.
4. Run the routine - note that it is very fast, say 5-10 ms for 100 employees.
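For reference, a quick way to take timings like these is VBA's built-in Timer function (the routine name here is illustrative):

```vba
' Rough timing harness. Timer returns seconds since midnight as a Single,
' so its resolution is coarse; for reliable sub-10 ms measurements a
' higher-resolution API such as timeGetTime would be better.
Dim t As Single
t = Timer
ReorderEmployees   ' hypothetical name for the update loop in step 3
Debug.Print Format$((Timer - t) * 1000, "0.0") & " ms"
```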

Now, add a Before Change Data Macro that does SetField LastUpdated to Now(). If you run the test VBA routine, you'll notice it is still very fast, maybe only a few ms slower.

Next, create a new test form with a ListBox bound to tblEmployee and open the form. If you run the VBA routine while the form is open, you'll notice it is still fast.

BUT - if you put a button on the form with a Click event that calls the VBA routine above, clicking the button is 10-20x slower! Uh, what?

This behavior is very mysterious. It gives the impression that the simple act of adding a Data Macro to a table has all sorts of bizarre performance side effects - as if something suddenly has to "wait for synchronization" - and only when the action is performed from a bound form. The difference goes away when the Data Macro is removed.

I'm at a loss how to even troubleshoot this further, let alone devise a general solution for our entire application, with 250k LOC and hundreds of forms. Obviously this one example could be addressed with a workaround, but it is an unexpected problem with unknown effects all over the application.
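One workaround worth trying - a sketch only, not verified against this repro, and frmTest / ReorderEmployees are illustrative names - is to detach the bound form from the table while the batch runs, so the updates aren't echoed into the bound controls:

```vba
' Hypothetical mitigation: unbind the form during the batch, then rebind.
Public Sub ReorderWithFormDetached()
    Dim src As String
    src = Forms!frmTest.RecordSource
    Forms!frmTest.RecordSource = ""    ' detach: edits no longer echo to the form
    ReorderEmployees                   ' the batch update loop
    Forms!frmTest.RecordSource = src   ' rebind; the form requeries itself
End Sub
```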

This is a long shot, but any inspiration or insight from the community would be welcome - specifically, how to build a Data Macro for auditing that doesn't have adverse side effects on a legacy LOB application.

Best Regards
 
Hi. Welcome to AWF!

The instructions from this website say to use the After Update event. I wonder if that would make any difference compared to your experience with the Before Change event?
 
I am kind of curious about this, and to hear from people who employ data macros. Data macros do not seem commonly used in Access: I never see people asking about them, rarely see any examples, and there is not a lot of discussion on the internet. I will admit I have kind of avoided them, mainly for the above reasons. Even when they were introduced there was not much fanfare. They also came out around the same time as multi-value fields, split forms, and navigation forms, so I threw them into the pile of solutions for non-problems. But if they were not problematic as described here, they would seem very worthwhile.
 
Thanks, theDBGuy!

The main reason I chose Before Change is that I wanted to update a column in the same row, and I don't think I can do that easily with After Update. I also thought it would have the highest performance at scale/volume. This is primarily the foundation of a sync tool; I just need to identify the rows that changed since a given timestamp.

As an experiment I went ahead and built an After Update Data Macro instead, one that writes a simple row to an audit table. In my test with the 100-employee test table, the VBA loop to update all records took around 100 ms. When I kicked it off from a command button on a form instead, it took 2,500 ms - so it seems to show the same ~20x slowdown.
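For reference, an After Update audit macro of that general shape, written out as designer actions (the audit table and field names here are made up for illustration, not taken from my database):

```
CreateRecord In tblAuditLog
    SetField  Name: [tblAuditLog].[EmployeeID]   Value: [tblEmployee].[ID]
    SetField  Name: [tblAuditLog].[ChangedAt]    Value: Now()
```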

MajP - agreed; honestly, if it weren't for this strange behavior, this would be the perfect general solution to our problem.
 
AWF moderator @jdraw published an example here a few years ago, so he may be able to contribute to this discussion.
 
FWIW: I think the problem is not the DM itself, but the way in which it is used.

Here: "Now, add a Before Change Data Macro that does SetField LastUpdated to Now(). If you run test VBA script, you'll notice it is still very fast, maybe only a few ms slower."
In this case one record is being modified by the Data Macro. That record is, of course, the record where you change the LastUpdated value.

Here: "Write VBA script that loops through recordset to table, setting ListOrder to 1, 2, 3 according to Last/First name order."
If you are updating one field in that table and doing so on every record in a Recordset loop, that DM has to fire each time that update happens, i.e. it fires as many times as there are records in the recordset. Whatever else is happening, that seems to me to be a pretty slow process. I'm assuming here that I accurately read the description of what you are doing.

In the first instance, DM fires once.
In the second instance, the DM fires repeatedly as the ListOrder is updated. Yikes.

It also leads to a question:

What's the point of the ListOrder field? You can sort the table on Last/First in a query or report without that calculated field. So what problem does adding the field solve? Where can you use it that you can't use the Last/First sort?

If you insist on keeping ListOrder, I would think you can modify the DM to ignore changes to the ListOrder field and only fire when at least one other field is updated - limiting it to records that have actually been updated for some reason other than recalculating the ListOrder. That way it fires once, not multiple times.
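As a sketch of that idea, written as designer logic rather than runnable code: Before Change data macros expose an Updated("FieldName") function, so the SetField can be guarded along these lines. (Note this simple guard also skips legitimate edits that happen to touch ListOrder along with other fields; a fuller version would check the other fields individually.)

```
If Updated("ListOrder") = False Then
    SetField  Name: [LastUpdated]   Value: Now()
End If
```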
 
I have avoided Data Macros, not because of any currently known problems, but because most of my BEs either immediately or potentially get upsized to SQL Server, and I would have to convert the Data Macros to triggers. I'm not sure there is a 1:1 conversion, so rather than have to fix the problem later, I don't create it. Also, it really depends on the circumstances of your updates. If the tables are updated primarily through forms, it is probably easier to have all the validation code in VBA in the BeforeUpdate event of the form; this makes testing and debugging easier. Since Data Macros are little used, a new person taking over the app will not immediately think to review the tables for them, so you should document their use carefully and clearly. So, the need to use them comes down mostly to:
1. Are other applications linking to your BE and performing updates? This becomes dangerous if the other team hasn't properly implemented your business rules. You can overcome this to some degree by putting your validation into Data Macros. But - and there is always a but - Data Macros are not a full-featured solution and cannot run VBA, so you cannot do ALL your validation there, and ultimately that is my real problem with them.
2. Is YOUR app updating the tables from multiple forms? (See #1 - you have to change the way you do validation to avoid having to create duplicate code.)
 
Thanks for the numerous responses!

First, I am aware this is not ideal, you're preaching to the choir :) We're on an evolutionary strategy toward something better, but it's a multi-year plan.

Second, I want to be careful to communicate the subtlety of what I'm asking.

The DM performance is by itself reasonable. What is problematic is the specific use of the DM in the context of a form actively bound to the same table.

The actual VBA code in the production application looks like this, in a function called "ReorderEmployees"

Code:
Dim qs As String
Dim rs As DAO.Recordset
Dim i As Long

qs = "SELECT * FROM tblEmployee ORDER BY LastName, FirstName;"
Set rs = coDb.OpenRecordset(qs)   ' coDb is the app's module-level DAO.Database

i = 1
With rs
    Do While Not .EOF
        .Edit
        !ListOrder = i   ' renumber in Last/First order
        .Update
        .MoveNext
        i = i + 1
    Loop
    .Close
End With

Note the following scenarios:
1. When I call this function directly from the Immediate Window - takes around 10 ms.
2. With the Before Change DM on the table, calling this function directly from the Immediate Window - takes around 10 ms.
3. When I call this function on Form_Open without the DM - takes around 10 ms.
4. When I call this function on Form_Open on a form bound to tblEmployee, with the DM on the table - takes around 5-6 seconds.

What is the difference between #2 and #4? They both have the DM installed, and they both run the exact same code, looping through the exact same number of rows in exactly the same way... that is the mystery I'm asking about.
 
Data Macros were introduced at the point where Access Web Apps were launched. They were created specifically to support Access Web Apps, although they are still usable today. Originally, in fact, it's my understanding that they were implemented as triggers on the Azure SQL tables used with AWAs.

I'm not sure I'd consider using Data Macros for data validation purposes. I see DMs as being useful for tasks like creating change audits in an audit table, or as here, to set change tracking values in a LastUpdated and LastUpdatedBy field.
 
FWIW, I think they have their uses. I never did a comparison test, but they seem to be faster and more efficient than using append/update queries. Perhaps Colin could be persuaded to do one of his famous Myth-Busters evaluations on this...
 
Can you share a repro accdb to give us a first-hand look?
 
GPGeorge - I think this should do it. It has tblEmployee with a Before Change DM that sets LastUpdated on affected rows, plus frmTest with an action button and a ListBox bound to tblEmployee.

If you directly run the test() macro (e.g. from the Immediate Window), which calls ReorderEe, it prints something like 7.8 ms.

But if you run ReorderEe by clicking Command18 on frmTest, it typically takes 250 ms.

I would expect there to be no difference.
 

Attachments

Just out of interest, are you running the app as a trusted document or from a trusted location?
The two have very different effects with the latter being around 20x faster in my tests.
See my article:
 
With reference to post #5: I am jdraw, and I did provide a sample database at AWF. The whole thread has a discussion of using data macros.

You might want to describe/provide an example of what you mean by "audit tracking". Typically you would have an audit table and a record for each change to your operational tables. My sample data macro database shows the audit table (tblAuditLog) with audit records.
I agree with George that your testing raises questions -- updating all records isn't typical.

I modified your code to print start and end time to:
a) insert 1000 records, and
b) run the code behind the button (updating 1000 records)

Adjusted code:
Code:
Function InsertEe()
    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Dim i As Long
    Set db = CurrentDb
    Set rs = db.OpenRecordset("tblEmployee")
    Debug.Print "Insert Start: " & Now() '===============
    For i = 1 To 1000
        rs.AddNew
        rs("FirstName") = "First" & i
        rs("LastName") = "Last" & i
        rs.Update
    Next
    Debug.Print "Insert End: " & Now() '================
    rs.Close
    Set rs = Nothing
    Set db = Nothing
End Function

Code:
Function ReorderEe()
    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Dim i As Long
    Dim t As Single
    t = Timer
    Debug.Print "ReorderStart: " & Now() '==============================
    Set db = CurrentDb
    Set rs = db.OpenRecordset("Select * From tblEmployee Order By LastName, FirstName")
    i = 1
    Do Until rs.EOF
        rs.Edit
        rs("ListOrder") = i
        rs.Update
        rs.MoveNext
        i = i + 1
    Loop
    rs.Close
    Set rs = Nothing
    Set db = Nothing
    Debug.Print "ReorderEnd:   " & Now() '==========================
   ' Debug.Print 1000 * (Timer - t) & " ms" ' commented out==============
End Function

My debug prints (1000 records):
Code:
Insert Start: 08-Jan-25 6:20:37 PM
Insert End: 08-Jan-25 6:20:37 PM
ReorderStart: 08-Jan-25 6:21:03 PM
ReorderEnd:   08-Jan-25 6:21:04 PM
 
I agree with George that your testing raises questions -- updating all records isn't typical.
I have the same question. I assume that list order is for example purposes only and not something you are really doing.

However, my next question for the audience is whether the Data Macro is exposed in VBA. If it is super slow when doing batch operations, I would try to turn it off, do the batch, and turn it back on, and have the batch code handle the audit itself. Is there a way to turn off (delete) the macro before the batch and turn it back on (add it) after the operation? Then, since you are doing a batch update, write each audit record through the batch code.

I will do something like this with event procedures: write the event procedure, then to stop the event firing:
Me.CmdBtn.OnClick = ""
and to turn it back on:
Me.CmdBtn.OnClick = "[Event Procedure]"   (or a function: "=SomeFunction()")
 
Is there a way to turn off (delete) the macro before the batch and Turn on (add) the data macro back after the operation.
@MajP No, unfortunately a data macro is bound to the table and cannot be manipulated from VBA (as far as I know - I may be wrong, though).

But there's a workaround. When I was new to Access and knew nothing about how things work (I'm still not much better), I used to use a trick:
  • Add a Yes/No field to your table; let's call it ByPass. Set the default value to Yes.
  • Put your data macro's actions inside an If that checks whether ByPass is True. This ensures the Data Macro takes effect only when ByPass is Yes.



To turn the Data Macro on/off from VBA, you run an update query on the ByPass field. For what you want to do above:
  • In VBA, run an update query and set the value of ByPass to False. You don't need to update all records in the table - only the records involved in your next batch update.
  • Do whatever you want with the table: run a batch query, delete, or any other action you need.
  • Run an update query and set the value of ByPass back to True.
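A minimal VBA sketch of that sequence (DoBatchUpdate is a placeholder for the actual batch routine, not something from the attached database):

```vba
Dim db As DAO.Database
Set db = CurrentDb
' Disarm: the macro's ByPass check now fails for these rows
db.Execute "UPDATE tblEmployee SET ByPass = False;", dbFailOnError
DoBatchUpdate   ' placeholder for the bulk operation
' Re-arm
db.Execute "UPDATE tblEmployee SET ByPass = True;", dbFailOnError
```

One caveat: the re-arming UPDATE itself modifies rows while ByPass is being set back to True, so depending on whether the macro's If-check reads the new or the old value, it may retrigger the macro; guarding on [Old].[ByPass] as well may be needed.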
 
I modified your code and it now runs at 136-140 ms.
 

