Error when clicking too fast

I have seen this behavior on SQL Server-connected backends occasionally. Almost as if Access continues to process things when it shouldn't. I can't remember the app concerned, unfortunately.

The only way I was able to totally prevent it was to create a small generic routine that I passed the action to. It popped up a modal "Processing" form with nothing on it except a close event that fired after the action had been processed.
I put a command button on it that did nothing when pressed; it seemed to capture the random key presses.
It was made easier by having a generic function to run the pass-through queries, one that knew whether it was returning records or not.

It seemed to cure the problem, but it was a minor pain to implement.
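The generic runner was along these lines - a minimal sketch rebuilt from memory, so treat the names (the saved query qptGeneric, the RunPT function) as placeholders rather than anything from the real app, and you'd want proper error handling around it:
Code:
Public Function RunPT(ByVal strSQL As String, ByVal ReturnsRecords As Boolean) As DAO.Recordset
' minimal sketch of a generic pass-through runner (names are placeholders)
' assumes a saved pass-through query "qptGeneric" whose ODBC connect
' string already points at the SQL Server back end
    Dim qdf As DAO.QueryDef
    Set qdf = CurrentDb.QueryDefs("qptGeneric")
    qdf.SQL = strSQL
    qdf.ReturnsRecords = ReturnsRecords
    If ReturnsRecords Then
        ' blocking call: does not return until the rows are back
        Set RunPT = qdf.OpenRecordset(dbOpenSnapshot)
    Else
        ' action statement: nothing comes back
        qdf.Execute dbFailOnError
    End If
End Function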
 
If you filter: what happens to the result? Shouldn't that be displayed somehow? What is the purpose of another button click out of impatience?
Sorry, you've lost me a bit here. When I filter (I assemble a text string that is submitted to the SQL Server engine via an ODBC link), I get back a recordset, exactly as I would if I were opening a local table, only not so quickly. Yes, it is displayed. A count request is submitted first, to warn the user if the number of records to be returned is larger than a limit, which each user can set for himself. If below the limit, or the user clicks a "Show me anyway" override button, the recordset is displayed. If a record is edited, that is done in a dialog form, and that form closes upon submission of the edited record. As the form closes, the underlying form is repopulated, which can take up to a couple of seconds, depending on the filtering criteria.
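Roughly, the count-first part looks like this. This is simplified for illustration only - qptSearch, dbo.SomeView, strCriteria and lngUserLimit are stand-ins, not the real objects:
Code:
Private Sub ApplyFilter(ByVal strCriteria As String, ByVal lngUserLimit As Long)
' simplified sketch of the filter flow: count first, warn if over the limit
    Dim qdf As DAO.QueryDef
    Dim rs As DAO.Recordset
    Dim lngCount As Long

    Set qdf = CurrentDb.QueryDefs("qptSearch")      ' saved pass-through query
    qdf.ReturnsRecords = True

    qdf.SQL = "SELECT COUNT(*) AS N FROM dbo.SomeView WHERE " & strCriteria
    Set rs = qdf.OpenRecordset(dbOpenSnapshot)
    lngCount = rs!N
    rs.Close

    If lngCount > lngUserLimit Then
        If MsgBox(lngCount & " records match. Show me anyway?", _
                  vbYesNo + vbQuestion) = vbNo Then Exit Sub
    End If

    ' under the limit, or the user overrode the warning: fetch and display
    qdf.SQL = "SELECT * FROM dbo.SomeView WHERE " & strCriteria
    Set rs = qdf.OpenRecordset(dbOpenSnapshot)
    ' ... repopulate the form from rs here ...
    rs.Close
End Sub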

The background color of the form is changed to bright yellow as a caution light to indicate the DB is working on something, and when the recordset is repopulated, the last act before turning control back over to the user is to return the background color to green. But some users will start to click on other filtering options while the form is still yellow and Access is waiting for the recordset to return - either after an edit, or after submitting some other filtering criteria and then deciding they want something else before seeing the results of what they initially asked for. Instead of either ignoring the click, or queueing it up and dealing with it AFTER the recordset is populated, Access catches such a click and throws this goofy message. It doesn't really hurt anything - all you have to do is click the OK button, the only button in the message box, and it will continue as if nothing had happened. But it looks dorky and spooks the users. It should not be doing that.
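The caution light itself is trivial - essentially this, with the green just a stand-in for whatever color the real form uses:
Code:
Private Sub SetBusyLight(ByVal Busy As Boolean)
' yellow while the DB is working, back to green when control returns
    If Busy Then
        Me.Detail.BackColor = vbYellow
    Else
        Me.Detail.BackColor = RGB(0, 180, 0)    ' stand-in for the real green
    End If
    Me.Repaint      ' force the color change to paint before any blocking wait starts
End Sub
SetBusyLight True goes in just before the query is submitted; SetBusyLight False is the last act before handing control back to the user.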
 
I have seen this behavior on SQL Server-connected backends occasionally. Almost as if Access continues to process things when it shouldn't. I can't remember the app concerned, unfortunately.

The only way I was able to totally prevent it was to create a small generic routine that I passed the action to. It popped up a modal "Processing" form with nothing on it except a close event that fired after the action had been processed.
I put a command button on it that did nothing when pressed; it seemed to capture the random key presses.
It was made easier by having a generic function to run the pass-through queries, one that knew whether it was returning records or not.

It seemed to cure the problem, but it was a minor pain to implement.
That seems promising. A pain, as you say, but I'm already showing a different background color - I could just add it to that. But if it's modal, doesn't it stop all other processing as long as it's displayed? And if it's not modal, what is to stop the user from clicking a button under it, unless you make it so large that it covers the entire screen?
 
I WAIT FOR THE RESULT!!!
Don't hold your breath. And don't believe your lyin' eyes. Access isn't waiting, and therefore you did, intentionally or not, spawn an asynchronous process.
I do not do any batch processing. That is a very antiquated concept.
OK, I'll tell that to my clients (like American Express) that create tens of thousands of invoices every day and we won't forget the one that has to communicate all daily activity to the State at the end of the day. And then process the return file the next day in order to ensure the billing information is all valid so that the State can pay the service providers. And then there's the client that every day has to retrieve usage data from the electric company for all clients using their electricity providers. Sometimes the batch communication is bidirectional and sometimes it's not. Usually it is large volumes of data - my EDI to the state can get to a million rows so I cut it into pieces. Not because of any problem except for the FTP. It can be flaky with very large files. EDI isn't particularly efficient. It is controlled by a third party and basically serves the lowest common denominator. It is still far better than trying to get agreement between dozens of business partners on what a transaction should look like.

Apparently you've never heard of EDI. That's how the real world runs. Doctors' offices use it to send billing information to insurance companies. Auto manufacturers use it to send orders to their suppliers. Suppliers send invoices to their business partners. It really isn't possible for every business to be connected to every other business, so EDI is a middle man. It standardizes the communications, much the way ODBC standardizes SQL. When you use Access queries, you write Access' variant of SQL going in; ODBC converts it to T-SQL going out so that SQL Server will understand it. Or, using a different driver, it converts to Oracle or DB2 or whatever. Each company can create its own translation of its own data to the EDI transaction and vice versa, or the company can buy software or use a service to control the whole exchange.
I don't know where you got your 'hundreds of thousands' notion.
If I can get virtually instantaneous results from multi-million row tables, you really ought to not have a problem with 130,000. That is a very small number of rows. Have you looked at your table schema? Have you looked at the relationships and the indexes? For complicated joins, have you considered making Views and using the views instead?
 
In order to stop Access, the popup MUST be modal. It will prevent you from clicking anywhere except on the close button of the popup. That is along the lines of what I suggested in my first reply but it does require that you identify the potential problem queries.

You also need to take a long, hard look at your schema and your queries if you are experiencing wait times long enough for users to get antsy. Take them into SQL Server and run them there so you can see the execution plan that gets generated. For two reports, the selection was so convoluted that I ended up having to create stored procedures to assemble the separate recordsets and then put them together. Essentially, I had to control how the query engine evaluated the query and impose my will on it because it was choosing poor execution plans. Since you are not using bound forms, who cares if the recordset is not updateable.
 
There are about 130,000 records in the entire database
That's not much, and no real reason for multi-second waits.

Could it be that the design of your queries is quite suboptimal?
Or even worse: with a bad data structure (data model?), often only bad queries are possible.
Before creating help and workarounds, I would check the basics.
 
Don't hold your breath. And don't believe your lyin' eyes. Access isn't waiting, and therefore you did, intentionally or not, spawn an asynchronous process.
If it wasn't waiting, it would be doing the things that come immediately after submitting the query. It is not doing any of those things, but it does commence doing them as soon as this stupid error message gets dismissed. So I conclude that it is waiting, no matter what you think.


OK, I'll tell that to my clients (like American Express) ...
Humble bragging is not attractive, and not relevant to the question I asked. Batch processing is fine if you have that sort of workload. I do not, and none of that has anything to do with this discussion.

Apparently you've never heard of EDI.
Also not relevant to the discussion.

If I can get virtually instantaneous results from multi-million row tables, you really ought to not have a problem with 130,000. That is a very small number of rows. Have you looked at your table schema? Have you looked at the relationships and the indexes? For complicated joins, have you considered making Views and using the views instead?
The pure number of records in a database is far from the only factor in determining how fast a query responds, which you must know if you are actually doing any of this stuff. Some things respond very quickly, some take a while. It is not possible to optimize for everything at the same time. I have this setup optimized for the stuff that users do the most, and I get nearly instant response on that. Some things take longer, and it is those less common ones that will occasionally lag long enough for the user to get ahead of the app.

130K is not a very small number of rows. It is obviously nowhere near any capacity limits for a serious database engine, but it is large enough to give it some work to do, if the joins and search criteria are complex enough. Yes, I could probably add some indexes and persisted views, but I don't want to bog down the update process too much. Adding all that stuff makes searching faster, and everything else slower.

You may also tell me to just get a faster server, so I will attempt to forestall that by dealing with it now. The server is fast enough for the work it's doing, and I'm not going to try convincing management to shell out thousands for a faster box, ESPECIALLY when there is no guarantee that such an outlay will fix anything. It's not a server problem, it's an Access problem - it's intercepting clicks when it's supposed to be waiting.

And it very obviously IS waiting - as I have written several times, the code immediately after the query executes only AFTER the error message is dismissed. It is NOT spawning an asynchronous process and going on its way. When users do not click in this period, all works exactly as it should - as soon as the recordset is returned, Access continues by executing the very next statement, right on through to the end.
 
In order to stop Access, the popup MUST be modal. It will prevent you from clicking anywhere except on the close button of the popup. That is along the lines of what I suggested in my first reply but it does require that you identify the potential problem queries.
That's how I understand modal forms as well. But if it stops all Access activity, how does Access then submit the query? All my modal forms work that way - everything stops until the user deals with the modal form. I pop up a modal form and the app will sit, waiting for the user to do something, rather than going to get the records the user asked for. Only after the user dismisses the modal form will Access continue, only then submitting the query and again, waiting for the results, which is what is causing the entire problem in the first place. I may be missing something, but I don't see how that solves the problem.

You also need to take a long, hard look at your schema and your queries if you are experiencing wait times long enough for users to get antsy. Take them into SQL Server and run them there so you can see the execution plan that gets generated. For two reports, the selection was so convoluted that I ended up having to create stored procedures to assemble the separate recordsets and then put them together. Essentially, I had to control how the query engine evaluated the query and impose my will on it because it was choosing poor execution plans. Since you are not using bound forms, who cares if the recordset is not updateable.
They're not exactly getting antsy, they're just a bit careless. It's not a delay where they get frustrated with a long wait time and start pounding on the keyboard, trying to get a response. In fact, it's sort of the opposite. Most of the app gives instant response, and they are used to that - click and it's done, immediately on to the next task, exactly as it should be. Computers should wait for users, not the other way around. But once in a while, something is not instant, and they have simply clicked on the next button in the process, oblivious to the fact that this time, the app was still doing something. That's what brings up the error message, and that is what I am trying to get rid of.
 
Would it be conceivable to place a transparent form over your actual form?
 
Would it be conceivable to place a transparent form over your actual form?
You mean like a mask, that would keep them from being able to touch anything, but would not mangle the display? That's an intriguing idea - thank you.

Never done anything like that, but it sounds very much worth trying. I'll give it a shot and see how it works.
 
But if it stops all Access activity, how does Access then submit the query?
It doesn't stop activity in the Popup. It just blocks everything else. So you have an OpenForm command in form1, the execution of code in form1 stops at that point and cannot resume until the popup closes. You can submit the query from the popup in the form's open event. You can give the user an "emergency" close button if that would make him feel better. When the popup closes, control returns to the form that opened the popup at the line following the OpenForm command.
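A bare-bones sketch of that, assuming a popup form named frmProcessing and a placeholder RunTheQuery routine for the pass-through call. I've put the work in a short Timer rather than the Open event only so the "Processing" label actually paints before the blocking call starts; the idea is otherwise the same:
Code:
' in the calling form:
Private Sub cmdFilter_Click()
    DoCmd.OpenForm "frmProcessing", WindowMode:=acDialog
    ' code stops on the line above until frmProcessing closes,
    ' and the dialog swallows any clicks on the form underneath
    ' ... repopulate this form from the results here ...
End Sub

' in frmProcessing's module:
Private Sub Form_Load()
    Me.TimerInterval = 100          ' give the form a moment to paint
End Sub

Private Sub Form_Timer()
    Me.TimerInterval = 0
    RunTheQuery                     ' placeholder for the pass-through call
    DoCmd.Close acForm, Me.Name     ' closing hands control back to the caller
End Sub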

This could be a transparent form. I don't remember how to create them but I'm sure there's directions somewhere.
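The trick, as I remember it, is to make the popup a borderless layered window and drop its alpha with the Windows API. Treat this as a sketch from memory: it shows VBA7 declares only, and the alpha has to stay above zero or the clicks fall through to the form underneath instead of being absorbed.
Code:
' in the mask form's module (Pop Up = Yes, Border Style = None)
#If VBA7 Then
    Private Declare PtrSafe Function GetWindowLong Lib "user32" Alias "GetWindowLongA" _
        (ByVal hWnd As LongPtr, ByVal nIndex As Long) As Long
    Private Declare PtrSafe Function SetWindowLong Lib "user32" Alias "SetWindowLongA" _
        (ByVal hWnd As LongPtr, ByVal nIndex As Long, ByVal dwNewLong As Long) As Long
    Private Declare PtrSafe Function SetLayeredWindowAttributes Lib "user32" _
        (ByVal hWnd As LongPtr, ByVal crKey As Long, ByVal bAlpha As Byte, ByVal dwFlags As Long) As Long
#End If

Private Const GWL_EXSTYLE As Long = -20
Private Const WS_EX_LAYERED As Long = &H80000
Private Const LWA_ALPHA As Long = &H2

Private Sub Form_Open(Cancel As Integer)
    ' mark the window as layered, then make it nearly invisible;
    ' it still sits on top and soaks up the clicks
    SetWindowLong Me.hWnd, GWL_EXSTYLE, _
        GetWindowLong(Me.hWnd, GWL_EXSTYLE) Or WS_EX_LAYERED
    SetLayeredWindowAttributes Me.hWnd, 0, 10, LWA_ALPHA
End Sub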

and none of that has anything to do with this discussion.
Which of course we had no way of knowing. All we knew was that the user initiated some action and it took a long time. Well, batch processes can take a while. That's why we run them during off hours. My response was only because you turned up your nose at batch processing (the way you turn your nose up at bound forms) which I have to tell you is alive and well throughout corporations around the world. I can say that with authority, having worked overseas for a foreign government as well as consulted for many large companies in the US. I just named some companies that use it and why. Maybe, instead of turning your nose up at the best features of the product you are using, you might want to learn how to use those features.

130,000 is a lot of records for a human to process but it is a trivial number for Jet and ACE and insignificant for something bigger like SQL Server. You should not be experiencing any slowness unless what you are doing is sub-optimal.
 
It doesn't stop activity in the Popup. It just blocks everything else.
Oh, right. I wasn't thinking clearly about that.

Which of course we had no way of knowing.
I think you might have known, since there was no mention of anything like that until you brought it up to brag about how you had large clients, while constantly insisting that I was doing something which I most definitely am not - spawning an asynchronous process.

Well, batch processes can take a while. That's why we run them during off hours. My response was only because you turned up your nose at batch processing
I know full well how batch processes work. I started on them back in the days when programming meant punching holes in 80-column Hollerith cards. I also know that it is something to be avoided, unless you have volumes of data that cannot be handled on the fly. Almost every normal interaction humans have with computers these days has gone away from batch in favor of real-time transactions. Including banks and other large institutions that you mention. Most of the time, when I do something in a bank that deals with another bank, it happens instantly - literally. I have been in one bank while on the phone with a teller in another bank, and as the local teller tells me he has done what I asked, the teller on the phone says, 'Okay, it's here.'

Batch processes are now used only when volumes are huge. I said from the start I was dealing with a real-time system, where responses are expected now, not after a pile of stuff runs through. I do not turn up my nose at batch processing - it's fine as a fallback when large volumes of data are being done and nobody needs the results quickly. But it's not what I am doing.

(the way you turn your nose up at bound forms)
Nor do I turn up my nose at bound forms. I have no idea where you are dredging up all these behaviors you are trying to attribute to me. I said they were fine, as long as you were doing simple stuff. What I'm doing is far too complicated for a bound form, and it would be foolish to try using one. The right tool for the job, whenever possible.

Maybe, instead of turning your nose up at the best features of the product you are using, you might want to learn how to use those features.
Again, you are imagining things. I know how to use bound forms, and I do use them when appropriate. How you can sit there with a straight face and tell me that I 'should' be using a bound form when you have no idea what I'm doing is a mystery to me. Is it that you do not know how to use an unbound form?

130,000 is a lot of records for a human to process but it is a trivial number for Jet and ACE and insignificant for something bigger like SQL Server. You should not be experiencing any slowness unless what you are doing is sub-optimal.
Sure - it's a trivial number, IF everything is set up optimally for a particular task. Once again, it is not possible to optimize for everything. I have optimized what I could for the most common tasks, and some of that comes at the expense of other tasks. But as I have repeated so many times already, this is not a problem with SQL Server. It is a problem that Access is not queueing mouse clicks, but breaking out of a place where it is supposed to be waiting for a recordset to materialize in order to throw this superfluous and meaningless error message.

If you know something about that, I'd be happy to hear it. But I'm getting a little tired of all the pointless tangents.
 
I also know that it is something to be avoided
You have some really weird ideas. It is what it is. Any process that doesn't require constant human interaction is by definition batch. You can't just arbitrarily put something in one basket or another. The human interaction or lack thereof is the dividing line. And when someone complains about a long running process it is generally because they are trying to have a human interact with a batch process so no, I did not have any way of knowing that you were not making that mistake.

If you don't like the popup solution, I'm not sure what else I can suggest.
 
So, which is it? Why does Access not queue clicks while this code waits for a response?

If your back-end SQL engine uses pass-thru queries or any other asynchronous execution option, your problem is events based on external actions. A CLICK EVENT (button, text box, check box, you name it) is essentially a subroutine call triggered by an external I/O interrupt from the input buttons of the mouse. I may have said that awkwardly but I will clarify: A click event is converted from a physical device interrupt to a synchronous, queued event call and is threaded into the sequence of event calls that would normally be associated with the form. I BELIEVE but cannot PROVE that a click event is threaded to the front of the event-pending queue. (Can't prove it because Microsoft's Access code is not open-source so can't see how they actually do it.)

By way of contrast, the actions that are caused by bound form status changes (OPEN, LOAD, CURRENT, AFTER UPDATE, etc.) are SYNCHRONOUS subroutine calls triggered when Access is performing those implied actions for you. However, Access treats them in a very similar fashion. Every action has a "hook" in the form of an event routine entry point that you can declare to insert your own code into the sequence. Every event is NOT called immediately on execution of the related action. What really happens is that the front event of the event queue is executed next. That is, all events are linearized. From this discussion, it should be obvious that one event cannot interrupt another event, because they use a queue structure to keep track of what they are going to do.

By using unbound forms, you make the form's status-change events short-lived, which means you return from YOUR event code to the Access GUI code (which is invisible to you). Those button clicks CANNOT interrupt a running event routine but if you are just sitting at a screen where you have launched a back-end async process and have then returned from that process launch, you are not running event code. (Access is running GUI code.) So clicking the button will immediately queue up an event as described earlier, and because no event code is blocking it, those pesky little button clicks will make it through.

It's all fine to tell me it's my own fault, but it's not very helpful.

Agreed. So I have explained exactly WHY you are seeing this behavior. Your solution is to find a way to continue running event code while waiting for the process to complete. Just because you have launched a back-end query on a "smart" SQL engine doesn't mean you have to return from the routine that launched it. What you want to do during that time, I cannot say because that would be your choice. But the solution to those unwanted button-click executions is to run event code, because one event cannot interrupt another event.

EDIT: If you have two forms open at once, neither of which is bound, it is NOT CLEAR whether you can run processes under both forms. However, if only a single form is active, then an event on that form cannot interrupt another event on the same form, and that is because Access DOES NOT USE the API call that creates what is called a "FORK" process (parallel execution thread).

Note that for the purposes of this discussion, a TRAP handler is NOT event code even though you define it in what appears to be event context. A trap CAN interrupt event code (but does not interrupt another TRAP routine). I include the note only to forestall questions about interrupts.
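If it helps, here is the shape of what I mean as a rough sketch (qptSearch is a placeholder name): the entire round trip stays inside one event procedure, so any clicks made during the wait queue up behind that event rather than firing in the middle of it.
Code:
Private Sub cmdApplyFilter_Click()
' the wait happens INSIDE this Click event, so Access is running
' your event code, not its GUI loop, while the rows come back
    Dim qdf As DAO.QueryDef
    Dim rs As DAO.Recordset

    Set qdf = CurrentDb.QueryDefs("qptSearch")   ' placeholder pass-through query
    qdf.ReturnsRecords = True
    Set rs = qdf.OpenRecordset(dbOpenSnapshot)   ' blocks until the rows arrive

    ' ... repopulate the form from rs here ...
    rs.Close
End Sub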
 
You have some really weird ideas.
Many things seem weird to people who don't understand them. But that's okay, I'm not trying to educate the world, just solve my own little problem.

It is what it is.
I don't know how to respond to meaningless tautologies.

Any process that doesn't require constant human interaction is by definition batch.
And I have said from the start that I do not have such a process, unless you consider submitting a query to be a batch process, because the SQL Server engine is retrieving records without 'constant human interaction'. But if you do that, you could consider every single statement in any language to be a batch process, because the language execution engine performs that statement without such interaction. At that point, the term batch process loses all meaning.

You can't just arbitrarily put something in one basket or another.
No, and I'm not trying to. But it seems to me that you are.

And when someone complains about a long running process it is generally because they are trying to have a human interact with a batch process so no, I did not have any way of knowing that you were not making that mistake.
Except that I have stated repeatedly that I am not doing that. Yet you keep harping on it, just like you did about the non-existent asynchronous process you claimed I was spawning. I don't know what more to do, when I continuously write one thing and you keep claiming something else. I'm trying to solve a problem, not get involved in an argument.

If you don't like the popup solution, I'm not sure what else I can suggest.
I do like that, and I said so - I'm going to try it and see how it works. It seems a good trick, although I think it should not be necessary - Access should do what it's told and wait, queueing up mouse clicks and dealing with them when appropriate, without going off and generating spurious error messages about something that isn't even happening. But it's far from the only screwy thing that Access does. If this solves the problem, I will be reasonably content.
 
I BELIEVE but cannot PROVE that a click event is threaded to the front of the event-pending queue. (Can't prove it because Microsoft's Access code is not open-source so can't see how they actually do it.)
That would make sense as to what is happening. I don't think it's the full explanation, because for instance, you can't interrupt a stuck loop with a mouse click on the VBA Stop execution button, unless you have a DoEvents line somewhere in the loop. So some mouse clicks get ignored, although there is no wait in such a code loop, as there is when submitting a pass-through query. Maybe that's the difference.
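For example, a loop like this (contrived on purpose) is completely deaf to the mouse until the DoEvents line gives Windows a chance to service the queued clicks:
Code:
Private Sub cmdStuckLoop_Click()
    Dim i As Long
    For i = 1 To 10000000
        ' comment this out and clicks (including on the VBA Stop button)
        ' just sit in the queue until the loop finally ends
        If i Mod 100000 = 0 Then DoEvents
    Next i
End Sub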

By way of contrast, the actions that are caused by bound form status changes (OPEN, LOAD, CURRENT, AFTER UPDATE, etc.) are SYNCHRONOUS subroutine calls triggered when Access is performing those implied actions for you. However, Access treats them in a very similar fashion. Every action has a "hook" in the form of an event routine entry point that you can declare to insert your own code into the sequence. Every event is NOT called immediately on execution of the related action. What really happens is that the front event of the event queue is executed next. That is, all events are linearized. From this discussion, it should be obvious that one event cannot interrupt another event, because they use a queue structure to keep track of what they are going to do.

By using unbound forms, you make the form's status-change events short-lived, which means you return from YOUR event code to the Access GUI code (which is invisible to you). Those button clicks CANNOT interrupt a running event routine but if you are just sitting at a screen where you have launched a back-end async process and have then returned from that process launch, you are not running event code. (Access is running GUI code.) So clicking the button will immediately queue up an event as described earlier, and because no event code is blocking it, those pesky little button clicks will make it through.
Hm, that makes sense. I have only recently started playing with custom event code. Maybe if I had more experience with that, it would be clearer in my mind how this all stacks together.

But submitting a pass-through query and waiting for the results isn't really returning from that process, is it? Yes, Access is not directly involved in what SQL Server engine is doing, but when I specify 'Returns records' in the query, and give it a timeout value, I am telling it to do something - something very specific: "Send off this command and wait for the results, bringing me back the results when you get them." It does not return from that command until it has the records. And even when the user clicks something that generates this error message, when the Ok button is clicked, it continues right there where it was, having by then received the requested records.

Agreed. So I have explained exactly WHY you are seeing this behavior. Your solution is to find a way to continue running event code while waiting for the process to complete. Just because you have launched a back-end query on a "smart" SQL engine doesn't mean you have to return from the routine that launched it. What you want to do during that time, I cannot say because that would be your choice. But the solution to those unwanted button-click executions is to run event code, because one event cannot interrupt another event.
Do you mean if I launched a custom event to do the SQL Server call, and had THAT code waiting for the response, my code would then be immune to user clicks? That also sounds like an interesting way to do it, and not one that would have occurred to me. It might even look cleaner to the user than a pop-up form. I'll try both and see what I like better.

EDIT: If you have two forms open at once, neither of which is bound, it is NOT CLEAR whether you can run processes under both forms. However, if only a single form is active, then an event on that form cannot interrupt another event on the same form, and that is because Access DOES NOT USE the API call that creates what is called a "FORK" process (parallel execution thread).
You've lost me here. I know I can have several such forms open at once, and all can have code in them, but Access is single-threaded (as far as I know), so I can only ever have one active at a time. Maybe I am not understanding what you mean here (entirely possible, I am far from an expert on Access internals), but it seems to me you can have processes under as many forms as you have open, just not executing simultaneously. Or am I missing your point completely?

Note that for the purposes of this discussion, a TRAP handler is NOT event code even though you define it in what appears to be event context. A trap CAN interrupt event code (but does not interrupt another TRAP routine). I include the note only to forestall questions about interrupts.
That's a little over my head. Sorry, but I don't even know what a TRAP routine is, unless I know it by another name and am not aware of that.
 
Is the button on the same form? Can you disable all buttons until the process is completed?
I could, but there are MANY buttons on the form, as well as a bound subform, also with buttons. Constantly disabling and re-enabling everything would be a monumental PIA, as well as probably making the form flicker a lot. I'm trying to avoid that.
 
you can create a sub/function to Disable/Enable all command buttons:
Code:
Public Sub EnableDisableButtonsOnForm(ByRef frm As Form, ByVal Enable As Boolean)
' arnelgp
'
' purpose:
'   Enable/Disable all command buttons on a form and its subforms
'
' note:
' to disable, pass False to the Enable parameter; otherwise pass True
'
    Dim ctl As Control
    For Each ctl In frm.Controls
        If TypeOf ctl Is CommandButton Then
            ' the control that currently has the focus cannot be disabled
            ' (Access raises error 2164), so skip the error rather than stop
            On Error Resume Next
            ctl.Enabled = Enable
            On Error GoTo 0
        ElseIf TypeOf ctl Is SubForm Then
            ' recurse into subforms so their buttons are handled too
            Call EnableDisableButtonsOnForm(ctl.Form, Enable)
        End If
    Next ctl
End Sub

to use:

'disable all buttons in the form and subforms
Call EnableDisableButtonsOnForm(Me, False)
' some query or process here
'
'
'enable them again when done processing
Call EnableDisableButtonsOnForm(Me, True)
'
 
