Quick Question: What's Faster - Requerying data or iterating through a query

Swatkins

Registered User.
Local time
Today, 13:07
Joined
Oct 19, 2012
Messages
42
I feel like I've seen this addressed here before, but couldn't find it. I'm curious what the experts here think. (This may be a VBA question rather than a query question, but since it deals with queries I thought I'd put it here first.)

Before I spend a lot of time designing a suboptimal solution, I'm hoping to focus my effort on the best method. So I'm wondering which processes faster: requerying a table multiple times, each time with a narrow set of criteria, and processing the data between queries; or querying once with a more general set of criteria and iterating through the resulting recordset and processing that way.
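In code terms, the two approaches might be sketched like this. This is a minimal illustration in Python with SQLite standing in for the backend; the `jobs` table, its columns, and the region values are all invented for the example. Both strategies produce the same totals; the question is which path the engine executes faster:

```python
import sqlite3

# Illustrative stand-in data (table/column names are made up for the sketch).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE jobs (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
con.executemany("INSERT INTO jobs (region, amount) VALUES (?, ?)",
                [("east", 10.0), ("west", 20.0), ("east", 5.0), ("north", 7.5)])

# Strategy A: requery per criterion -- one narrow query for each region,
# letting the engine do the filtering and summing.
totals_a = {}
for region in ("east", "west", "north"):
    (total,) = con.execute(
        "SELECT COALESCE(SUM(amount), 0) FROM jobs WHERE region = ?", (region,)
    ).fetchone()
    totals_a[region] = total

# Strategy B: one broad query, then iterate the recordset and bucket in code.
totals_b = {}
for region, amount in con.execute("SELECT region, amount FROM jobs"):
    totals_b[region] = totals_b.get(region, 0) + amount

print(totals_a)
print(totals_b)
```

Both dictionaries come out identical; which strategy wins on time depends on indexes, row counts, and how much work the per-row processing does.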

Many thanks!
 

spikepl

Eledittingent Beliped
Local time
Today, 22:07
Joined
Nov 3, 2010
Messages
6,142
The query optimizer would theoretically win hands down over a sloppy programmer. (But a sloppy programmer can make queries run slowly too.)

If you are a sloppy programmer, then let SQL handle as much as you can. If not, then what's best for your mix will depend on the specifics. If execution speed is a real concern, then test it. If this is merely a theoretical issue, i.e. whether it runs once a week in 0.0001 or 0.0002 ms, then speed is not a criterion, so do whatever is easiest for you to code.
 

Swatkins

Registered User.
Local time
Today, 13:07
Joined
Oct 19, 2012
Messages
42
Whether I'm a sloppy programmer or not, I can't say for sure. But I expect it would be better to assume I am, since if I don't know, the chances that I am are probably pretty good...

Thanks for the response.
 

jdraw

Super Moderator
Staff member
Local time
Today, 16:07
Joined
Jan 23, 2006
Messages
15,396
I agree with spikepl and would further qualify any response with:
- the query optimizer will make use of indexes if indexes exist
- the query may read your data multiple times and not use indexes if you have a function that must be evaluated for each and every record.

Also, as spikepl mentioned --- try it and then tell us what you found along with the conditions involved.
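The second point above is easy to see with a query planner. Here is a minimal sketch, again with SQLite standing in for the Access/Jet engine (the `orders` table and `ix_customer` index are invented for the example): a plain comparison against an indexed column lets the optimizer seek the index, while wrapping the same column in a function forces the engine to evaluate it per record and fall back to a full scan.

```python
import sqlite3

# Stand-in schema for the illustration (names are hypothetical).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT)")
con.execute("CREATE INDEX ix_customer ON orders(customer)")

def plan(sql):
    # EXPLAIN QUERY PLAN rows carry the human-readable detail in column 3.
    return " ".join(row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

# Plain comparison on the indexed column: the planner can seek the index.
print(plan("SELECT * FROM orders WHERE customer = 'ACME'"))

# Function wrapped around the column: evaluated per record, so a full scan.
print(plan("SELECT * FROM orders WHERE UPPER(customer) = 'ACME'"))
```

The first plan reports an index search; the second reports a table scan, which is exactly the "reads your data and ignores the index" behavior described above.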
 

namliam

The Mailman - AWF VIP
Local time
Today, 22:07
Joined
Aug 11, 2003
Messages
11,695
IMHO it depends not so much on what is best to do, but rather on:
- What are you trying to do
- What can you do
- What are your constraints? Is a user sitting there waiting for results, or will the user go do something else meanwhile? Are there competing/comparable processes which set "expectations"?

For example, will it really matter if your query/process runs for 1 minute, 1:10, or 50 seconds?
Or even 5 minutes vs. 5:20 vs. 4:30? A lot will depend on the above three questions... As long as your process doesn't crash and burn, even having an automated process work for an hour can usually be completely acceptable. Again, it will depend on your constraints.

And the question of "requery vs. iterate" is WAY too general. Are we talking about 10 million records or 1,000 records?
A big, efficient, strong Oracle backend, or a slow, sluggish, ugly FoxPro solution?
Is your average query responsive, or does it take 20 seconds for the query to run?
Is the source a production system that is already strapped for resources?
So many questions to ask, so many parameters to consider...

Key things:
Make it work
KISS
Make it maintainable (write comments, indent, and document if needed)
KISS
Remember that someone other than you may have to look at and change your solution in the future.
And of course it actually HAS to work. And KISS.
 

Galaxiom

Super Moderator
Staff member
Local time
Tomorrow, 06:07
Joined
Jan 20, 2009
Messages
12,859
In my experience, multiple querying with narrow criteria is orders of magnitude faster than opening a larger recordset and using Find or Filter to get to the records required.
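For what it's worth, this effect is easy to reproduce outside Access too. A rough sketch with SQLite standing in for the backend (the table, row counts, and values are illustrative only, and absolute timings vary by machine): a narrow, indexed query lets the engine seek straight to the matching rows, while a broad recordset forces every row across into the client-side loop before it can be filtered.

```python
import sqlite3
import time

# Stand-in table with an index on the filter column (names are hypothetical).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, grp INTEGER, val REAL)")
con.executemany("INSERT INTO t (grp, val) VALUES (?, ?)",
                [(i % 100, float(i)) for i in range(50_000)])
con.execute("CREATE INDEX ix_grp ON t(grp)")

# Narrow, indexed query: the engine seeks directly to the matching rows.
t0 = time.perf_counter()
narrow = con.execute("SELECT COUNT(*) FROM t WHERE grp = 42").fetchone()[0]
t_narrow = time.perf_counter() - t0

# Broad recordset filtered in code: every row crosses into the client loop.
t0 = time.perf_counter()
broad = sum(1 for (g,) in con.execute("SELECT grp FROM t") if g == 42)
t_broad = time.perf_counter() - t0

print(narrow, broad, f"narrow={t_narrow:.4f}s broad={t_broad:.4f}s")
```

Both counts agree; on most machines the indexed query finishes well ahead of the client-side loop, though the size of the gap depends on row count, indexes, and driver overhead.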
 

Swatkins

Registered User.
Local time
Today, 13:07
Joined
Oct 19, 2012
Messages
42
That's a nice list of caveats.

Some of the answers:

This is for a system I'm building from the ground up. Currently, the only records that exist are example records that I've created for the purposes of testing code and queries. So I'm working with a fairly small set of data.

When and if this project goes live, I expect there will be a great deal more rows of data; my best guess is that it will expand into many thousands of rows fairly quickly. However, I'd probably only be working with a few hundred records at a time, at most, whenever I'm processing data.

Because this is home-grown and I don't have immediate access to (or the skills to wrangle) some heavy-hitting DB backends, this is all going to be in Access.

A small handful of coworkers and I will be the only ones using the system, but none of us have especially powerful machines for doing large-scale data processing. (The story of my career to date.) The system I'm building is intended to replace a highly manual and Excel-heavy workflow that relies on Excel models with dozens of worksheets and countless lookups and other calculations - which can take upwards of ten to twenty minutes, sometimes more, to fully calculate. This is the reason for my interest in a time-efficient solution.
 

namliam

The Mailman - AWF VIP
Local time
Today, 22:07
Joined
Aug 11, 2003
Messages
11,695
A lot of calculations, though, shouldn't be done in iterated processes... Some VBA may be needed/useful, but it should be triggered on a case-by-case basis in queries when the need arises, unless the calculations are indeed so complex that doing them on the fly with a user waiting would result in 10-minute wait times... which, if you build your DB (even semi-)properly, should never be the case.
 
