Live tables - how to only update on demand

cprobertson1

Registered User.
Local time
Today, 17:35
Joined
Nov 12, 2014
Messages
36
Good Morning folks!

I'm using Access as the front end for an SQL server (I'm unsure of the underlying configuration/architecture of that server - I just know its address and how to pull data off it)

In this front end I have a monstrous abomination of a query - 18 tables, 30 joins involving 6 subqueries, and a bunch of headaches.

But I finally got that side of things out of the way and... I've hit another problem. While I was testing I was using a static copy of those 18 tables to avoid bogging down the server - now that I've got the query made up, I've hit a bit of a snag.

When I try to run that query using live tables (instead of the imported tables) it takes several hours to complete - and it reduces server performance while it's running (though it's not as bad as it might have been!)


My interim solution/workaround was just to delete the tables and re-import them - except when I do this I lose all my relationships between the tables - which will make developing other queries using this front end more of a hassle than it needs to be.

Is there a way I can have a group of local tables that pull data from the server and save it locally on command? (i.e. an import - but without deleting the relationships).

The only solution I could think of (I haven't tried it out yet!) would be to use SQL to clear the existing tables and then re-import the data (um, I'm not actually sure how to do that in VBA, but it can't be too hard) - but that seems like a cumbersome way of doing things - so before I dedicated myself to that method, I figured I'd better consult the forums!

Many thanks in anticipation!:p

--EDIT--
It may also be possible to create the 18 linked tables in addition to 18 static tables - and just clear the static tables and copy the data from the linked tables into them (or perhaps even a query to only insert new entries - though that might be more hassle than it's worth)
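For what it's worth, that refresh-on-demand idea only needs two action queries per table; because the local table is emptied rather than deleted, its relationships survive. A minimal sketch in Access SQL, with placeholder table names (the real linked/local names will differ):

```sql
-- Run these two in order (e.g. as saved queries fired from a button
-- or macro) for each of the 18 linked/local pairs.
-- "dbo_Orders" (linked) and "Orders_Local" (local) are placeholder names.

-- 1) Empty the local snapshot WITHOUT dropping the table,
--    so the relationships stay intact:
DELETE FROM Orders_Local;

-- 2) Refill it from the linked (live) table:
INSERT INTO Orders_Local
SELECT * FROM dbo_Orders;
```

Since the table object itself is never deleted, Access keeps the relationship diagram exactly as it was.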
 
Last edited:

isladogs

MVP / VIP
Local time
Today, 17:35
Joined
Jan 14, 2017
Messages
18,219
a monstrous abomination of a query - 18 tables, 30 joins involving 6 subqueries, and a bunch of headaches.

That may be some sort of record but not one that you want to retain.

For complex queries based on data stored in SQL Server, you should use a view (or possibly a stored procedure) in SQL Server so the processing is done at source.
This should significantly reduce the load, improving performance and reducing processing time.

Even so, you need to look carefully at what this 'abomination of a query' actually does and redesign it from scratch.
How many of the tables and joins are actually needed to get the required output?
Unnecessary joins and superfluous tables can both significantly reduce performance.

Also, I assume the nature of this query means it is read-only - is that an issue?

EDIT:
Just spotted your edit which you added just after I posted:
By static tables, I assume you mean these have been imported as local tables into Access.
If I understand you correctly, that's a bad idea as the data won't stay up to date.
You should ALWAYS use linked tables from SQL Server - you don't need both.
 
Last edited:

cprobertson1

Registered User.
That may be some sort of record but not one that you want to retain.
There is so much spaghetti in that code that Lady and the Tramp got bored halfway through ;)


For complex queries based on data stored in SQL Server, you should use a view (or possibly a stored procedure) in SQL Server so the processing is done at source.
This should significantly reduce the load, improving performance and reducing processing time.
I'm afraid I have very limited access to the server itself - read-only access to the back-end and that's it I'm afraid!


Even so, you need to look carefully at what this 'abomination of a query' actually does and redesign it from scratch.
How many of the tables and joins are actually needed to get the required output?
Unnecessary joins and superfluous tables can both significantly reduce performance.
Almost all the tables are needed I'm afraid :( - most of them are being pulled to populate various fields. Ultimately it's aggregating work-times for every operation in every work-in-progress job, in every sales order line, in every sales order - factoring in the rates and currency, and grouping that by work centre, work order number, and customer.

The end result is pretty unimpressive actually - but it is pulling data from pretty disparate tables

As you've said though - it's pretty messy - I'm redoing the code as we speak. According to this Venn diagram I've scribbled, I can get it down to 15 or 16 joins, using 16 tables (including 2 subqueries)


Also, I assume the nature of this query means it is read-only - is that an issue?
Oops, that's fine - it's for generating a report for the accounting dept so that they can add it to one of their many spreadsheets - so read-only isn't a problem. But it does need to be accurate when they generate the report.



I've actually just changed a single join from a LEFT to an INNER join and it's sped things up immensely...

I'll get back to you shortly - if doing that in the other subquery speeds that up as well, I might be onto something. As I mentioned, I'm re-writing the query just now - that's the two subqueries finished and running in about a second - if the next stage works as well as these two do, I'll be able to use live tables without any bother.

Back shortly
 

cprobertson1

Registered User.
In general, subqueries are slow even when optimally written.
If you have several, the result will be very slow.

Suggest you read the article by Allen Browne here: http://allenbrowne.com/subquery-02.html

Good luck
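The quoted advice boils down to this: a correlated subquery is re-evaluated for every row, while an equivalent join aggregates once. A generic sketch of the rewrite in Access SQL (table and field names are invented for illustration):

```sql
-- Slow: correlated subquery, evaluated once per Orders row
SELECT o.OrderID,
       (SELECT Sum(d.Hours)
        FROM WorkDetail AS d
        WHERE d.OrderID = o.OrderID) AS TotalHours
FROM Orders AS o;

-- Usually much faster: aggregate once, then join the result
SELECT o.OrderID, t.TotalHours
FROM Orders AS o
INNER JOIN (SELECT OrderID, Sum(Hours) AS TotalHours
            FROM WorkDetail
            GROUP BY OrderID) AS t
       ON o.OrderID = t.OrderID;
```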

Righty-ho - I've managed to get it running off the live tables at a decent speed (15-20 seconds) using "only" 15 tables involving only one subquery and 16 joins.

Sheesh. That was way more trouble than it was worth!

Thank you very much for your help! Very much appreciated :D

Glad to have solved the source of the problem rather than just finding a workaround ;)
 

Minty

AWF VIP
Local time
Today, 17:35
Joined
Jul 26, 2013
Messages
10,371
Now that you have rationalised it - you may well be able to pass the query structure to your SQL Admin person and see if they can create it as a view.

I suspect that once it's done on the server, as Ridders mentioned, it will run much quicker again.
 

isladogs

MVP / VIP
My record for improving code was a lengthy subroutine that originally took 35 minutes to complete. After rewriting each part it took 7 seconds.

Definitely worthwhile!


Sent from my iPhone using Tapatalk
 

isladogs

MVP / VIP
The other thing I suggest is to break your query down into several smaller queries, which should run much faster.

My guess is that the total time to run several small queries in turn will be less than running one humongous query.
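One way to stage it in Access is to save each step as its own query and build the final query on top of the saved ones. A sketch with invented names:

```sql
-- Step 1, saved as its own query (e.g. "qryOpCosts"):
-- aggregate the big detail table down to one row per work order.
SELECT WorkOrderID, Sum(Hours * Rate) AS Cost
FROM dbo_Operations
GROUP BY WorkOrderID;

-- Final query: join the small, pre-aggregated result
-- instead of the raw detail table.
SELECT w.WorkOrderID, c.CustomerName, q.Cost
FROM (dbo_WorkOrders AS w
      INNER JOIN qryOpCosts AS q ON w.WorkOrderID = q.WorkOrderID)
     INNER JOIN dbo_Customers AS c ON w.CustomerID = c.CustomerID;
```

Each intermediate query stays small enough to test on its own, which also makes the overall logic much easier to debug.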
 

cprobertson1

Registered User.
Just rolled it out there for testing a few moments ago - and the query is running in under 5 seconds - excellent.

Just one last thing to do... I need to add a new column that groups the 40-odd different work centre names under larger department names...

There's no such field in any of the tables - so I need to do it manually (while leaving in the original individual work centre names as well!)

Oh, it gets better! I've been given the criteria from another report to use - the query for this replacement is only 133 lines long - tiny! (It's just 10 nested "IF" statements with a huge number of "OR" statements in each one...)...

Just rolled it out there for testing a few moments ago - and the query is running in under 5 seconds - excellent.

Just one last thing to do... I need to add a new column that groups the 40-odd different work centre names under larger department names...

There's no such field in any of the tables - so I need to do it manually (while leaving in the original individual work centre names as well!)

But! Good News! I've been given a 133-line IF statement (okay actually it's 10 nested IF statements and a bunch of "ELSE" and "OR" statements) - that's going to be fun to implement tomorrow!
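For what it's worth, that 10-deep nested IF could instead live in a small local lookup table joined into the query - which also keeps the original work centre names, as required. A sketch with invented names:

```sql
-- tblDeptMap is a small local table (WorkCentre, Department),
-- one row per work centre, maintained by hand instead of in code.
-- qryWorkTimes stands in for the final aggregating query.
SELECT q.*, m.Department
FROM qryWorkTimes AS q
LEFT JOIN tblDeptMap AS m ON q.WorkCentre = m.WorkCentre;
```

The LEFT JOIN keeps any rows whose work centre hasn't been mapped yet, so new work centres show up with a blank Department rather than vanishing.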

Anyhoo - thanks again for all the tips :) Catch you later!
 

Mark_

Longboard on the internet
Local time
Today, 09:35
Joined
Sep 12, 2017
Messages
2,111
Just rolled it out there for testing a few moments ago - and the query is running in under 5 seconds - excellent.

Just one last thing to do... I need to add a new column that groups the 40-odd different work centre names under larger department names...

There's no such field in any of the tables - so I need to do it manually (while leaving in the original individual work centre names as well!)

Oh, it gets better! I've been given the criteria from another report to use - the query for this replacement is only 133 lines long - tiny! (It's just 10 nested "IF" statements with a huge number of "OR" statements in each one...)...

Just rolled it out there for testing a few moments ago - and the query is running in under 5 seconds - excellent.

Just one last thing to do... I need to add a new column that groups the 40-odd different work centre names under larger department names...

There's no such field in any of the tables - so I need to do it manually (while leaving in the original individual work centre names as well!)

But! Good News! I've been given a 133-line IF statement (okay actually it's 10 nested IF statements and a bunch of "ELSE" and "OR" statements) - that's going to be fun to implement tomorrow!

Anyhoo - thanks again for all the tips :) Catch you later!


On behalf of the department of redundancy department, well done! :D
 

cprobertson1

Registered User.
On behalf of the department of redundancy department, well done! :D

Phwhaha - whoops! It's been a looooooong day :cool:

Aw-dear - that's it, no more computers for me ;)

::double checks I've not written that twice::
::confirmed::
 
