Basic Advice Needed

Space Cowboy

I have a huge query and I do not want to run it every time I want to look at it or call on it. If it is saved as a table, will the data be permanent and thus eliminate the need to update it every day? I may only update it once or twice a week or so. What's the best way to view and call this data?
 
Where is the query getting the data from? How many tables are involved?
 
As long as you are happy and prepared to use slightly out-of-date data, running a query once and saving the results is a fair compromise.
 
Set up a separate database for this. Link all the tables it needs into this database, then make another table called SystemInfo that has one Date field called 'LastUpdate'. Bring in your MAKE TABLE query that builds your table, and also make an UPDATE query that sets SystemInfo to the current date. Then when you want to update your table, run both those queries. Have your data table live in that database, and link to it from any other database that needs it.
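As a minimal VBA sketch, running both queries could look like this (the names qryBuildData and qryStampLastUpdate are placeholders for whatever you call your make-table and update queries):

Code:
Public Sub RebuildReportTable()
    Dim db As DAO.Database
    Set db = CurrentDb
    ' Rebuild the saved table from the big query (the MAKE TABLE query)
    db.Execute "qryBuildData", dbFailOnError
    ' Stamp the refresh date, e.g. UPDATE SystemInfo SET LastUpdate = Now();
    db.Execute "qryStampLastUpdate", dbFailOnError
    Set db = Nothing
End Sub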

That's the database/technical side of things. The real-world human side is to set up a schedule and stick to it: twice a week, once a week, it doesn't matter, as long as it's a process. In fact, you can even automate this with a macro and Windows Task Scheduler so that it automatically rebuilds your table every night at midnight.
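If you go the Task Scheduler route, the scheduled action can be a plain command line, something like the line below (the paths and macro name are only examples; the /x switch tells Access to run the named macro on startup, and that macro would run the two queries and quit):

Code:
"C:\Program Files\Microsoft Office\root\Office16\MSACCESS.EXE" "C:\Data\ReportBuilder.accdb" /x mcrRebuildTable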
 
If it is huge, you may want to create your temp table in an external database, in what is referred to as a "side-end" database. This way you do not have to worry about bloat.

@Galaxiom may be able to provide some code; I do not have an example. Basically you are creating a fresh external table each time and do not have to worry about compacting when you update. It is a slight modification of what @plog is suggesting: instead of updating the table, you are disposing of it.
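Not @Galaxiom's actual code, but the disposal idea in rough form would be something like this (the path, table, and query names are invented):

Code:
Public Sub RebuildSideDb()
    Const SIDE_PATH As String = "C:\Data\SideEnd.accdb"   ' hypothetical location
    ' Throw the old file away instead of deleting rows - no bloat, no C&R.
    ' Make sure nothing has the file open (e.g. linked tables) before this runs.
    If Len(Dir(SIDE_PATH)) > 0 Then Kill SIDE_PATH
    DBEngine.CreateDatabase SIDE_PATH, dbLangGeneral
    ' Access SQL can make the table directly in the external file via IN 'path'
    CurrentDb.Execute "SELECT * INTO BigData IN '" & SIDE_PATH & _
        "' FROM qryHugeQuery", dbFailOnError
End Sub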
 
Maybe the download here still works.
 
I actually had @theDBguy's code bookmarked, but forgot to check. Here is some more conversation on this issue.
The @theDBguy solution is the better one: it creates and deletes the whole new db, not just the table.
 
Where is the query getting the data from? How many tables are involved?
I am getting the data from a DB that has around 30 tables of between 10 and 50 fields each; some tables have 10,000 rows, some 200,000+, some 2 million.
I found that I am using 6 or 7 tables regularly, and at most 10-12 fields from each of those, to answer my queries.
I cannot slim down the tables to just the fields I need. I suppose I could query just those fields and then use the results as a separate set of base tables.
Does that make sense or is it gibberish?
Is there even a benefit?
Why would setting a new DB be better than incorporating new tables into the current DB?
Apologies if these are daft questions.
I have had to slim the query down to 35,000 lines or it will not run. Is it possible I am near the size limit?
 
Your issue with this query is not answerable without knowing a little bit more. When you generate the result set of this query, how many fields are in it, and how big is one record in bytes? I can make educated guesses on that.

I suspect your 35K-line limit is because of a query limit. In this article, MS says the recordset size that can be developed by a query is 1 GB.


If you have 35,000 rows x 12 fields x 7 tables, that is just under 3 million fields. Your maximum record size could easily exceed 4K/record if you had lots of 255-byte fields, but a query that did that would never run, so I have to assume it is smaller. But at 35,000 rows x 84 fields (derived from 7 tables) x 300 bytes per field (not that difficult for text fields), you would be approaching the 1 GB recordset limit.

Why would setting a new DB be better than incorporating new tables into the current DB?

Because of something called "bloat", you DO NOT want to continually create and delete records. This bloat (the database swelling in size) happens because of a holdover problem with "garbage collection": Access, unlike some of the "big boy" databases, does not clean up on the fly. You have to do something called a "compact & repair" (C&R) to recover abandoned space. In fact, you didn't mention it, but I would lay odds that you are familiar with that activity: your file grows in size even after you delete a bunch of stuff.

The "side-end" database is a file that you can create fully configured but empty. Put it aside in a staging area. Then, perhaps in a batch script beforehand, you can delete it and create a new copy of it from the copy in the staging area. Since Access links by file name, not by any kind of internal file identifier, the FE will never know that you replaced the SE database before launching the app.
 
Why would setting a new DB be better than incorporating new tables into the current DB?
I am assuming you did not read the links. If you are adding to and deleting from your DB, or even an external db, you can seriously bloat the db. That is a lot of data to start with. Let's say it is 80 MB. Do a delete and insert several times and you may quickly be at 2 GB. You then have to write code to compact and repair, or ensure you are doing it yourself.

IS there even a benefit?
Conceptually, yes. Without knowing your specifics it is hard to tell. But modern large applications often "data warehouse": they offload large amounts of data into non-normalized, pre-processed tables that are very fast to report against. This way you are able to report without hitting the live application and bogging it down with expensive queries.
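As a loose illustration only (tables and fields invented), a warehouse-style table is just the joined, flattened result saved once, so that reports never have to touch the live tables:

Code:
SELECT o.OrderID, o.OrderDate, c.CustomerName, p.ProductName, o.Qty
INTO rptOrdersFlat
FROM (Orders AS o INNER JOIN Customers AS c ON o.CustomerID = c.CustomerID)
INNER JOIN Products AS p ON o.ProductID = p.ProductID;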
 
Thank You ALL for your replies and considered advice.
There is lots and lots of new terminology there, for me, which I do not yet fully understand.
I will need some time to re-read and attempt to digest all the information that you have kindly provided.
I am not being ignorant if I do not reply for a while (probably with more questions, LOL), so please bear with me.
 
Here is a sample database that includes the code you need to manage the copy process. When using a "side" DB, the best method is to run the code that copies the template and loads it at the beginning of the process. I never delete the copy at the end; I leave it. This makes it far easier to test, and in your particular situation you do want the data to hang around for a few days. I agree with the suggestion to date-stamp the data. The example I posted doesn't include that feature; I may add it at some point because it is a good idea. For now, you need to do it yourself if you want that functionality.

 
