DAO Workspace Database Collection

dw85745:
I have a number of questions in this area. I will post them one at a time, based on the answers.

================
Question #1
================
If I run the following code, the two databases are added to the workspace's Databases collection.
NOTE: GetPath is a function that returns the database path.
The ENUMs used in the function identify which database path to return.

Code:
'All three object variables are PUBLIC, but shown here for INFO ONLY
Dim wrk As DAO.Workspace
Dim dbDAO As Database
Dim dbSrc As Database

Set wrk = DBEngine(0)
Set dbDAO = wrk.OpenDatabase(GetPath(PATH_DATA_DB), False)
Set dbSrc = wrk.OpenDatabase(GetPath(PATH_DATA_HISTDB), False)
>>> NOTE: The workspace's Databases collection now contains two databases (indexes 0 and 1).

However, if I change the above code by either:
1) Changing the object variable dbSrc to dbDAO
-OR-
2) Adding a line (Set dbSrc = Nothing) AFTER the database dbSrc is added to the collection

ONLY the first database (i.e. dbDAO above) is included in the collection.

=================================================
Can someone please explain why using the same object variable, OR destroying the second object variable after the collection is created, reduces the collection to just one item?

Thanks
 
The_Doc_Man:
Remember that Access is NOT an Open Source product. What we cannot see is the actual code implemented by the Set verb, nor do we know exactly what is in the object or data structure being referenced. Therefore, this next explanation is part guesswork and part reliance on the similarity of Windows to another operating system that was written by the same person (Dave Cutler) and that behaves an awful lot like Windows.

The "SET <object> = something" establishes a POINTER (in an object variable) to a complex object. It also increments a "usage" counter in the targeted object that gets incremented every time a new pointer is established to that object. When you open a database object as you showed in your code snippet, you load the object variable with a pointer (an address to) that information. Note that in many cases, if you have two object variables, it is possible for BOTH of them to point to the same object. This specifically happens in the case of "Old Outlook" - which can be a bit complicated if that happens. But regardless of the circumstances, you also increment the usage counter for that referenced object. This claim is a BIT of speculation but is based on the published MS VBA Language Specification of Apr. 24, 2014, section 5.4.3.9, "SET Statement". That chapter says VBA explicitly checks for things that have a WITH EVENTS keyword as part of the deal and disassociates from those events when disassociating from another object.

In the normal flow of code, if you do an object-variable.CLOSE, you decrement the usage counter. And as part of its automatic garbage collection, Windows looks for objects with a usage counter of 0. Either Windows or Access dissolves the object's actual structures UNLESS they have potential for a separate life. (Cases in point: Excel or Word application objects live on if not explicitly closed before you reset the object variable. Same for Outlook. Probably true for PowerPoint.)

When you do a "SET <object> = NOTHING" then I believe the SET verb actually does THREE things. First, tests whether the object variable was NOTHING. Second, in "not-NOTHING" cases, attempts to close the actual object. When it is something like a recordset or a database object, that works OK. Closing the object decrements the usage counter whether or not the actual object has its own life. The third thing is to load an address of 0 to the object variable. Pointing to address 0 explicitly IS pointing to NOTHING.

When you do a "SET <object> = something" AND the object already pointed to something else (re-use same variable case), then SET does a different third thing. First thing is the same: It tests for the object variable pointing to NOTHING. Then if it is not NOTHING, SET closes the referenced object (which decrements the use count). Third, SET loads that object with a pointers to something else. It is therefore possible that a SET would dereference an object, clean up after itself, and then reference another object from that object variable.

At first glance, all you see are changes to the contents of the object variable unless you are monitoring the workspace's Databases collection. That collection is where a created database object goes when you open it. It goes there because, of course, that is where dynamic active structures enter program memory. Remember, with Windows, if it ain't in memory, it does nothing and nothing happens to it. But the lifetime of that object depends on when and how it is closed. The collection doesn't keep track of things that get closed; it only cares about things that are open, as tested by their usage counters. And it trusts the usage count to show when complete dereferencing occurs. Stated simply, the database objects populate or vanish from the collection based on having a non-zero use count.

What I just described IS the behavior of OpenVMS, but people forget that Microsoft and Digital Equipment Corporation once had a teaming agreement, which is where WinNT got its paging dynamics algorithm, stack management scheme, and "Wolfpack" clustering. Digital got a copy of Windows that would run on a VAX. Not sure that it was a fair trade.
 
The Doc Man: Thank you for that excellent response.
FWIW, I started out on a CDC 6400 myself, in 1969.
While I recognize a certain amount of speculation is involved, based on my efforts to date it rings true.

I now post the background of my problem (as I perceive it), and my second question.

[Background]
1)
The program I'm dealing with declared a single global (public) object variable: Public dbDAO As Database
That object variable is used to open a single Access database.
That object variable is used throughout the entire program to interact with the Access database (e.g. dbDAO.OpenRecordset).

2)
Because the database has grown extremely large, I decided it would be best to remove all data tables to another database.
This would leave the program tables in one database, and the data tables in another.

3)
I also decided that since the data tables are being moved to a second database, it would be advisable to give the user the ability to select "data" databases, in case the data database tables again grew extremely large.

4)
I opted to use the Access Workspace collection to store the databases, and to give the user the ability to switch between them.
A combobox was created to allow the user to select which database.
Reassigning the object variable (dbDAO) -- if possible -- would allow the current code to be used as-is, with no need to add and pass a database parameter to each procedure that calls into whichever Access database.
Based on my posted code and your response >> Best Laid Plans Don't Always Work!

[Question #2.1]
Since code reuse is the norm, and since the various DAO procedures that deal with databases can apply to any database, is there a way to keep using the current global object variable (here dbDAO) that is used throughout all procedures?

[Question #2.2]
If not, is there any solution, other than modifying every procedure to pass in a database parameter, so that one can access the ACCESS databases (e.g. dbDAO.OpenRecordset)?
 
CJ_London:
You can set multiple db objects; you don't need to reference the workspace collection.

On my phone so not sure of the exact syntax, but something like:

Set db1 = DBEngine.OpenDatabase(db1Path)
Set db2 = DBEngine.OpenDatabase(db2Path)

I decided it would be best to remove all data tables to another database
That is called splitting the database and is standard procedure for any application, so I'm not quite sure what you are trying to achieve.
 
The_Doc_Man:
You can statically link to each database if your split is stable, i.e. you don't change it frequently. There are system limits on how many files you can keep open, based on file handles and other resource limits.

However, there is also the idea of a "side-end" database that is private to each user but isn't initially connected. When you say "program tables" vs. "data tables", that suggests you keep a lot of operational data in tables as well as the originally intended tables representing your application. These program tables can go into something called a "side-end" database, which COULD be as simple as a database file you create each time. Or, better, you make a master copy of a fully defined but empty side-end, which you then copy to the user's PC and open. This would leave your "data tables" in their original back-end file.
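As a sketch of the copy-the-master-side-end idea (paths and folder names are hypothetical, and the target folder is assumed to exist):

Code:
' Hypothetical sketch: copy a fully defined but empty side-end template
' to the user's machine, then open it as a second DAO database.
Sub OpenSideEnd()
    Const TEMPLATE As String = "\\Server\Share\SideEndMaster.accdb"  ' master copy
    Dim localCopy As String
    Dim dbSide As DAO.Database

    localCopy = Environ("LOCALAPPDATA") & "\MyApp\SideEnd.accdb"     ' folder assumed to exist
    If Dir(localCopy) = "" Then
        FileCopy TEMPLATE, localCopy   ' first run: give this user a fresh side-end
    End If
    Set dbSide = DBEngine(0).OpenDatabase(localCopy)
    ' ... use dbSide.OpenRecordset / dbSide.Execute for the program tables ...
End Sub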

The up-side of splitting off tables into separate DB files like this is that you can bypass the 2 GB size limit; each file adds another 2 GB. You face a limit on how many DB files you can open, depending on which version of Access you are using. Check "Access Specifications" for your version to determine those limits.

The down-side of splitting off tables like that is that you cannot enforce referential integrity constraints between tables that reside in different files. The R.I. definitions have no data-source field to identify which file contains the constraining fields. You can easily write queries that work across two data files if it comes to that - but they would not involve anything to do with referential integrity.
 
CJ_London and The_Doc_Man: Thanks for responding.

CJ_London
My post previous to yours (if only posts had numbers to reference) and the original code post reflect that having two database objects will NOT get me anywhere, since all DB procedures are hard-coded to use dbDAO.{whatever}. Unless you know of a way to assign any database to dbDAO, it does NOT solve my problem.

The_Doc_Man
When you say "program tables" vs. "data tables" that suggests that you keep a lot of operational data in tables as well as the originally intended tables representing your application.
This is the case.
Another option I considered was to have the data tables exist in an offline database, and then copy the data tables needed this session to the primary (online) database at startup. This takes away a lot of flexibility for the user, as they need to know what is needed at startup.
The UPSIDE is that only about a dozen tables are needed in any given session, with the rest historical - but they need to be available.
This would eliminate the need for multiple object variables AND avoid modifying a lot of code (adding a DB parameter to procedures).
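A minimal sketch of that startup copy, using a make-table query with Jet SQL's IN clause (table and path names hypothetical; drop the local copy first if it already exists):

Code:
' Hypothetical sketch: import the handful of tables this session needs
' from the offline history database into the current database.
Sub ImportSessionTables()
    Const HIST_PATH As String = "C:\Data\History.accdb"
    ' SELECT INTO creates a local copy of the external table.
    ' (Delete the local table first if it already exists.)
    CurrentDb.Execute _
        "SELECT * INTO tblMSFT_1Min " & _
        "FROM tblMSFT_1Min IN '" & HIST_PATH & "'", dbFailOnError
End Sub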
 
if only posts had numbers to reference

The forum has post numbers. Your last one was #6 (take a look at the right side of your posting).
Mine is #7.
You can also get a link to each posting. The posting you meant is #3, right?
 
sonicB:
dw85745 said: "Unless you know of a way to assign any database to dbDAO, it does NOT solve my problem."
Code:
Public Function dbDAO() As DAO.Database
   Set dbDAO = CurrentDb   ' or whatever database you like
End Function
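Building on sonicB's idea, here is one hypothetical way to flesh this out with a Select Case and an enum (all names are invented, and the old Public dbDAO variable must be removed first so the function can take over the name). Because a no-argument function can be called without parentheses, existing calls like dbDAO.OpenRecordset(...) compile unchanged:

Code:
' Hypothetical sketch: dbDAO() as a drop-in replacement for the old global.
Public Enum WhichDb
    ProgramDb
    HistoryDb
End Enum

Private mActive As WhichDb          ' set from the combobox
Private mProgDb As DAO.Database     ' opened once, then cached
Private mHistDb As DAO.Database

Public Sub SelectDb(ByVal which As WhichDb)
    mActive = which
End Sub

Public Function dbDAO() As DAO.Database
    Select Case mActive
        Case ProgramDb
            If mProgDb Is Nothing Then Set mProgDb = DBEngine(0).OpenDatabase("C:\App\Program.accdb")
            Set dbDAO = mProgDb
        Case HistoryDb
            If mHistDb Is Nothing Then Set mHistDb = DBEngine(0).OpenDatabase("C:\App\History.accdb")
            Set dbDAO = mHistDb
    End Select
End Function

Because the module-level variables keep a reference, the databases stay open (non-zero use count, per the earlier post) rather than being dropped on each call.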

Your whole approach appears rather unusual for a Microsoft Access project. The commonly used approach is to link tables from one or more backend databases into the current frontend database, either at design time or when the application starts. During the normal run time of the application you would then primarily use just the current frontend (CurrentDb) to access data.
 
Pat Hartman:
1. ALWAYS disambiguate your DAO/ADO objects, since there is some overlap in the libraries. So when you Dim a Recordset, use "As DAO.Recordset" or "As ADODB.Recordset" (a Database object exists only in DAO, so "As DAO.Database" is the one to use there). I believe there was only one version of Access that made ADO the default, so you usually only have a problem if you added ADO code to a DAO database or DAO code to an ADO database.
2. As others have mentioned, you don't have a normal situation if you are not using bound forms and linked tables, but still, there is no problem with declaring your database object to be any database you want.

If all your tables are linked, then using currentdb works. Otherwise, you would need a separate object for each separate database.
 
More specifically on Pat's comment on CurrentDB...

CurrentDb always refers to the DB that is open in the MS Access window. As a matter of continuity of control, this is usually the DB that contains the code module actually making the reference. In split DBs, that would be the front-end database that supports the zero or more back-end files (and side-end, if you have any) and ties them together. I don't believe you can ever actually close that FE DB file, because if you did, you would drop back into "naked" MSACCESS.EXE with no app file, which I believe would be forced to exit. If you somehow activated a sub in another DB (which you CAN do), the program call stack would still have a deep-down reference to the original FE, which therefore (per the earlier discussion about usage counts) would still be marked as "in use."

Net result: in a fully split DB, the first DB in the workspace - and the target of the short-cut name CurrentDb - is your FE DB, and any other DB files are either statically or dynamically opened. If you use the appropriate calls to create a linked-table entry in the local FE for an external DB, its name is local but its ".Connect" string points to the external file. In that case you DON'T have to qualify the external table reference, because in essence you have a local shortcut to it in the linked-table entry.
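For reference, creating such a linked-table entry in code looks roughly like this (path and table names hypothetical):

Code:
' Hypothetical sketch: create a local linked-table entry whose .Connect
' string points at a table in an external Access file.
Sub LinkExternalTable()
    Dim dbs As DAO.Database
    Dim tdf As DAO.TableDef

    Set dbs = CurrentDb
    Set tdf = dbs.CreateTableDef("tblHistory")        ' local name of the link
    tdf.Connect = ";DATABASE=C:\Data\History.accdb"   ' external file
    tdf.SourceTableName = "tblHistory"                ' table name in that file
    dbs.TableDefs.Append tdf                          ' the link now exists in the FE
End Sub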

If you want a more complex but fairly flexible solution, there is always the IN clause. Note: this is not the same as using IN as an operator; there are articles that disambiguate the two usages.


Using this, you might not get blazing-fast performance, but you can reach a large number of external files. Eventually, if you overdo it, you might run into file-handle or scratch-pad memory quotas, but it is a valid method for using multiple external files without so much formality in the declarations.
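In VBA, a query using this flavor of IN might look like the following (path and table name hypothetical):

Code:
' Hypothetical sketch: read a table in an external file via Jet SQL's
' IN clause - no linked table needed.
Sub OpenExternalRecordset()
    Dim rs As DAO.Recordset
    Set rs = CurrentDb.OpenRecordset( _
        "SELECT * FROM tblMSFT_Daily IN 'C:\Data\History.accdb' " & _
        "WHERE TradeDate >= #1/1/2020#")
    If Not rs.EOF Then Debug.Print rs!TradeDate
    rs.Close
End Sub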
 
Thanks to all for responding -- Happy Holidays.
For some reason, posted responses got lost in my email client -- I didn't notice them until today.
I had pretty much determined there wasn't a solution.

The_Doc_Man
The link to JET SQL IN states:
Use IN to connect to only one external database at a time.
The way I read this is ONLY one DB open at a time.

=====================
Regarding All other Responses.

1) I declare my GLOBAL object variable as:
Code:
Public dbDAO As DAO.Database   ' (this distinguishes between DAO and ADO)

2) The problem as previously stated: dbDAO is hardcoded in all routines.
e.g. dbDAO.OpenRecordset.

3) sonicB's function (post #8) might work.
I could expand it with a SELECT/CASE, and pass in an ENUM for each DB.
Will have to test, BUT I suspect that when I assign a new DB the original will be dropped.
This would require the function to be called in every procedure, since dbDAO is hardcoded.

4) Changing to CurrentDb might be the answer.
I could do a Find/Replace and change
dbDAO.OpenRecordset to CurrentDb.OpenRecordset,
then just use sonicB's function in KEY places -- when I call a different DB.

5) Reading up on CurrentDb indicates it is 60 times slower.
For me, I believe a better solution is to copy over the historical tables I need to the current DB when needed.
This will most likely occur just once, at startup.
That way I can have the historical depository of data, but keep the speed.

 
The way I read this is ONLY one DB open at a time.

Yes and no. What you REALLY can't do is create a single table that spans multiple files using native Access databases. The size limit for an Ac365 DB is 2 GB, minus some overhead structures to define fields and some database overhead in the form of system object tables. Some of the "big-boy" databases - ORACLE as one example - CAN distribute a table across files (or even disks). Access was ALWAYS intended as a small-shop database, and as such was not meant for huge volumes of data.

The IN clause mentioned earlier appears in a FROM context identifying a table and, because of the no-span rule, could not be used to name a sequence of files. Therefore ... one file at a time for a table. But I honestly don't know if something like a UNION query could draw from multiple files, one per SELECT sub-clause, presuming the implied tables to be structurally compatible.
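The idea would look something like this - offered strictly as an untested sketch with made-up paths, on the premise that each SELECT in a UNION can carry its own IN clause:

Code:
' Untested sketch: each SELECT names its own external file via IN.
Sub OpenUnionAcrossFiles()
    Dim rs As DAO.Recordset
    Set rs = CurrentDb.OpenRecordset( _
        "SELECT * FROM tblTicks IN 'C:\Data\Hist2023.accdb' " & _
        "UNION ALL SELECT * FROM tblTicks IN 'C:\Data\Hist2024.accdb'")
    Debug.Print rs.Fields.Count   ' columns come from the first SELECT
    rs.Close
End Sub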

Also, the FROM xxx IN ... method doesn't require much work ahead of time; you just have to know (a) that the external file holds a compatible table and (b) the path leading to that file.

The bigger problem I would see is that Windows garbage collection has its limits. Using the IN clause implicitly opens and later closes a file, which consumes a file handle at least temporarily. From past member questions, I have some doubt about how long a single instance could run. At some point you might get an "Insufficient Resources" message because you ran out of file handles or something similar, because Windows doesn't recycle the handles very well when using behind-the-scenes automation.
 
MarkK:
Do this in the Immediate window...
Code:
? CurrentDb Is CurrentDb
 False
CurrentDb returns a different instance on each reference. This is an advantage because a DAO.Database maintains state. If ConsumerA runs a query, the Database.RecordsAffected property returns how many rows were affected. If you maintain a global instance, ConsumerB may check this property and, without error, get a result that was actually generated by a different consumer of the instance. That is a disadvantage. Similarly, if ConsumerB runs a "SELECT @@Identity" query, the result may be an ID generated by a different process in a different table.

I ran some timings, and on my machine, a call to CurrentDb takes 0.000146875 sec.

Also, if you run multiple action queries, creating or editing multiple rows, you can dramatically improve performance by nesting those within a transaction and committing them all at once; but if you do DBEngine.BeginTrans, a global instance that was created outside the transaction never affords you this advantage.
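A minimal sketch of that transaction pattern, assuming the default workspace and a hypothetical table name:

Code:
' Hypothetical sketch: batch many action queries inside one transaction.
Sub BatchedInserts()
    Dim wrk As DAO.Workspace
    Dim dbs As DAO.Database
    Dim i As Long

    Set wrk = DBEngine(0)
    wrk.BeginTrans
    Set dbs = CurrentDb               ' fresh instance, obtained inside the transaction
    For i = 1 To 10000
        dbs.Execute "INSERT INTO tblLog (Msg) VALUES ('row " & i & "')", dbFailOnError
    Next
    wrk.CommitTrans                   ' one commit instead of 10000
End Sub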

I would link BE tables to the FE, and these can be linked from multiple different BEs, and then replace your global database instance with calls to CurrentDb.

Putting an IN clause in your SQL is the slowest possible option because it needs to resolve the path to the resource in advance of any execution.

Here's timings code...
Code:
Private Sub TestCurrentDbSpeed()
    Const MAX As Long = 2000
    Dim dbs As DAO.Database
    Dim i As Long
    Dim clock As Single
    
    clock = Timer
    For i = 1 To MAX
        Set dbs = CurrentDb
    Next
    clock = Timer - clock
    
    Debug.Print "One call to CurrentDb takes"; clock / MAX
End Sub
 
Since you seem to not be using a normal database where the tables are linked and the forms are bound, perhaps you can give us some insight into what/why so we can figure out what problem we are trying to solve.
2)
Because the database has grown extremely large, I decided it would be best to remove all data tables to another database.
This would leave the program tables in one database, and the data tables in another.

3)
I also decided that since the data tables are being moved to a second database, it would be advisable to give the user the ability to select "data" databases, in case the data database tables again grew extremely large.
Based on these, I think you
1. need to link all the tables to the new database you created.
2. should give the user a form where he can choose a different data database, and your code will simply relink all the linked tables.
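Relinking in code is a short loop over the TableDefs; a sketch, assuming all linked tables should point at the one new BE file:

Code:
' Hypothetical sketch: point every linked table at a different BE file.
Sub RelinkTables(ByVal newBackEnd As String)
    Dim dbs As DAO.Database
    Dim tdf As DAO.TableDef

    Set dbs = CurrentDb
    For Each tdf In dbs.TableDefs
        If Len(tdf.Connect) > 0 Then              ' only linked tables have a Connect string
            tdf.Connect = ";DATABASE=" & newBackEnd
            tdf.RefreshLink                       ' re-establish the link
        End If
    Next
End Sub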

Typically, the user doesn't just randomly relink to a different BE, so this is not an option I would make visible to all users. YOU, as the developer, would normally be in control of relinking. Only if, for some technical reason, the BE needed to be moved from serverA to serverB, or the folder structure renamed, would the BE EVER have to be relinked; letting users have access to a feature like this is very dangerous.

Also, once you separate the tables from the original monolithic database, you need to establish proper distribution of the FE so that only the BE database is shared. EACH user must have his own copy of the FE. The master copy is stored on the server and distributed either by using a batch file executed by a shortcut or a special distribution database. Both can be found here if you need them.
 
GPGeorge:
dw85745 said:
(background post and Questions #2.1 / #2.2 quoted in full - see above)
Have you considered migrating your data to SQL Server? The free version, SQL Server Express, can hold 10 GB of data per database (it was still 10 GB the last time I checked).

It is more robust than an ACE back end, and offers more options for processing on the back end, e.g. Stored Procedures.

Is there some impediment in your environment that prevents consideration of a server-based back end database?
 
Pat Hartman:
I agree with George that if you really have a large volume of data then SQL Server may be best, but baby steps. First you need to stand before you can walk.
step 1 = separate the data into its own database and link the tables
step 2 = create a proper distribution method for the FE so that the users each have their own personal copy of the FE
step 3 = create a modicum of security to protect the FE. No need to go hog wild; just the obvious stuff to keep the user from accidentally getting into design view on anything.

The decision to move to SQL Server is predicated on the actual size of the data to be stored and/or the number of concurrent users. Once you move the data to SQL Server, you are freed from many of the constraints and complaints regarding "Access". Now your data limits are what SQL Server will support, rather than 2 GB per database. Your concurrent user count is limited by the number of seat licenses your SQL Server supports, rather than Access's 255 technical (closer to 50 practical) users. Using SQL Server, you could support thousands of concurrent users and gigs of data.

Access is a Rapid Application Development (RAD) tool. You can switch BEs from Jet to ACE to SQL Server to Oracle, to DB2, to Sybase, etc., by simply relinking. THAT is huge when you think of the awkwardness of how it is done with other platforms. Of course the first hurdle is Jet/ACE to RDBMS, but once you make that first break, the rest comes easy. Most applications will require at least minimal modifications to processing and DAO code when making the move. Most of my apps either WILL be converted or may be converted, so I always build my apps with an RDBMS in mind; if/when I need to upsize, all I have to do is convert the data and then test thoroughly. The required RDBMS arguments in your DAO code work fine when the BE is Jet/ACE, and the other efficiency techniques do also, so you may as well code to your end database's specs.
 
Thanks to everyone for their responses
------------------------------------------------------------------
Knowing What I'm Doing Will Most Likely Help Responses
-----------------------------------------------------------------
The App is a real-time trading program, for ALL markets.
It was started over 30 years ago and has grown, to say the least.
There are three default tables for each symbol (TICK, 1 Minute, and Daily).
Each of these default tables is built in real time.
Three "default" table periods were chosen to facilitate easy roll-up, so the user can select any period they desire.
(E.g. if the user selects 15 ticks, then a new 15-TICK table is created from the default TICKS.
Conversely, if the user selects over 1 minute (e.g. 15 minutes), a new table of that period is created from the default 1-minute data.
Lastly, if the user selects anything over 1 day, a new table is created using the daily data.)
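As an illustration of that roll-up, a 15-minute table could be built from the default 1-minute bars with a grouped make-table query along these lines (all table and field names hypothetical; note that Jet's First/Last aggregates follow storage order, so this assumes the 1-minute table is ordered by time):

Code:
' Hypothetical sketch: roll default 1-minute bars up into 15-minute bars.
Sub Build15MinTable()
    CurrentDb.Execute _
        "SELECT Int(DateDiff('n', #1/1/2000#, BarTime) / 15) AS Bucket, " & _
        "       Min(BarTime) AS BarStart, " & _
        "       First(OpenPx) AS OpenPx, Max(HighPx) AS HighPx, " & _
        "       Min(LowPx) AS LowPx, Last(ClosePx) AS ClosePx, " & _
        "       Sum(Vol) AS Vol " & _
        "INTO tblMSFT_15Min FROM tblMSFT_1Min " & _
        "GROUP BY Int(DateDiff('n', #1/1/2000#, BarTime) / 15)", dbFailOnError
End Sub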

So depending on how many symbols each user is monitoring, and the number of periods being monitored, a lot of data is being collected just in the default tables. If one goes back 10, 20, 30 or 100 years, the tables get really large.

The above is just data tables.
There are a number of other tables used for tracking, charting, etc.

SO THERE IS NO CONFUSION: Both the ACCESS FE and BE are just tables and queries.
All forms, reports, and anything else are handled outside of Access, by creating/updating ACCESS tables or by executing one of the queries.

I finished moving my data tables to another Access DB, but the limit is on the horizon.

-----------------------------------------------------------------
I'll respond to the last posts first, by Pat Hartman and GPGeorge
-----------------------------------------------------------------
SQL Server (and/or Express) is an option.

FWIW, I prefer to do test beds in Access since I'm most familiar with it; once the scheme is accepted, porting to another DB should, IMHO, be easier, especially if I'm learning that new DB "on the fly".

---------------
Mark K
--------------
I didn't personally do any performance test of CurrentDb. My "60 times slower", if I recall correctly, came from a Microsoft website.

---------------
The_Doc_Man
--------------
General consensus seems to be "IN" will be slow.
 
General consensus seems to be "IN" will be slow.

Since it has to make the file connection through standard Windows file-locking methods, then establish a connection to the database structures from somewhere inside the file's structure, then identify the fields in the SELECT clause - all dynamically - and then gracefully back itself out of that set of connections and locks, did you EXPECT blazing speed? THAT is why you go through the trouble of linking and connecting ahead of time, and why ANY type of dynamic connection - whether done all at once via IN or done piecemeal via run-time linking - will be slower than any method using permanent/static connections.
 
The_Doc_Man:

So now I have a decision to make: stay with ACCESS or go with a new DB.

did you EXPECT blazing speed?
One hopes for that. I always try to get things working first, then optimize for speed.

---------------------- FWIW-----------------------
Data files originally started out as flat files.
When the "Revelation" DB came out, I moved to that.
"Revelation" disappeared; DB3, I think, bought it, and PARADOX came online around the same time.
So I moved everything to PARADOX, and then ported over to ACCESS after that.
One thing I've always disliked about ACCESS -- in my situation -- is that I need to build a separate query for each symbol.
ACCESS won't allow variables in "stored" queries, for whatever reason. Just remembered this issue.
Going to a new DB may solve that issue.
--------------- FWIW End ----------------------------------
 
