DB size?

Lock up

mdemarte,

Corruption started about 2 years ago after the install of a management accounting system. The system backend ran on its own server, but the front end install required a heap of Windows patches. The patches may have had nothing to do with it, but we had a corruption storm almost immediately.

We learned to deal with this, and got very quick at fixing the thing (using JetComp.exe) - down to about 2 minutes.

With time, we noticed that corruption would go in cycles: there would be a month of hell (four corruptions a day), then two or three months of easy use.

Eventually, we reckoned a corruption cycle would end once the database got bloated enough that Access was effectively locking individual records anyway. This is why we started restoring from a backup and then piping the missing records in. After that, we would get corruption maybe once every three months or so.
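(Roughly, the "piping in" amounts to appending the rows the restored backup is missing. A minimal sketch of that step - table and path names are made up, and this is not necessarily how it was done here. It links the table from the recovered copy into the restored backup, appends only the missing rows, then drops the link:)

Code:
' Sketch only: assumes a table tblOrders with an OrderID primary key and a
' recovered copy of the corrupted file at C:\Recovered\MainDB.mdb.
' Run from the restored backup.

' Link the table from the recovered copy
DoCmd.TransferDatabase acLink, "Microsoft Access", _
    "C:\Recovered\MainDB.mdb", acTable, "tblOrders", "tblOrders_Recovered"

' Append only the records the restored backup does not already have
CurrentDb.Execute _
    "INSERT INTO tblOrders " & _
    "SELECT r.* FROM tblOrders_Recovered AS r " & _
    "LEFT JOIN tblOrders AS b ON r.OrderID = b.OrderID " & _
    "WHERE b.OrderID IS NULL", dbFailOnError

' Remove the temporary link afterwards (the recovered data itself is untouched)
DoCmd.DeleteObject acTable, "tblOrders_Recovered"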

There are no doubt Access locking issues we haven't addressed. But dodgy network connections, one or two very slow PCs, users who have many applications open (8 + 10 Word documents + 3 Excel workbooks), and Ctrl+Alt+Dels seemed possible culprits too.

Then we had a nasty server crash (air conditioning failure in the server room on a Sunday night). Although everything was up and running first thing Monday morning, the database corrupted almost immediately. This seemed to be a particularly nasty corruption, and JetComp.exe would no longer fix it. We restored a backup, managed after several attempts to rebuild a copy of the corrupted version (on the Citrix server) and piped the missing records in.

Since then we've abandoned JetComp, and take at least one backup (automatically) every day.
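(A minimal sketch of what "at least one backup (automatically) every day" can look like - paths and names are made up, and a copy taken while users are in the file can itself be suspect, so it is best scheduled out of hours:)

Code:
' Date-stamped nightly copy of the back-end, e.g. called from an AutoExec
' macro in a small utility .mdb launched by the Windows Task Scheduler.
Public Sub BackupBackEnd()
    Dim src As String, dst As String
    src = "\\Server\Data\MainDB.mdb"                                   ' back-end (assumed path)
    dst = "\\Server\Backups\MainDB_" & Format(Date, "yyyy-mm-dd") & ".mdb"
    FileCopy src, dst                                                  ' fails if the file is locked exclusively
End Sub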
 
I have no idea how this discussion wound up in this thread, but I do have an idea for the corruption issues. Given the scenario you have, I would create a log table and record when and where users log into the database. Then, maybe, when the db seems to be in a series of corruption cycles, you could isolate a user / workstation? (Just an idea)...
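(To put some flesh on that idea, a minimal sketch with made-up names: a tblLoginLog table with UserName, MachineName and LoggedInAt fields, and a routine called from the start-up form's Open event. When a corruption cycle starts, the timestamps can then be matched against the users and workstations that were in the file at the time:)

Code:
' Record who logged in, from which PC, and when
Public Sub LogLogin()
    CurrentDb.Execute _
        "INSERT INTO tblLoginLog (UserName, MachineName, LoggedInAt) " & _
        "VALUES ('" & Environ("USERNAME") & "', '" & _
        Environ("COMPUTERNAME") & "', Now())", dbFailOnError
End Sub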
 
To sort of go back to the original question - what kind of impact would the size of an Access FE have on its performance?

How important is it to get rid of redundant forms, queries, etc.?


Thanks
 
(Just my humble opinion) I don't think the number of latent objects is that important (as evidenced by Mike375's db). What has crushed one of my dbs was the number of relationships and the complexity of the active form. One form in particular had to work with data in about 20 related tables, and it had numerous pages and subforms, etc. It was (is) a real dog. And there weren't that many records; prob. 8-10k. But another db has a form looking at two related tables with over 250k records, with up to 10 users hitting it at the same time, and it cruises... Go figure...
 
Pauldohert said:
To sort of go back to the original question - what kind of impact would the size of an Access FE have on its performance?

How important is it to get rid of redundant forms, queries, etc.?


Thanks

I will tell you the negatives mine has because of its size, but remember this is Access 95, so your mileage might differ. Actually, I think it is not size but the number of objects that matters.

Firstly, and this is no real big deal: if you attach a macro to, say, the OnClick of a text box or label, and the computer is slow, then it takes a while for the macro drop-down to "appear". The reason is simply that the list in my case has 2,000 macros in it.

In the Expression Builder you can't bring up the pre-packaged functions like IIf, etc. That caused me a bit of inconvenience some years ago. What I used to do was make the macro or query in a new .mdb file. If I was altering one, I would export the query and its table, or the macro. These days, when I make the odd change to the DB, that does not affect me, because just about every function I need has already been made in the DB, so I just open a query or macro in design view, copy one out, and change the field names or other details.
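(One way to do the export step described above is a TransferDatabase call - a rough sketch with made-up object names, assuming C:\Work\Scratch.mdb already exists; tables go the same way with acTable:)

Code:
' Export a query and a macro to a separate "workbench" .mdb for editing there
DoCmd.TransferDatabase acExport, "Microsoft Access", _
    "C:\Work\Scratch.mdb", acQuery, "qryPolicyList", "qryPolicyList"
DoCmd.TransferDatabase acExport, "Microsoft Access", _
    "C:\Work\Scratch.mdb", acMacro, "mcrPrintLetters", "mcrPrintLetters"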

The other problem with the large .mdb file, but probably not an issue these days, is copying it onto Zip disks, etc. I used to have Zips that were 100 MB, and I always had to compact my DB to fit the thing on.

I agree with the points that KenHigg has made in that some things seem to happen without reason.

I don't think performance is altered, except that the database window opens a tad slower. I have several .mdb files that have extracts from my main DB, and there is no performance difference when running on a small .mdb (some are only 2 or 3 MB, with a couple of tables, a few queries and a few macros) as compared to running in the full database.

In my humble opinion :) I think when you are starting out you are better off leaving a lot of the objects that have been discarded. Some of them can be handy a bit later as an "instruction manual". My database went through a virtual remake in 1999, but I still kept the DB that had been developed between 1996 and 1999. On the odd occasion I return to it to see how I did something.

One thing I would advise is to keep making copies of what you are making. On the first copy Access will create a file called "Copy of DBName"; copy it again and you will get "Copy (2) of DBName", and so on. By doing this you can never slip back behind where you were.

Ultimately, I think the size of your DB is determined by your own personality and, of course, its uses. A friend of mine in the insurance business is quite competent with Access and also Excel. He has lots and lots of smaller .mdb files, as in my opinion he is very spreadsheet-minded and a very compartmentalised person.

Mike
 
Mike375 said:
A friend of mine in the insurance business is quite competent with Access and also Excel. He has lots and lots of smaller .mdb files as in my opinion he is very spreadsheet minded and a very compartmentalised person.

In our opinion, you are very spreadsheet minded.



And you have to be taking the piss. ;)
 
Mile-O-Phile said:
In our opinion, you are very spreadsheet minded.



And you have to be taking the piss. ;)

I understand why you say that.

However, while I have many tables, the data does funnel back to very few tables. I consider my database to be like a restaurant that serves all types of food, and you can get all of it at your table. However, the restaurant has a separate kitchen for each of the different categories of meals.

But I was not referring to a spreadsheet mentality in terms of how you make something, but rather lots of .mdbs for lots of different things. Sticking with the restaurant analogy, this friend of mine needs to go to a different restaurant for each category of meal. :D

Mike
 
Mike375 said:
I understand why you say that.

However, while I have many tables, the data does funnel back to very few tables. I consider my database to be like a restaurant that serves all types of food, and you can get all of it at your table. However, the restaurant has a separate kitchen for each of the different categories of meals.


Mike

don't ever become a chef :rolleyes:
where do you eat at anyway?
 
Rich said:
don't ever become a chef :rolleyes:
where do you eat at anyway?

I used to almost live at two restaurants: the Seafarer, which was on Sydney's lower North, and the Blue Swimmer on Sydney's south. "Used to" for two reasons. One reason was that I was very home-bound for a couple of years due to my mother's health, and reason two is that smoking has been banned in restaurants.

Mike
 
Mike375 said:
two is that smoking has been banned in restaurants.

Mike
so you just barbie the food now so the chef still gets ash over everything :rolleyes:
 
