How to Upload a Stripped DB.

pr2-eugin

Super Moderator
Local time
Today, 14:39
Joined
Nov 30, 2011
Messages
8,494
Hello there,

I have seen this many times across the Forum: when some of us ask users to upload their DB, the responses we get are "The information is sensitive", "The file is too big", "There are over 100,000 records"...

Well, when I started using this Forum, I did not know how to upload a stripped DB, so here is the information on how to create one.

To create a sample DB (to be uploaded for other users to examine), please follow these steps:
1. Create a backup of the file before you proceed.
2. Delete all Forms/Queries/Reports that are not in question (except the ones that are inter-related).
3. Delete auxiliary tables (those hanging loose, with no relationships).
4. If your table has 100,000 records, delete 99,990 of them.
5. Replace sensitive information such as telephone numbers and email addresses using simple UPDATE queries.
6. Perform a 'Compact & Repair'; this should bring the size down to a measly few KB.
7. ZIP the file (required if your post count is less than 10) and upload it.
Finally, please include instructions on which Form/Query/Code we need to look at. The preferred Access versions are A2003-A2007 (.mdb files).
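
The scrub-and-replace idea in step 5 can be sketched with a couple of UPDATE statements. The sketch below uses Python's sqlite3 as a runnable stand-in for Access SQL, and the table and field names (Customers, Phone, Email) are invented for illustration; in Access you would run the equivalent UPDATE queries from the query designer instead.

```python
import sqlite3

# In-memory database standing in for the copy of your Access back end.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE Customers (ID INTEGER PRIMARY KEY, Phone TEXT, Email TEXT)")
cur.executemany("INSERT INTO Customers (Phone, Email) VALUES (?, ?)",
                [("01234 567890", "alice@example.com"),
                 ("09876 543210", "bob@example.com")])

# Overwrite sensitive fields with harmless placeholders keyed to the PK,
# so rows stay distinguishable but nothing personal survives.
cur.execute("UPDATE Customers SET Phone = '0000 000000'")
cur.execute("UPDATE Customers SET Email = 'user' || ID || '@example.com'")
conn.commit()

print(cur.execute("SELECT Phone, Email FROM Customers ORDER BY ID").fetchall())
```

Keying the placeholder to the primary key (rather than one constant value) keeps rows distinguishable, which matters if the problem you are demonstrating involves joins or duplicates.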
 

George-Bowyer

Registered User.
Local time
Today, 14:39
Joined
Dec 21, 2012
Messages
177
I wish it was that easy... :(

My DB is relatively small, only a few thousand records.

However, it is all about relationships. It stores information about people, organisations, positions and regions.

People can have positions in regions, organisations, clubs, sub-regions; clubs can have positions in associations, sub associations, regions etc, etc, etc. Some people don't have any positions, some clubs don't have any members - it's a real spider's web.

It's easy to say "If your db has 5000 records, just delete 4,990 of them", but working out which 10 to leave that are going to have the right relationships to demonstrate not only what the problem is, but also how it should work properly - that is nowhere near so simple...
 

isladogs

MVP / VIP
Local time
Today, 14:39
Joined
Jan 14, 2017
Messages
18,186
In that case, remove all the current records and replace them with a small number of dummy records, sufficient to illustrate whatever problem you may have.
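
This delete-everything-then-seed approach can be sketched as follows, again using sqlite3 as a stand-in for Access; the Clubs/Officers tables and their foreign key are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("CREATE TABLE Clubs (ClubID INTEGER PRIMARY KEY, ClubName TEXT)")
conn.execute("""CREATE TABLE Officers (
    OfficerID INTEGER PRIMARY KEY,
    ClubID INTEGER REFERENCES Clubs(ClubID),
    Role TEXT)""")

# Pretend these were the real, sensitive rows...
conn.execute("INSERT INTO Clubs VALUES (1, 'Real Club Name')")
conn.execute("INSERT INTO Officers VALUES (1, 1, 'Chair')")

# ...delete everything, children first so referential integrity holds...
conn.execute("DELETE FROM Officers")
conn.execute("DELETE FROM Clubs")

# ...then insert just enough dummy data to exercise the same relationships.
conn.execute("INSERT INTO Clubs VALUES (1, 'Dummy Club A')")
conn.executemany("INSERT INTO Officers (ClubID, Role) VALUES (?, ?)",
                 [(1, 'Chair'), (1, 'Treasurer')])
conn.commit()

print(conn.execute("""SELECT ClubName, Role FROM Clubs
                      JOIN Officers USING (ClubID)
                      ORDER BY OfficerID""").fetchall())
```

The point is that the dummy rows reproduce the *shape* of the data (one club, several officers), not its content.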
 

George-Bowyer

Registered User.
Local time
Today, 14:39
Joined
Dec 21, 2012
Messages
177
Hmm. This could turn into something of a circular discussion. I can only go back to "it really isn't always that simple" (although, of course, it may well be to Access experts, but that's slightly different…)

In a situation where I have an incredibly complex series of relationships which somehow interact to cause an error, in such a way that I don't know which part of the process is generating it, working out which real records to leave in is complicated enough; generating dummy data that will replicate the problem is even more so.

I will give a recent example: my db produces a report that goes to printers to be made into an annual hard-copy directory of clubs and officers.

For some reason, the report was excluding a very small and seemingly completely random number of officers (from a list of around 300 clubs with around 5-10 officers in each). They were there on the forms, but missing in the report.

Submitting a database for examination with only 10, or even 100, records might not trigger the fault in that report at all.

Likewise, replacing the records with scrambled text would make proofreading the report to find any errors insanely difficult.

I was genuinely not trying to be difficult, or a smart-alec, with my response above; I was just saying that it's not always that simple.

Regards,

George

(Incidentally, in my example above, it was a clash of two formatting processes in the report which was causing officers to be omitted whose email addresses were exactly a certain number of characters long. As I had no idea that that was the cause of the problem (and only discovered it by luck), replicating it with dummy data would have been almost impossible.)
 

gemma-the-husky

Super Moderator
Staff member
Local time
Today, 14:39
Joined
Sep 12, 2006
Messages
15,613
What you REALLY want is a split database.
Then you have the real data in one back end, and a stripped down set of test data in another, and you can send the test data without worrying about size or confidentiality.

Of course, deciding whether you want to send your code out is a different matter, irrespective of whether your data is split or not.
 

George-Bowyer

Registered User.
Local time
Today, 14:39
Joined
Dec 21, 2012
Messages
177
Heh. The only reason I might not want to send my code out is to stop people from laughing at me...
 

MajP

You've got your good things, and you've got mine.
Local time
Today, 10:39
Joined
May 21, 2018
Messages
8,463
@George-Bowyer,
I think what you are saying is true, but it is probably the 5% problem. 95% of the time, just by looking at someone's tables with no data or limited data, I can see the significant issues with the design. Very rarely does the data itself (other than null values) cause a specific issue.

If in fact the db is failing based on different values in the data, there is normally a bigger problem: the query or code is looking for specific numeric or text values in a field, which can often be fixed with a better design.
1. Store data in tables and not code
2. Use numeric PKs
3. Enforce referential integrity
4. Use proper joins
5. Require fields
6. Use default values for fields
...
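
Items 3, 5 and 6 of this checklist can be sketched in one small table definition. As above, sqlite3 stands in for Access, and all names (Regions, People) are invented; in Access these constraints correspond to enforcing referential integrity on the relationship and setting Required/Default Value on the fields.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite needs this per-connection
conn.execute("CREATE TABLE Regions (RegionID INTEGER PRIMARY KEY, RegionName TEXT NOT NULL)")
conn.execute("""CREATE TABLE People (
    PersonID INTEGER PRIMARY KEY,      -- numeric PK (item 2)
    RegionID INTEGER NOT NULL          -- required field (item 5)
        REFERENCES Regions(RegionID),  -- referential integrity (item 3)
    Active INTEGER NOT NULL DEFAULT 1)""")  # default value (item 6)

conn.execute("INSERT INTO Regions VALUES (1, 'North')")
conn.execute("INSERT INTO People (RegionID) VALUES (1)")  # Active defaults to 1

# An orphan row is rejected instead of silently corrupting the data:
try:
    conn.execute("INSERT INTO People (RegionID) VALUES (99)")
except sqlite3.IntegrityError as e:
    print("rejected:", e)

print(conn.execute("SELECT PersonID, RegionID, Active FROM People").fetchall())
```

With constraints like these in place, bad values fail loudly at insert time, which is exactly why data-dependent bugs become rare.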
 
