Junction tables not matching related table after creating new backend

Jupie23

I have a split database with a main table and 2 junction tables that should be related. I was having trouble with blanks in my table, so I decided to make a new backend, remove the auto name correct option, and clean up a few things. The main table has over 30,000 records in it so far, and my plan was to switch the team to the new backend and then copy over the records that were entered in the old one while I was working on the new backend - about 300 records.

I didn't think about the fact that when I copied the records from the main table over to the new backend, the autonumber (RecordID) changed and no longer matches the RecordID in the 2 junction tables. I exported both to Excel to compare the new IDs to the old ones, found those numbers in the junction tables and updated them, and then copied that data back into the junction tables. I forgot to check my relationships first and went to set them up afterwards, and it won't let me enforce referential integrity because it says there are unmatched records. It is turning into a giant cluster. How do I fix this?! I still have the original data, it just has different IDs now. Any ideas??
 

plog

You need to make a mapping table which contains the old ID and the new ID. Once you have that you can use it to update the junction tables from the old ID values to the new ID values.
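For example (table and field names here are just placeholders for yours), the update would be along these lines:

Code:
UPDATE SQJunction INNER JOIN tblIDMap
    ON SQJunction.RecordID = tblIDMap.OldID
SET SQJunction.RecordID = tblIDMap.NewID;

Then run the same kind of query against the second junction table.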
 

Pat Hartman

Start again.

When you copy the records from the old table to the new, you MUST use an append query so you have to link to both databases. An Append query is the ONLY method available for preserving the original autonumber PK if you are doing the transfer yourself.

An alternative is to create a new, empty database and import the old tables. This method also preserves the autonumbers.

Cut and paste will generate new autonumbers.
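For example, assuming the old table is linked into the new BE (the names below are placeholders), an append query that explicitly includes the autonumber column will carry the original IDs across:

Code:
INSERT INTO tblMain (RecordID, Field1, Field2)
SELECT RecordID, Field1, Field2
FROM tblMain_Old;

Access accepts the supplied RecordID values and should move the autonumber seed past the highest one appended.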

You should never be working on the BE while users are actively using it. Once you finish changing the structure, you are going to have to transfer the data again and this time it won't be a straight copy so you will have to use the append method I suggested.
 

Solo712

I still have the original data, it just has different IDs now. Any ideas??

Are you saying you don't have a backup of the big table with the old IDs? Because if you don't, you are in a real fix. If you do, I would take Pat's advice and start over. That would be step A.

If I understand this correctly, that would still leave you with the issue of re-linking the junction tables to the "new" backend table (the 300 records) after you insert the new entries. That is not difficult. You should rename the interim backend table and then use it a) to insert into the new backend and b) as a reference when relinking the junction tables for the entries made in the interim, while you were working on cleaning up the old backend. Relinking the junction tables to the new autonumbers is a simple one-join update query. If you send the structures of all the tables involved, we'll take a crack at it. But that all presupposes that you have the old database with valid relationships backed up, i.e. you can get through step A.

Best,
Jiri
 

Pat Hartman

Once you make a copy of the BE to work with it - FORGET about ever using its data.

Always make a backup of the BE BEFORE you start working on it. I make two and zip one of them to avoid accidents. Then if you want to do data cleanup, do it while the database is offline. Weekends and evenings are best. Some data updates can be done while users are logged in, but you don't want to cause conflicts by running update queries while the users are updating individual records. Access will prevent the data from being corrupted, but the users will get virtually unintelligible error messages, so don't do it unless it is an emergency.

Do the cleanup in small pieces. That way you won't have to work nights as well as days. Make incremental changes each day. Make sure you set RI correctly to prevent blank records from being added in the future. Blank records are a symptom of not doing proper validation or of doing it in the wrong form events. The BeforeUpdate event is where most of your validation should go.
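A minimal sketch of that kind of check (the control name is just a placeholder for whatever your form uses):

Code:
Private Sub Form_BeforeUpdate(Cancel As Integer)
    ' refuse to save a record with no description; adjust to your own required fields
    If Nz(Me.txtDescription, "") = "" Then
        MsgBox "A description is required before the record can be saved.", vbExclamation
        Cancel = True
        Me.txtDescription.SetFocus
    End If
End Sub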
 

Jupie23

Hello, thank you for the responses! I am self taught with Access so I struggle sometimes with figuring out some of these things. I have the old backend dated 6.6.17. It contains data from 10/2/17 to 11/14/17. I imported that table into the new back end on 11/13/17. The new backend contains data from 10/2 to 11/13, and then I switched the team over to use it from 11/15-11/17. So part of 11/13 and all of 11/14 is missing from the new backend, but is still in the old one. No one uses it on weekends or at night so I am ok to work on it now. Maybe I should just start over with all this. Are you able to give me the correct steps to follow to make a new backend and transfer data from both places into one?

I have a few other issues I'm noticing but should those be separate threads? I'm getting some blank records on the table and the team is getting network interruption errors sometimes. Not sure if those are related.

I also have coding on the after insert event of my form so they can select multiple items from 2 listboxes, which saves to their related junction tables. It has been working well, but out of 30,000 records, 75 records did not save to the junction table. Those may be things that can be fixed later, but I wanted to mention them in case they needed to be corrected before putting data in.

I appreciate any help you can offer. I really don't want to make a database that has long term issues. Thank you so much!

Edit: I have created a new blank database. I have 6 tables that contain informational items where comboboxes and listboxes pull from, that don't change and aren't affected by this. I imported those with their data. I imported the structure of my main table and two junction tables, but did not import the data. I have turned off subdatasheets and auto name correct and set up the proper relationships. Anything else I should do to start?
 

Pat Hartman

If people were adding data to both BE's during the time period, there will be autonumber conflicts and you will NEVER be able to merge the data automatically. You are going to have to look at every conflict separately and decide which data is correct. Good luck with that.

In the future, never copy the BE to work on offline and expect to merge it back. If you can't make the changes in a short period of time in the evening or on a weekend, do it a little at a time. OR, make schema changes to your copy of the BE and then delete ALL the data. Link to the tables of the original BE and run append queries to copy the data from the old schema to the new one.

If you are positive that both copies of the BE were not being changed at the same time (i.e. YOU didn't make data changes), then you can make the copy with the schema changes the master, delete all the data, and append the data from the users' copy to your copy. Duplicates will not be copied, so every append query will give you an error message.

If your insert code isn't working, you are going to have to figure out why. At least for now, add code that validates that the new records actually got added.
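For example (the field, control, and table names below are just placeholders for whatever your form actually uses), a quick DCount after the junction inserts can confirm the rows actually landed:

Code:
' run at the end of the After Insert code, once the junction rows have been written
Dim lngExpected As Long, lngSaved As Long
lngExpected = Me.lstSQ.ItemsSelected.Count
lngSaved = DCount("*", "SQJunction", "RecordID=" & Me.RecordID)
If lngSaved <> lngExpected Then
    MsgBox "Only " & lngSaved & " of " & lngExpected & _
           " selections were saved to the junction table.", vbExclamation
End If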
 

Jupie23

Yes, I realized how stupid that was the next morning. I will not make that mistake again! People were entering data in the old one up through 11/14. The new one only had data copied over up to 11/13, and then on 11/15 they started using it. So everything from 11/14 that needs to go over to the new one, those autonumbers have already been used by the 11/15 data. I will have to fix them manually and chalk it up as a learning experience.

How does the append query work with the junction tables? Would my query send to the main table as well as the junction tables if the fields from all tables are in the one query?
 

Pat Hartman

The only time I use an append query to append data to a junction table is if the junction table is something like a survey. A new userSurvey header record gets added and I run an append query to copy all the survey questions from the survey definition table to the userSurveyAnswer table. In all other cases, I use a subform and let the user choose one FK himself from a combo box. Access automatically adds the other FK based on the master/child link to the parent table. So for example, if I wanted to add a student to a class, the main table would be Class and the subform would be studentClassRoster. The class FK is dictated by the record on the main form and the student is chosen from a combo. If I wanted to add classes for a student, the Student table would be the main form and studentClassRoster would be the subform, except this time the master/child link would be Student and the user would pick a class from the classes combo.
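For what it's worth, that survey-style append would look something like this (the names are made up, and 123 stands for the new header record's ID):

Code:
INSERT INTO userSurveyAnswer (SurveyID, QuestionID)
SELECT 123 AS SurveyID, QuestionID
FROM surveyQuestion;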
 

Solo712

Yes, I realized how stupid that was the next morning. I will not make that mistake again! People were entering data in the old one up through 11/14. The new one only had data copied over up to 11/13, and then on 11/15 they started using it. So everything from 11/14 that needs to go over to the new one, those autonumbers have already been used by the 11/15 data. I will have to fix them manually and chalk it up as a learning experience.

How does the append query work with the junction tables? Would my query send to the main table as well as the junction tables if the fields from all tables are in the one query?

Well, I think Pat takes a little too pessimistic a view, but the crucial point is: do you have a backup of the old backend table with the autonumbers that were used as foreign keys to the junction tables up to the switchover? If not, then there is really no hope. If those numbers were preserved in the switchover, then the next question you need to ask is how much data has been compromised and whether it is worthwhile to convert the messed-up junction tables by a procedure. If you are sure on both counts, then the procedure itself is doable and consists of these steps:

1) Cut the existing back-end into two files at the point of transfer (at 11/15, if I understand correctly). The old part is assumed to have had its references preserved. The new back-end we will call "messed-up". Put the old table in some safe workspace and call it "master".

2) Append a record to master from the messed-up part of the table (11/15-17). I recommend doing this one by one in VBA, as the SQL here gets involved (four tables at minimum). After each record is inserted, reposition to it by setting the bookmark to .LastModified and extract the new autonumber ID.

3) Now you need to know the cutoff for the new records in the junction tables (by primary keys), i.e. those made from 11/15 onwards. Then, for both junction tables: find the first match of the "messed-up" record ID in the respective junction table. If it is a new junction PK, simply replace the messed-up FK with the new autonumber ID. If it is a junction PK that is lower than the 11/15 cutoff, you are looking at a duplicate of an old record. Maybe you won't ever see this, but if you do, ignore it. Only the new junction table records need to have their FKs replaced.

4) Move to the next record in the "messed-up" table and repeat steps 2 and 3.

Of course, I can't give you more than a high-level approach since I haven't seen the structure of the tables and the insert procedures, but I am confident it can be done as long as the big part of the table had its references preserved. Best of luck whichever way you choose to go. And don't feel bad, we have all done things we wished later we hadn't.

Best,
Jiri
 

Jupie23

Yes, I do still have the old backend data with the autonumbers preserved. It has records through the end of 11/14. The new backend started being used on 11/15 to the present. I compared the autonumbers from 11/14 to the autonumbers on 11/15 and there are 831 records with the same autonumber. 117 of those should appear on SQ Junction, and 279 should appear on Checklist Junction. So 396 of them need updated autonumbers, and the rest of that 831 don't matter, they can have any autonumber. At this point I am just totally confused. Your steps sound good, but #2 is a little over my head. Can you give me more details on how to do that? I don't have the option yet of attaching anything here to show you my tables.

At the end of the day when no one is using it, should I just append the records to a new blank database from the current backend that contains data up to 11/13 and then 11/15-now, and then get the 11/14 records from the old back end. Make a query to show the ones that are not in either junction table and copy and paste those over, and then re-enter the remaining ones from the front end? It would be a lot of work to reenter them, but at least I'd know they were correct then. Ugh, I hate myself lol.

Thanks for your help!
 

Solo712

Yes, I do still have the old backend data with the autonumbers preserved. It has records through the end of 11/14. The new backend started being used on 11/15 to the present. I compared the autonumbers from 11/14 to the autonumbers on 11/15 and there are 831 records with the same autonumber. 117 of those should appear on SQ Junction, and 279 should appear on Checklist Junction. So 396 of them need updated autonumbers, and the rest of that 831 don't matter, they can have any autonumber. At this point I am just totally confused. Your steps sound good, but #2 is a little over my head. Can you give me more details on how to do that? I don't have the option yet of attaching anything here to show you my tables. Thank you!

Here is a mockup in aircode:
Code:
Dim db As DAO.Database, rsold As DAO.Recordset, rsnew As DAO.Recordset
Dim rsSQ As DAO.Recordset, rsChklst As DAO.Recordset

Set db = CurrentDb
Set rsold = db.OpenRecordset("BE with old data")
Set rsnew = db.OpenRecordset("BE with new data")
' open the junction tables as dynasets so FindFirst works
Set rsSQ = db.OpenRecordset("SQTable", dbOpenDynaset)
Set rsChklst = db.OpenRecordset("ChecklistTable", dbOpenDynaset)

rsnew.MoveFirst
Do While Not rsnew.EOF
     rsold.AddNew
         ' transfer all the fields from the new data to old except the ID
         ' autonumber (assumed to be rsnew(0))
         rsold(1) = rsnew(1)
         rsold(2) = rsnew(2)
         ' ... and so on for the remaining fields ...
         rsold(x) = rsnew(x)
     rsold.Update
     ' this will point back to the record just inserted
     rsold.Bookmark = rsold.LastModified
     ' rsold!ID (or whatever the autonumber is named) now holds
     ' a valid consolidated PK

     ' check whether the "rsnew" record appears in the SQ junction;
     ' if it does in either or both, swing the FKey to the consolidated ID
      rsSQ.FindFirst "FKey=" & rsnew!ID
      If Not rsSQ.NoMatch Then
            rsSQ.Edit
            rsSQ!FKey = rsold!ID
            rsSQ.Update
      End If

     ' do the same test and update with the Checklist junction table

     rsnew.MoveNext
Loop

This is the best I can do without the actual names of the fields or the junction table update criteria. Hopefully, you can understand the process better now.

Best,
Jiri
 

Jupie23

Thank you so much! I will make backup copies and try that tonight after everyone is out, so all of today is in there.
 

Pat Hartman

Solo, There are two problems. Look at #8 again.

1. Julie has two tables with new data. It isn't assigning new autonumbers that is the problem; it is identifying the records in one table that have IDs that conflict with the other. Assume there are no existing gaps. db1.table1 has 1-1000 when it is copied. The users work two days and add 150 records, 1001-1150. db2.table1 replaced db1, so the autonumber seed went back to 1001 and 30 records were added. So 1001-1030 in both tables are actually different records.

2. The second problem is worse, and unless the application uses changed-by and changed-date fields on each record, it is never going to be resolved. There will be no way to identify changed records unless a row by row, column by column comparison is run.

So, it will be possible but tedious to identify the overlapping new records. It will be much harder to identify the updated records and determine which update is current. In the first database the address field might have been updated for cust #78 and in the second database the phone field might have been updated for the same customer.

There is a tool out there that does this comparison for SQL Server and it might actually do it for ACE as well - try SQL Examiner Suite.

Air code is helpful but obviously untested. The longer this database is unreconciled, the worse the problem becomes, because more rows get changed and more IDs get added.
 

Solo712

Solo, There are two problems. Look at #8 again.

1. Julie has two tables with new data. It isn't assigning new autonumbers that is the problem; it is identifying the records in one table that have IDs that conflict with the other. Assume there are no existing gaps. db1.table1 has 1-1000 when it is copied. The users work two days and add 150 records, 1001-1150. db2.table1 replaced db1, so the autonumber seed went back to 1001 and 30 records were added. So 1001-1030 in both tables are actually different records.

Pat, you completely lost me. What I have proposed is to add the db2.table1 entries to db1.table1 (which I understand is whole). The duplicate numbers in db2.table1 matter only insofar as in some cases they were references for the junction tables SQ and Checklist. So you add the records of db2.table1 to db1.table1, and then use the new autonumber for the records in the junction tables. You saw the code. The loop has access to both the ID of db1 and the ID of db2. There is nothing more to it that I can see (though admittedly, since Jupie is not giving us the works, there may be something). BTW, I did something nearly identical a while back. While repairing a table I created a new one and let the users populate the production db with new numbers. Then I simply merged the tables.

2. The second problem is worse, and unless the application uses changed-by and changed-date fields on each record, it is never going to be resolved. There will be no way to identify changed records unless a row by row, column by column comparison is run.

I can't comment on this because I don't have the info to make this kind of assessment. I understand the issue to be simply duplicate PKs in the main table (and the mess they created in dependent references). That issue is definitely solvable by the method I showed.

Best,
Jiri

 

Jupie23

Thank you both for taking the time to answer. It appears I have figured out my problem! I believe my original manual fix did work, but there was another reason it wouldn't let me set up referential integrity. For my original backend, I made another copy to function as an archive for old records and appended all the records from September. When I did that, I removed those records from the main table, but forgot to delete them from my junction tables, because RI wasn't set up. :eek: Can I blame this on having a newborn baby? Anyway, thank you for trying to help and explaining things so well. I appreciate it!
 

Solo712

Thank you both for taking the time to answer. It appears I have figured out my problem! I believe my original manual fix did work, but there was another reason it wouldn't let me set up referential integrity. For my original backend, I made another copy to function as an archive for old records and appended all the records from September. When I did that, I removed those records from the main table, but forgot to delete them from my junction tables, because RI wasn't set up. :eek: Can I blame this on having a newborn baby? Anyway, thank you for trying to help and explaining things so well. I appreciate it!

Well, I am glad you worked it out. Incidentally, the code I gave omitted the important test on the junction tables I wrote about previously. Doesn't matter now, you got it fixed. And I hope the baby is doing fine.

Best,
Jiri
 

Pat Hartman

I'm glad you got this worked out. I wasn't so lucky. Our DBA accidentally restored a table last year to a version that was two weeks old, and it was two weeks before the users noticed the missing data and I investigated. I had asked him to restore the table to my test database - which he did. The problem was, he restored it to production first, noticed his error, and then restored it to test. If he had come clean, it would have been a couple hours of work and we would have been able to fix it up that day. The users would still have had the paperwork on their desks and we wouldn't have lost anything. But he never reported his mistake. When I got involved because the users were complaining about missing data, he went into deny mode. Deny, deny, deny. Then he switched to blame mode and blamed me. That didn't work because my security wasn't sufficient to do a restore, which is why I asked them to do it to begin with. End of story - he got fired. Not for making a mistake, that would have been forgiven. He got fired for covering up and causing a huge problem that I was still having to deal with a year later.
 

Jupie23

Oh man, that is terrible! I am terrified of losing data like that. This database has about 1,200 records added a day. Is there a way to have it automatically back up once an hour without using any outside software?
 

Solo712

Oh man, that is terrible! I am terrified of losing data like that. This database has about 1,200 records added a day. Is there a way to have it automatically back up once an hour without using any outside software?

Here is what I used recently at a client:
Code:
Private Sub AutoBackup()
   'If the backup flag is set, get out
   If Nz(DLookup("zBUflag", "tblZ", "zID=1")) Then Exit Sub
   '
   Dim BUTime As Date, srcepath As String, destpath As String, i As Long
   Dim OTime As String, NTime As String
   On Error GoTo Err_Backup
   OTime = Nz(DLookup("zNextBU", "tblZ", "zID=1"))
   BUTime = CDate(OTime)
   ' Are we there yet ?
   If BUTime > Now Then Exit Sub
   '
   'throw flag to disable second backup
   CurrentDb.Execute "UPDATE tblZ SET zBUflag = True"
   '
   srcepath = Application.CurrentProject.path & "\" & Application.CurrentProject.Name
   destpath = Application.CurrentProject.path & "\Builds\" & Format(Now, "YYYY-MM-DD HHMMSS") & "_" & Application.CurrentProject.Name
   '
   FileCopy srcepath, destpath
   '
ABU_exit:
   i = Nz(DLookup("zBUInterval", "tblZ", "zID=1"))
   If i = 0 Then i = 120
   'set date/time for next backup (AddMinutes is a small local helper, e.g. DateAdd("n", i, Now))
   NTime = AddMinutes(Now, i)
   CurrentDb.Execute "UPDATE tblZ SET zNextBU = '" & NTime & "'"
   'clear flag to allow backups
   CurrentDb.Execute "UPDATE tblZ SET zBUflag = False"
   Exit Sub
Err_Backup:
   'ToErrLog is the application's own error-logging routine
   ToErrLog True, "frmMain - AutoBackup()"
   ToErrLog False, Err.Number & "-" & Err.Description
   ToErrLog False, "Automatic Backup failed!"
   On Error GoTo 0
   Resume ABU_exit
End Sub

You keep the Backup Interval (in minutes) and Next Backup Time in a parameter table and then run this proc off a timer every minute or so. This was not a server application; the code could be run off any workstation. (That is why a flag was thrown to prevent two workstations going at it at the same time).
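For instance, the hookup on the main form could be as simple as this (the one-minute check interval is just an example):

Code:
Private Sub Form_Load()
    Me.TimerInterval = 60000   ' fire the Timer event once a minute
End Sub

Private Sub Form_Timer()
    AutoBackup
End Sub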

Best,
Jiri
 
