Choices ...

Martyh

Registered User.
Joined
May 2, 2000
Messages
196
Hi Guys,

Some background: I've got a front-end and back-end db -- the communication is extremely poor between them but that can't be changed! Therefore I've gone from the db to several Arrays for most of the UI. There is one giant outburst of db activity at the outset of the front-end and then calm... and there are only a few times when some communication is necessary.

Now, I come to the issues:
1) Do I write the changes to a temp table located in the FE... and then, while the person is not active (2: how will "not active" be defined?), upload the data from the temp table to the appropriate BE tables?
or
3) Do I make the changes to the appropriate tables right away? There is some iteration involved, so there will be a number of calls to the BE db (on average about 5-10).
4) In either case, I can gather the iterations together and perform them with just one call - but do I need to? Is it faster?

I would appreciate your comments!

Cheers,

Marty
 

NauticalGent

Ignore List Poster Boy
Joined
Apr 27, 2015
Messages
6,396
I would look into Table/Data Macros if your version of Access supports them (AC2010 and newer).
 

The_Doc_Man

Immoderate Moderator
Staff member
Joined
Feb 28, 2001
Messages
27,317
I've got a front-end and back-end db -- the communication is extremely poor between them but that can't be changed!

OK, the first question in all of this is, how poor is extremely poor? Because unless you have semi-predictable windows of good communication, NONE of your other questions matter. If your comm link is randomly bad then ANY update scheme you choose suffers the same potential fate - a random link drop leading to a corrupted BE file. The first issue is therefore to consider whether there is any update method that MIGHT repeat MIGHT be less susceptible to poor comm links.

IF there is a way you can make a "transaction file" that contains EVERYTHING you want to send in a linear file (or a couple of files, doesn't matter), then you can build your files and trigger an FTP session to send them. (See also SHELL as a way to launch external files such as batch files.)

At the other end, you would have something looking for transactions, perhaps based solely on date. There ARE ways for you to trigger your local FTP client to give you a listing of the files in the remote folder. You can transmit a file with a prefix of some sort and the date/time expressed in yyyymmddhhnnss format as a suffix. Transmit it with a given file type that has no meaning to Windows. Look at the directory listing to see that the files arrived and have apparently correct sizes. Then, if all looks good, FTP can RENAME a remote file. So what you could do is send the file and then get a selective directory listing; if you see the files you wanted to see and their sizes make sense, you rename them to some OTHER file type that ALSO has no meaning to Windows.

Now you set up a "listener" Access app that also links to your back end. Using File System Objects, you can see the files in the folder that have the right names. Process the transaction files and then either rename them to some sort of archive or delete them.
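For a concrete picture, here is a minimal VBA sketch of the sending side. The folder paths, the .trx extension, the record layout, and the SendBatch.cmd script name are all placeholders of mine, not a prescription:

Code:
' Build a transaction file named with a yyyymmddhhnnss suffix, then
' SHELL out to a batch script that runs the actual FTP session.
' Paths, the .trx extension and SendBatch.cmd are hypothetical.
Public Sub SendTransactions()
    Dim strFile As String
    Dim intFF As Integer

    strFile = "C:\Outbox\txn_" & Format(Now(), "yyyymmddhhnnss") & ".trx"

    intFF = FreeFile
    Open strFile For Output As #intFF
    ' One line per pending change; the layout is whatever your
    ' listener expects to parse.
    Print #intFF, "UPDATE;tblOrders;ID=123;Qty=5"
    Print #intFF, "INSERT;tblNotes;OrderID=123;Note=Backordered"
    Close #intFF

    ' Hand the upload to a child process so a link failure kills
    ' the batch job, not Access.
    Shell "cmd /c C:\Scripts\SendBatch.cmd """ & strFile & """", vbHide
End Sub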

Why, you ask, do I suggest FTP as an intermediary? Because Access uses a protocol that does not like to be interrupted and will crash when interruptions occur; when it does, corruption becomes possible, verging on likely. SMB protocol is not overly forgiving. So if you can avoid using SMB for your remote, poor-link operations, you avoid having Access crash while updating the back end. FTP, being in a child process, doesn't give Access a hiccup, and an FTP link failure only crashes the batch job. You can have the batch job return errors, and if you program it to not care, you don't risk crashing Access with a downed network link in mid-sequence.

So you look at the folder contents, compare names and sizes, and for each file that matches, rename the remote file so that the remote Access "listener" can see the file and recognize it as something to be processed. Then a transactional style of doing things might work. I will emphasize that the gyrations with batch scripts and FTP are ONLY necessary if you have abysmally bad network links such that SMB cannot be trusted. If you say "extremely poor" but it is perhaps not as bad as that, you might still wish to transmit transactions in a single file to then be processed later.
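To make the listener side concrete, a rough File System Objects polling routine might look like this; the C:\Inbox path, the .rdy "verified" extension, and the ProcessTransactionFile routine are assumptions for illustration:

Code:
' Sketch of the "listener" poll using a late-bound FileSystemObject.
Public Sub PollInbox()
    Dim fso As Object, fil As Object
    Set fso = CreateObject("Scripting.FileSystemObject")

    For Each fil In fso.GetFolder("C:\Inbox").Files
        ' Only pick up files already renamed to the "verified" type.
        If LCase$(fso.GetExtensionName(fil.Name)) = "rdy" Then
            ProcessTransactionFile fil.Path  ' your routine that applies the rows
            fil.Move "C:\Inbox\Archive\"     ' or fil.Delete after processing
        End If
    Next fil
    ' (A production version would collect the names first rather than
    ' move files mid-iteration.)
End Sub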
 

CJ_London

Super Moderator
Staff member
Joined
Feb 19, 2013
Messages
16,668
A couple of thoughts

the communication is extremely poor between them but that can't be changed!
perhaps it can - take a look at this thread; there may be some improvements you can make

Also, agree with NG - it depends on what you are doing, but data macros will transfer processing to the BE

Therefore I've gone from the db to several Arrays for most of the UI
might be more efficient and easier to manage to use ADO disconnected recordsets rather than arrays

In either case this can be problematic in a multi-user environment: record locking will not occur, so additional coding is required to check for changes before updating the actual tables in the BE.
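As a rough sketch of the disconnected pattern (the table name, connection string, and path are placeholders; needs a reference to Microsoft ActiveX Data Objects):

Code:
Dim cn As New ADODB.Connection
Dim rs As New ADODB.Recordset

cn.Open "Provider=Microsoft.ACE.OLEDB.12.0;" & _
        "Data Source=\\server\share\backend.accdb"   ' placeholder path

rs.CursorLocation = adUseClient                      ' client-side cursor
rs.Open "SELECT * FROM tblOrders", cn, adOpenStatic, adLockBatchOptimistic
Set rs.ActiveConnection = Nothing                    ' disconnect; edits held locally
cn.Close

' ... the UI works against rs while disconnected ...

cn.Open                        ' reconnect when ready to save
Set rs.ActiveConnection = cn
rs.UpdateBatch                 ' push all pending changes in one batch;
                               ' rows changed by someone else will fail
                               ' here (check rs.Status) - that is the
                               ' conflict check mentioned above
cn.Close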

(2: how will "not active" be defined?)
consider having a form timer, perhaps on a hidden form. Every time the user does something, the timer is reset. If the user doesn't do anything for a period of time, a warning is displayed on the screen with a countdown message of say 'The BE will be updated in 2 minutes'... 1 minute, etc. When zero is reached, a message 'BE is being updated' is displayed, the update occurs, and when done a message 'BE updated' is displayed. The user can stop the update at any time during the countdown by doing something in your app - might be typing something, might be opening or closing a form, you decide. But for each of those actions you will need a bit of code to reset the timer, as sketched below.
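A bare-bones version of that timer, assuming a hidden form whose TimerInterval is set to 60000 (one minute); ShowCountdown and DoTheUpdate stand in for whatever display and upload routines you use:

Code:
' In the hidden form's code module.
Private mlngIdleMinutes As Long

Private Sub Form_Timer()
    mlngIdleMinutes = mlngIdleMinutes + 1
    Select Case mlngIdleMinutes
        Case 3
            ShowCountdown "The BE will be updated in 2 minutes"
        Case 4
            ShowCountdown "The BE will be updated in 1 minute"
        Case Is >= 5
            Me.TimerInterval = 0          ' stop the clock while updating
            ShowCountdown "BE is being updated"
            DoTheUpdate                   ' your upload routine
            ShowCountdown "BE updated"
            mlngIdleMinutes = 0
            Me.TimerInterval = 60000
    End Select
End Sub

' Call from every user action you care about (keypress, form open/close...)
Public Sub ResetIdleTimer()
    mlngIdleMinutes = 0
End Sub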

3) Do I make the changes to the appropriate tables right away?
if multi-user and users are working on pretty much the same data, then yes

4) In either case, I can gather the iterations together and perform them with just one call - but do I need to? Is it faster?
depends on how you are connecting to the BE. If you are opening and closing the connection each time, it is better to open once, do all your updates, then close - for example:
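In DAO terms, that could mean holding one connection open and wrapping the whole batch in a single transaction (the path and SQL here are placeholders; in practice you would add an error handler that calls ws.Rollback on failure):

Code:
Dim ws As DAO.Workspace
Dim db As DAO.Database

Set ws = DBEngine.Workspaces(0)
Set db = ws.OpenDatabase("\\server\share\backend.accdb")  ' placeholder path

ws.BeginTrans
db.Execute "UPDATE tblOrders SET Qty = 5 WHERE ID = 123", dbFailOnError
db.Execute "INSERT INTO tblNotes (OrderID, Note) " & _
           "VALUES (123, 'Backordered')", dbFailOnError
' ... the remaining 5-10 iterations ...
ws.CommitTrans

db.Close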
 

Pat Hartman

Super Moderator
Staff member
Joined
Feb 19, 2002
Messages
43,478
When you disconnect the data collection from the application of that data to the permanent database, you open yourself up to a huge reconciliation problem UNLESS you know for a fact, based on your normal workflow, that there will never be any conflicts because no two users would ever need to update the same record in the same timeframe.

I agree with Doc: if your connection is unreliable, you cannot update the permanent database over that connection. FTP is the best solution. That technology can survive a poor connection because the receiving end can always tell whether the sent file is complete. You send the data to the mother ship and the update happens overnight.

If it is possible for there to be multiple updates to the same record from different sources, you need code that figures this out and gives a user the opportunity to decide which version of the data should be accepted.
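A simple way to detect that, assuming each table carries a LastModified date/time column (the field, table, and ApplyMyUpdate routine here are placeholders), is an optimistic check before writing:

Code:
Dim rs As DAO.Recordset
Dim dtWhenIReadIt As Date   ' captured when the row was first loaded

Set rs = CurrentDb.OpenRecordset( _
    "SELECT LastModified FROM tblOrders WHERE ID = 123", dbOpenSnapshot)

If rs!LastModified > dtWhenIReadIt Then
    ' Someone else changed the row; let the user pick the winner.
    If MsgBox("This record changed since you loaded it. Overwrite?", _
              vbYesNo) = vbYes Then
        ApplyMyUpdate           ' placeholder for the actual write
    End If
Else
    ApplyMyUpdate
End If
rs.Close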
 

Martyh

Registered User.
Joined
May 2, 2000
Messages
196
Thanks for your immediate responses ... I will comb through your suggestions!!

Marty
 
