Database efficiency when regenerating tables
I have been working on a product database since the start of the year. I had never used MS Access before, and I have learnt a great deal. But this is where I begin to doubt my abilities, as I have taught myself almost everything I know.
I am wondering whether I, or the end user, will see noticeable lag in the database when it is at capacity, given that some functions regenerate many tables. I can see this becoming an issue when cascading updates occur. For instance, at the far end of the scale: if I update the currency exchange rates, this regenerates the cost of raw materials in the products, which updates all the product total costs, which in turn updates the prices for all the products. There is a mix of completely regenerating tables and just updating specific parts of tables. But in the worst-case scenario, where every table is regenerated because of one small change, would the end user experience any noticeable lag?
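To make the cascade concrete, here is a rough sketch of the chain I mean (written in Python rather than Access/VBA just for illustration; the table and field names are made up):

```python
def recompute_prices(rates, materials, products, markup=1.25):
    """Propagate an exchange-rate change through costs to final prices.

    rates:     currency code -> exchange rate into the base currency
    materials: material id   -> {"unit_cost": float, "currency": str}
    products:  product id    -> {"bom": {material id: quantity}}
    """
    # Step 1: convert each raw-material cost into the base currency.
    material_cost = {
        m_id: m["unit_cost"] * rates[m["currency"]]
        for m_id, m in materials.items()
    }
    # Step 2: total cost per product = sum of its material costs.
    totals = {
        p_id: sum(material_cost[m_id] * qty for m_id, qty in p["bom"].items())
        for p_id, p in products.items()
    }
    # Step 3: price = total cost times a markup factor.
    return {p_id: round(total * markup, 2) for p_id, total in totals.items()}

# Tiny example: one rate change ripples through two products.
rates = {"GBP": 1.0, "USD": 0.8}
materials = {
    "steel": {"unit_cost": 10.0, "currency": "USD"},
    "paint": {"unit_cost": 2.0, "currency": "GBP"},
}
products = {
    "widget": {"bom": {"steel": 3, "paint": 1}},  # 3*8 + 1*2 = 26 cost
    "gadget": {"bom": {"steel": 1}},              # 1*8      =  8 cost
}
prices = recompute_prices(rates, materials, products)
```

Even in the worst case this cascade is only a few arithmetic operations per record, so for a few thousand records the calculation itself is tiny; my worry is more about the overhead of rewriting whole tables rather than updating rows in place.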
There are up to 5000 records in some of the tables, with up to 5 or 6 fields in each. At what sort of capacity or workload does Access start to lag? I'm just thinking back to high school, when some programmes would take forever to update or perform a certain task (or even Photoshop on low-end systems these days, haha), and I really want to avoid this for the end user.