What was/is your "largest" Access project?

When we migrated the data to SQL Server, the Access BE was 500 MB. That would be about the average for all 80 branches; I think the total was around 50 GB. When I retired in 2010, there was over 600 GB.

Very interesting experience
What exactly do you mean by 'Access BE was 500mb'?
If you were using SQL Server, isn't that the back-end?
 
Hope cancel culture doesn't lose its mind over this statement, but

I always thought that @Pat Hartman was a guy 🤯🤯🤯.

Ma'am, you are very well versed and know your stuff. Hats off to you 🙌
 
@raziel3 Good thing I'm not one of those people:) Thanks for noticing.
 
You all have some pretty impressive resumes. To all the members, is Access your go-to solution for database user interface and/or management?
I think nothing can beat MS Access connected to a SQL Server. I've used a couple of accounting, inventory, and payroll packages, but nothing beats MS Access. It's customizable to the thousands of different analytics management usually wants, in a way that no out-of-the-box solution can match.

And now that Excel comes with Power Query/M Code, reporting just got 1000 times more beautiful.
 
I'm a consultant, so my clients have already decided that Access is the platform they intend to use or they wouldn't have hired me. Employees will have a different point of view. I have a half dozen small (<150 employees) clients. Most of them run an accounting package and some even use a larger ERP, but everything else is done with Access. Most ERPs are like your "smart" phone: they have a lot of features, none of which work well, and most are pretty rigid, so Access is used for the surrounding apps.

So I frequently end up creating multiple applications depending on the current need. A lot of them get data from and/or send data to the ERP. Because the clients can't afford full IT support, my Access apps always use the same BE RDBMS as the ERP, so I've used some interesting RDBMSs:) The great thing about Access is that, from the perspective of Access, it doesn't make any difference whether the BE is SQL Server, DB2, Postgres, Oracle, etc., unless of course you're using pass-through queries and stored procedures. This is one of the reasons I stick with querydefs for virtually everything.
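For anyone unfamiliar with the querydef approach Pat mentions: a saved querydef written in Access SQL runs against linked tables, and Jet/ACE translates it to the server's dialect through ODBC at run time, which is what keeps the FE portable across back-ends. A minimal DAO sketch (the table and query names are invented for illustration):

```vba
' A saved querydef in Access SQL works unchanged whether the linked
' tables live in SQL Server, DB2, Postgres, or Oracle.
' "qryRecentOrders", "tblOrders", and "OrderDate" are hypothetical names.
Public Sub MakePortableQuery()
    Dim db As DAO.Database
    Set db = CurrentDb
    db.CreateQueryDef "qryRecentOrders", _
        "SELECT * FROM tblOrders WHERE OrderDate >= Date() - 30;"
End Sub
```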
 
I use Sage as my accounting software but NEVER do data entry in it. I have all the employees use Access or Excel (depending on their experience) for data entry, then run some queries to filter out errors; trust me, you will get errors when people who are not too familiar with spreadsheets are doing data entry. Then I take the corrected data and upload it into Sage.

When it's time for an audit, my Sage is clean: no voids, no gaps, no errors.
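The error-filter step described above can be a simple exception query that flags rows which would fail on import. A minimal sketch, with invented table and field names (tblEntries, AccountCode, Amount):

```vba
' Build an exception query that surfaces bad rows before upload:
' missing account codes or non-numeric amounts.
' All object names here are hypothetical.
Public Sub MakeErrorFilter()
    Dim db As DAO.Database
    Set db = CurrentDb
    db.CreateQueryDef "qryEntryErrors", _
        "SELECT * FROM tblEntries " & _
        "WHERE AccountCode Is Null OR Not IsNumeric(Amount);"
End Sub
```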
 
Very interesting experience
What exactly do you mean by 'Access BE was 500mb'?
If you were using SQL Server, isn't that the back-end?
Sorry for the delay in my response...

The ERP started out with an Access BE (back-end) database. We had an FE (front-end) user interface that was distributed to each user's computer. The FE was linked to the Access BE, which was located on a dedicated computer. PS: I never used mapped drives in the table links; I always used the UNC method, as it was more reliable.
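Relinking to a UNC path rather than a mapped drive letter can be scripted with DAO. A minimal sketch, assuming a hypothetical share path (the \\FileServer location and file name are invented):

```vba
' Point every linked (non-ODBC) Access table at the BE via a UNC path
' instead of a mapped drive letter. The share path is hypothetical.
Public Sub RelinkToUNC()
    Const BE_PATH As String = "\\FileServer\Data\Backend.accdb"
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef
    Set db = CurrentDb
    For Each tdf In db.TableDefs
        ' Linked Access tables carry a ";DATABASE=..." connect string.
        If Len(tdf.Connect) > 0 And Left$(tdf.Connect, 5) <> "ODBC;" Then
            tdf.Connect = ";DATABASE=" & BE_PATH
            tdf.RefreshLink
        End If
    Next tdf
End Sub
```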

We migrated the Access BE to SQL Server when it was 500 MB. Even at 500 MB, the Access BE was very manageable; it was only once more than 10 employees were connecting to it that we began to have issues. After migrating to SQL Server, we no longer had that issue, and the system was then expanded to the many branches.
 
I have been slowly growing a custom CRM at work, which now has over 200 tables and over 300 forms across all the different apps I have made.

I use custom ribbons and have some code I am very proud of to do tasks for us. My best one of late is product BOMs, which works out cost price, labour, price breaks, and margins, and I can export this into our 3rd-party (CIM50) manufacturing software.

The main use of it is for customer quotes, and it has grown significantly since I picked up an old Access '95 manual when I was bored one day back in 2003.

We have 15+ concurrent users. It ties into Sage Accounts and CIM50. I would be more than happy for anyone who does Access as their main job to have a look at it and see where I have gone wrong in making it work.

~Matt
 
200 tables sounds like a lot. Do you have PKs and relationships defined and RI enforced? If you have a problem with the app, you need to be specific regarding where you need the help.
 
200 tables sounds like a lot. Do you have PKs and relationships defined and RI enforced? If you have a problem with the app, you need to be specific regarding where you need the help.
Hi Pat,

There is no issue with my database. I was only asking if anyone would like to see it and offer me ways to improve it further.

~Matt
 
Well, the difference is that if you use Access you can do whatever you want. If you use a third-party product, you can only do whatever they allow you to do within the interface.

In point of fact, Access is so bloody powerful that a lot of the code we write is explicitly there to prevent a user from doing whatever they want, including the developer. In Access (or any other database system, actually) it's easily possible to completely trash your own data with an injudicious command (such as updating a field to a new value without specifying a particular row), and there's no undo.
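The classic injudicious command is an UPDATE with no WHERE clause. A minimal sketch, with invented table and field names (tblCustomers, Status, CustomerID):

```vba
' Intended: deactivate a single customer.
' All object names here are hypothetical.
Public Sub DeactivateCustomer()
    CurrentDb.Execute "UPDATE tblCustomers SET Status = 'Inactive' " & _
                      "WHERE CustomerID = 42", dbFailOnError
    ' Omit the WHERE clause and the same statement silently rewrites
    ' EVERY row in the table, with no undo.
End Sub
```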

So the developer provides a set of facilities that are likely to be required. If clients have new requirements, then the developer provides additional facilities, or deals with their requirements as a one-off process.

@MattBaldry, I have to say 200 tables sounds like a lot for only 300 forms. My biggest projects have around 100 tables, but thousands of queries, hundreds of forms and reports, and a menu system with hundreds of choices.
 
My largest database has 4600 forms, 3200 reports, and 4100 tables. A few tables have between 100 and 200 fields. Most of the forms and reports have a custom graphic and many of them have text-to-speech. It is a little over 2 GB. It doesn't have a lot of data. And there is very little code.
 
My largest database has 4600 forms, 3200 reports, and 4100 tables. A few tables have between 100 and 200 fields. Most of the forms and reports have a custom graphic and many of them have text-to-speech. It is a little over 2 GB. It doesn't have a lot of data. And there is very little code.

Having that many forms and reports is not necessarily out of the question, but having 4100 tables, and that many fields in some of them, is usually a flare-lit signal that something might be a bit off. Since you said it is over 2 GB, you obviously have one of two cases: (a) an independent SQL engine as the back-end, or (b) more than one native Access back-end with disjoint table spaces.
 
No sir.
I don't touch SQL. I have one Access file that is not split. I frequently run C&R. I am not sure why the database has not crashed; perhaps the 2 GB figure is not real. I have been teetering on it for years. I back up the application every few days, just in case.
 
No sir.
I don't touch SQL. I have one Access file that is not split. I frequently run C&R. I am not sure why the database has not crashed; perhaps the 2 GB figure is not real. I have been teetering on it for years. I back up the application every few days, just in case.
The regular backup is wise. The question of file size actually depends on how you do the mapping AND how you do the size estimates. Windows is not always consistent when reporting file sizes, because the units make a difference. If that is a split front-end/back-end case and the TOTAL is bigger than 2 GB but neither component is that big, then you could make it work. The problem is that the first time you actually touch the REAL 2 GB limit is the day that Compact & Repair stops working.

Given the size and complexity of your project, I sincerely hope you have studied normalization, because while I cannot accuse you of anything at all, I will merely say that the sizes you quoted for a native Access database are suspicious.
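One cheap safeguard against silently creeping up on that hard limit is a size check at startup. A minimal sketch; the 1.8 GB warning threshold is an assumption chosen for headroom, not a documented cutoff:

```vba
' Warn before the hard 2 GB file-size limit is reached.
' The 1.8 GB threshold below is an arbitrary safety margin.
Public Sub CheckSize()
    Dim bytes As Long   ' FileLen returns Long, sufficient below 2 GB
    bytes = FileLen(CurrentDb.Name)
    If bytes > 1.8 * 1024 ^ 3 Then
        MsgBox "Database is nearing the 2 GB limit. Compact or split now."
    End If
End Sub
```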
 
The regular backup is wise. The question of file size actually depends on how you do the mapping AND how you do the size estimates. Windows is not always consistent when reporting file sizes, because the units make a difference. If that is a split front-end/back-end case and the TOTAL is bigger than 2 GB but neither component is that big, then you could make it work. The problem is that the first time you actually touch the REAL 2 GB limit is the day that Compact & Repair stops working.

Given the size and complexity of your project, I sincerely hope you have studied normalization, because while I cannot accuse you of anything at all, I will merely say that the sizes you quoted for a native Access database are suspicious.
The C&R will actually stop working well before the 2 GB mark; it will probably choke closer to 1.85 GB.

As for 4100 tables: that is likely a lot of temporary or analytical tables that need to be removed. The Manufacturing ERP that I developed, which was highly normalized, had just shy of 100 tables. It did everything from automated inventory and all cost methods to sales, connected to AutoCAD and MasterCam, and handled annual reporting and budgeting. It also served over 80 branch plants. Back in the 90's, before we got our WAN, I would merge all 80 into 1 at month end. Then we placed the data on SQL Server 97. But before the migration to SQL Server, the data could only be in a database by itself (no forms, etc.), in other words a back-end. Migrating to SQL Server was a huge relief.

Very few ever get to this point in a legitimate application. In most cases where I have to recover client databases, it's because of total mismanagement of tables: data would be imported from flat files that were not normalized, and the data itself was never normalized, ending up in a lot of temporary tables that were never deleted or cleaned up after a process. Even if you develop an ETL process that requires several tables, you need to maintain a structure; at the very least, the data should be deleted from those tables when a process completes.
 
My largest database has 4600 forms, 3200 reports, and 4100 tables. A few tables have between 100 and 200 fields. Most of the forms and reports have a custom graphic and many of them have text-to-speech. It is a little over 2 GB. It doesn't have a lot of data. And there is very little code.
That's impressive. It also must be a nightmare to maintain.

I would like to know if it runs with ease on low-end computers. If not, then maybe separate it across multiple specialized front-ends and back-ends. See if you have repeating data models; those should be normalized. If you have many forms that look the same and the only thing that changes is the graphic, then make your graphics (are they charts?) appear dynamically according to the form's purpose. I can't imagine creating 4600 different forms.

BTW, what do you mean they have text-to-speech? Do your users need to listen to the app?
 
I actually had a text-to-speech option in my big Navy database. Took me ONE VERSION to make the voice optional. (By popular demand, with torches and pitchforks assembling near my desk.)
 
My largest database has 4600 forms, 3200 reports, and 4100 tables.
There is something seriously wrong with this picture. Maybe you should take a look at how to normalize the tables. Then using queries with arguments will cut down on the forms/reports significantly.

I'm guessing that this is a single-user application. Even so, it should be split, sooner rather than later, before you cross the line and lose the database. Once it is split, you can start working on normalization.
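For what it's worth, a "query with arguments" can be as simple as filtering one shared report at open time instead of cloning it per branch or per customer. A minimal sketch with invented names (rptSales, BranchID):

```vba
' One report plus a run-time filter can replace dozens of
' near-identical copies. "rptSales" and "BranchID" are hypothetical.
Public Sub OpenBranchReport(ByVal BranchID As Long)
    ' The WhereCondition argument filters the report's record source.
    DoCmd.OpenReport "rptSales", acViewPreview, , "BranchID = " & BranchID
End Sub
```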
 
