I could reduce the number of rows necessary from 313,000 to 10,793 if I make the main table hold 83 fields.
The problem might then become an issue of record size. How big would a single record be after that reduction to 83 fields? If your solution is to make the table have more fields and fewer rows, but the result is still the same number of characters, you probably won't be helping yourself.
Simple math says that if you have 300k rows x 7 per month, that is 2.1 million rows per month, or just over 25 million rows per year. That's a lot of data for Access. Presuming some overhead for search keys and the like, I'd be surprised if you could shoe-horn more than about 500-600 bytes per record into a year's data before it would go POOF on you.
That is, presuming that you want to keep the data purely in Access rather than in one of the SQL solutions (SQL Server, MySQL, Oracle, to name a few), it probably would not take long to smash past the size barriers on Access databases.
The question that comes to mind immediately is whether the subsets mentioned earlier ever have to interact directly. If they do, you would probably overflow Access's capacity all too soon, which is why I mentioned other back ends.
If they do not significantly interact from one spreadsheet run to the next, you can (perhaps) use a Front-End, Back-End, 2nd Back-End arrangement, in which each set gets imported into an Access back end that you copy from a master template (presuming that the structures are always the same...)
It would work kind of like this (IF AND ONLY IF the datasets are separable).
1. Create a database file with your correct - but empty - table structure.
2. When you are going to start this import process, use a FileSystemObject to copy the template to a file with another name.
3. Have some linked tables dangling (and not referenced) in your main DB. Use VBA to dynamically link to the copy of the template you just made.
4. Do your import through the linked tables.
5. You can now use queries (that you could not touch before this point) to extract what you want.
6. When you are done with your big, honking back end, disconnect the linked tables.
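Steps 2, 3, and 6 could be sketched in VBA something like this. This is only a rough outline, not tested code; the paths and the table name tblImport are made-up placeholders, so substitute your own.

```vba
' Sketch of steps 2-3: copy the empty template, then re-point a
' dangling linked table at the fresh copy.
' TEMPLATE_PATH and "tblImport" are hypothetical names.
Public Sub PrepareBackEnd(ByVal strNewPath As String)
    Const TEMPLATE_PATH As String = "C:\Data\Template.accdb"

    ' Step 2: copy the empty template to a new file name.
    Dim fso As Object
    Set fso = CreateObject("Scripting.FileSystemObject")
    fso.CopyFile TEMPLATE_PATH, strNewPath

    ' Step 3: relink the dangling table to the copy you just made.
    Dim tdf As DAO.TableDef
    Set tdf = CurrentDb.TableDefs("tblImport")
    tdf.Connect = ";DATABASE=" & strNewPath
    tdf.RefreshLink
End Sub

' Step 6: when done with that back end, relink elsewhere or drop the link:
' CurrentDb.TableDefs.Delete "tblImport"
```

The FileSystemObject handles the file copy, and the TableDef's Connect property plus RefreshLink does the dynamic relinking, so you never have to delete and recreate the linked table itself.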
At this point, you have a separate table in a separate file for each iteration of this spreadsheet exercise, and you will avoid the issue of overflowing the maximum size of an Access database (about 2 GB). Further, because the data can be referenced as a BE file, you can have other databases that dynamically link to this big pile and do your data reduction that way, yet still have a perfectly good way to store the older data in logical units.
Further, you can have up to 16 databases open at once in Access. One has to be the FE file, but you could have a year's worth of data in 12 files, each holding one month at a time. The table names might be an issue, but you could probably generate some UNION queries to re-merge the data from the monthly tables.
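The re-merge query could look something like this (the table names here are made up; each SELECT would point at a table linked from one of the monthly BE files):

```sql
SELECT * FROM tblJanData
UNION ALL
SELECT * FROM tblFebData
UNION ALL
SELECT * FROM tblMarData;
```

Note the UNION ALL rather than plain UNION - UNION forces Access to sort and remove duplicates across the whole combined set, which on millions of rows you really don't want if you know the monthly tables don't overlap.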
As to performance: if you don't significantly reduce the amount of data you store, then whether you use a tall/thin or a short/fat layout really won't make THAT much of a difference. Whether fewer-but-wider rows process faster than more-but-narrower rows is a tough call, and we can't forget that VBA is interpreted, not compiled to true machine code.
The biggest table I've got is 600k rows, growing by probably 30k to 50k rows per month, but with archiving and removal of old data. Performance isn't that bad, but remember that if you have a lot of cross-month processing, 1.2 million of anything takes a long time to process.