Solved Auto Index

substitute your dcounts with a subquery
This is counterproductive at this point and will dramatically worsen performance.
I ran my own experiments on generating ranks, with measurement results like those in the attached picture.
Legend for the picture: Unterabfrage = subquery, QueryIncrement = own method, Dauer = duration
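For illustration only (not from the original post; the table and field names tblData, ID and Amount are assumptions), the two approaches being compared look roughly like this in Access SQL:

```sql
-- Rank via the DCount domain function: the domain function is re-evaluated for every row.
SELECT t.ID, t.Amount,
       DCount("*", "tblData", "Amount > " & [Amount]) + 1 AS RankNo
FROM tblData AS t;

-- The same rank via a correlated subquery: also re-evaluated for every row,
-- which is why both variants slow down drastically as the record count grows.
SELECT t.ID, t.Amount,
       (SELECT COUNT(*) FROM tblData AS x WHERE x.Amount > t.Amount) + 1 AS RankNo
FROM tblData AS t;
```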
 

Attachment: CreateRank.PNG
@Space Cowboy:
My personal rule of thumb for determining ranks this way is around 10,000 records. Anything above that leads to unacceptable performance, because the effort grows roughly quadratically with the number of records (every row's rank requires another scan of the table). With 1.7 million records, you're heading for disaster.

The query in #16 looks logical, but it will not be usable in this way given the amount of data.

My tip: do it completely differently.

1) Which backend are you using? Other database management systems such as SQL Server offer more efficient and powerful query execution than Access itself. If you need performance, let the right machine do the work for you.

2) If you are limited to Access: I would split the tasks, save intermediate results and then merge them, i.e. work with temporary tables.
The main goal would be the best possible data reduction for a reasonably suitable ranking.

3) As already mentioned, ranking via queries alone is inherently expensive and therefore problematic. Although many things in a database are better solved with queries, VBA has its place here: with well pre-sorted records, such ranks can be assigned in a recordset in a single pass, relatively easily and quite quickly.
So you could combine queries and VBA in a suitable procedure.
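A minimal VBA sketch of that single-pass idea (not from the original posts; the table name tblData and the fields Amount and RankNo are assumptions, and tblData is expected to have a RankNo field to receive the result):

```vba
' Assign ranks in one pass over records pre-sorted by the ranking criterion.
' Sketch only: assumes tblData has a numeric Amount field and a Long RankNo field.
Public Sub AssignRanks()
    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Dim lngRank As Long
    Dim lngCounter As Long
    Dim varPrev As Variant

    Set db = CurrentDb
    Set rs = db.OpenRecordset( _
        "SELECT Amount, RankNo FROM tblData ORDER BY Amount DESC", dbOpenDynaset)

    varPrev = Null
    Do Until rs.EOF
        lngCounter = lngCounter + 1
        ' Equal values keep the same rank (ties share a rank, as with RANK)
        If IsNull(varPrev) Or rs!Amount <> varPrev Then
            lngRank = lngCounter
        End If
        rs.Edit
        rs!RankNo = lngRank
        rs.Update
        varPrev = rs!Amount
        rs.MoveNext
    Loop

    rs.Close
    Set rs = Nothing
    Set db = Nothing
End Sub
```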
 
Good Morning Good People,

Thank you all once again for your considered and helpful replies. I am in the process of rethinking the back-end data: breaking it down into my own database with a smaller, more appropriate data set and going from there.

One more quick question, if I may. When indexing fields in a table, is it a prerequisite that the values in the field are unique? Or do the indexes reference the primary key for that table in some way?
 
When indexing fields in a table, is it a prerequisite that the values in the field are unique?
No.
You can think of an index as being like the (much simplified) index of a thick book. The faster lookup speeds up comparative operations: filtering, joining, grouping, sorting and, in part, aggregation (determining min/max).

A unique index is a special form of index: a given value can appear only once in the table.

A primary key is a special form of unique index: NULL values are explicitly not permitted, and the physical sorting of the table data is aligned to the PK (after compacting the database).
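For illustration (not from the original post; the table and field names are assumptions), the three variants can be created with Access DDL roughly like this:

```sql
-- Plain index: duplicate values allowed, speeds up comparisons on CustomerName
CREATE INDEX idxCustomerName ON tblCustomers (CustomerName);

-- Unique index: each value may appear only once in the table
CREATE UNIQUE INDEX idxCustomerNo ON tblCustomers (CustomerNo);

-- Primary key: unique, and NULL not permitted
ALTER TABLE tblCustomers
    ADD CONSTRAINT pkCustomers PRIMARY KEY (CustomerID);
```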
 
That was incorrect advice.
What significantly better choice would you have for determining the ranking within an Access query?
The problem here is a correlated subquery in connection with a very large amount of data (systematic problem).
 
I think we're talking about two different things. I'm talking about the dashboard with a large number of individual cells, each of which has at least one domain function.
 
I think we're talking about two different things.
That will be the case, then. I was talking about the query in #16. There are only two rankings per row there, and the ranking alone will certainly not be practically feasible given the number of records. For small record counts (I said < 10,000) it is a correct query formulation.
 
@Space Cowboy
Here are some thoughts in summary of your questions:

(1) Your backend is Visual Foxpro, with lots of tables and lots of records. Structures and indexing are a bit unclear. I don't know if and what kind of help you can get from outside. A quick search on the Internet turned up practically nothing useful for me.
I have no control over the source data,
there have been two occasions in the last 10 years when a redesign was considered. On each occasion it was determined that it's too big to change and too big to fail.
I have had a quick look at a couple of tables and I get a warning saying that they are indexed but there is no identifiable index key.
However, this database is probably used and maintained by an essential application.

(2) When processing data via linked tables in Access, you come up against several significant limitations. One of the suggestions made was to carry out the processing using a more powerful database management system. You said in a message that you could get a SQL Server 2008. An alternative would be SQL Server Express, which is free and should have the current range of commands, but has limitations such as a 10GB database size.

(3) To use the power of SQL Server, you need to use T-SQL and get to grips with it. For your current requirements, there are ranking functions such as ROW_NUMBER, RANK, DENSE_RANK and NTILE. Access, in contrast, uses Jet SQL by default, a much simpler dialect.
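A brief T-SQL sketch of what such a ranking looks like with these window functions (for illustration; the table dbo.tblData and its columns ID and Amount are assumptions):

```sql
-- Window functions compute the rank in a single pass over the sorted data,
-- instead of re-scanning the table for every row.
SELECT ID,
       Amount,
       ROW_NUMBER() OVER (ORDER BY Amount DESC) AS RowNo,        -- 1, 2, 3, 4, ...
       RANK()       OVER (ORDER BY Amount DESC) AS RankNo,       -- ties share a rank, with gaps
       DENSE_RANK() OVER (ORDER BY Amount DESC) AS DenseRankNo,  -- ties share a rank, no gaps
       NTILE(4)     OVER (ORDER BY Amount DESC) AS Quartile      -- splits the rows into 4 buckets
FROM dbo.tblData;
```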

(4) Theoretically, you can make external data sources available in SQL Server via a linked server.
I can't say how this works with Visual Foxpro. It may be that direct linking is not possible and only an export from Visual Foxpro followed by an import into SQL Server will work.
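If the Visual FoxPro OLE DB provider (VFPOLEDB) happens to be available on the SQL Server machine, a linked server could in principle be set up roughly as below. This is an untested sketch under that assumption; as said, an export/import may be the more realistic route.

```sql
-- Sketch only: assumes the VFP OLE DB provider (VFPOLEDB) is installed on the SQL Server.
EXEC sp_addlinkedserver
     @server     = N'FOXPRO',
     @srvproduct = N'Visual FoxPro',
     @provider   = N'VFPOLEDB',
     @datasrc    = N'C:\Data\FoxproTables';   -- folder containing the .DBF files

-- Query a DBF through the linked server:
SELECT * FROM OPENQUERY(FOXPRO, 'SELECT * FROM orders');
```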

(5) Given the uncertainty about the actual structures in the Foxpro database, I would be inclined to create my own functional data model in SQL Server, with my own tables and known structures, and import the data from Foxpro into it.
With your own complete data model, you then need an interface through which the data from one structure is transferred to what is probably another structure.
Such an interface could, for example, consist of one update query and one append query for each target table of the data model. Something like this has worked for me with data models with 15 tables.
Such a data transfer would have to be carried out once (coupled in a transaction).
The question then arises of what to do with changed records, new records and new tables(?) in the Foxpro database. These should be importable via the same interface, and it would make sense to limit the transfer to new data only and to identify such data in the Foxpro database, e.g. using existing timestamps for creation and modification.
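As an illustration of such an interface (a sketch only; the table, column and timestamp names are assumptions), the per-table pair of an update query and an append query, limited to changed or new data, could look roughly like this in T-SQL:

```sql
DECLARE @LastTransfer datetime2 = '2024-01-01';  -- e.g. the time of the previous transfer

-- Update query: refresh rows that already exist in the target and were modified since then
UPDATE t
SET    t.CustomerName = s.CustomerName,
       t.Amount       = s.Amount
FROM   dbo.tblOrders AS t
INNER JOIN stage.Orders AS s ON s.OrderID = t.OrderID
WHERE  s.ModifiedAt > @LastTransfer;

-- Append query: insert rows that do not yet exist in the target
INSERT INTO dbo.tblOrders (OrderID, CustomerName, Amount)
SELECT s.OrderID, s.CustomerName, s.Amount
FROM   stage.Orders AS s
LEFT   JOIN dbo.tblOrders AS t ON t.OrderID = s.OrderID
WHERE  t.OrderID IS NULL
  AND  s.CreatedAt > @LastTransfer;
```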

Assuming that you can create this parallel database in SQL Server, you are well positioned for the future. There is a lot of help available for T-SQL and the use of SQL Server together with Access as a front end to solve all kinds of tasks.
There may also be an opportunity to completely replace the Foxpro database in the unknown application with the new database in SQL Server.

That was an outline of a possible strategy. The devil is in the details. If the company is serious about an important matter, you will probably not be left to tackle it alone.
 
@ebs17
Thanks for that Eberhard,
I have downloaded the server and had a look over it. My goodness it is daunting.
ACCESS is now looking like a comfy old Sofa. :)
 
I know, I had it about 30 years ago or more, when it first came out. It might still be buried in the loft somewhere.
 
So they are not VFP tables, just FoxPro tables?
 
You have to understand that my knowledge is limited; I am learning as I go.
I update the files each day, and they are stored as DBF, FPT and some CDX files, whatever that means.
What I have learned is that it is lightning fast: 2 million records load 20 times faster than Access can scroll to the bottom of the table.
 
DBF - data table
FPT - memo file (stores the memo/long text fields)
CDX - compound index file

Foxpro holds all these tables individually, not all in one file like Access does.
That would be the equivalent of a Visual Foxpro project, but files still exist individually.

Been a long while since I have used FoxPro (now VFP), and I only use it now and again for my music collection as it is way faster than Access.
Sadly I can barely remember the commands in VFP any more, as I have used Access for so long. :(

So you are now updating these files from within Access?

I would seriously consider getting the data into Access in its entirety, if possible, but Access has a 2GB limit, so another backend DB type would be required.
 
Give me your old FoxPro. :)

Thanks to all you guys' input, I have developed what seems to me a better (?), more efficient strategy. There is a whole mass of data that I am not really interested in for this particular project. I have streamlined the data down to half a dozen tables with the absolute minimum fields necessary and linked to them in a new DB. I have managed to get the largest table down to only 1.4 million records, and I am very pleased that the size is now 424 KB.
 
