Alternative to TOP, LAST, MAX in SQL query

dageci

New member
Hello guys,

I'm creating a desktop app in Access that retrieves a lot of data (millions of records) from the Azure cloud using a REST API.

I know that linking to SQL Server would be the solution, but I wanted to know if there is a way to speed up the query that gets the last record in an Access table.

TOP, LAST and MAX become slow once the number of records gets big.

I'm attaching the app. It's a small app that retrieves telemetry data from Microsoft Azure Application Insights into a local Access desktop app via the REST API. It's part of my project VBA Telemetry (VBATelemetry.com), which lets you monitor, log and track events, errors and metrics from VBA projects (Access, Excel) in real time.

Davor
 

Attachments

  • TrackingAppVBATelemetry1_Empty.accdb
612 KB

isladogs

MVP / VIP
Hello and welcome to AWF

I've just tested this (using the last 30 days) and, as you say, it's very slow.

If you only want the latest record, you could easily filter for the last day, or a smaller time range within the last day.
Looking at the results, these are spread throughout the 24-hour period, so you could filter for just the last hour or less - something like the sketch below.
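
A minimal sketch of that idea, assuming the table and field names (pageViews, timestampDT) used in the attached app:

Code:
SELECT TOP 1 pageViews.*
FROM pageViews
WHERE pageViews.timestampDT > DateAdd("h", -1, Now())
ORDER BY pageViews.timestampDT DESC;

With an index on timestampDT, the WHERE clause means the sort only touches a handful of rows instead of the whole table.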
BTW, some of the values in the second Timestamp field only have dates - no time.

I see you are using Tim Hall's JSON converter code.
Have you looked at my JSON Analyse & Transform for Access application (JATFA), which is available from http://www.mendipdatasystems.co.uk/json-analyse-transform/4594138310 ?
More info in this thread: https://www.access-programmers.co.uk/forums/showthread.php?t=295789

It's a commercial application, but a free 'cut-down' evaluation version is also available.
Currently it makes limited use of Tim Hall's code, but that can be slow.
I'm currently working on an update which has several enhancements (including removing Tim's code!)
 

gemma-the-husky

Super Moderator
Staff member
Can you not index the data in some way?

The concept of a last record is probably the issue. What defines last? Last added, or last for some other reason?
 

dageci

New member
Thanks guys for responding.

I'm not looking to get just one record, but all the records, because the Azure Application Insights retention period is 90 days (after 90 days the data is removed).
Because of this retention limit, and so that users can create their own reports and queries, I want this download app to let them pull the data down locally.

Linking to SQL Server with stored procedures would be a solution, but for users who don't have that much data, or who don't know how to set up SQL Server or simply don't want to, I'm thinking of having this little app use Access tables.
But with Access tables, once there are 100,000+ records the SQL TOP is very slow, and we need TOP to determine what data we already have in our database.

I can't just write the last record somewhere and work forward from that, because the data retrieved from Azure is not always the latest. Since telemetry is sent from client machines in chunks and at intervals, incoming data is sometimes older than data already in Azure.
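
For what it's worth, one way around the out-of-order problem is to key each row and append only rows you haven't already got, rather than tracking a "last record" at all. A sketch, assuming each telemetry record carries a unique identifier stored in a field called itemId, and that new batches land in a staging table first - both the field and the table pageViews_staging are illustrative, not part of the attached app:

Code:
INSERT INTO pageViews
SELECT s.*
FROM pageViews_staging AS s
LEFT JOIN pageViews AS p ON s.itemId = p.itemId
WHERE p.itemId IS NULL;

With itemId indexed in both tables, out-of-order batches stop mattering: rows already present are simply skipped.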

Colin:
Thanks for the JSON tip and your link, I will take a look.

Dave:
There is an index on the timestamp field, but it is still slow when there is a lot of data in the table.

Or maybe the only solution is to go with SQL Server linked to Access. Maybe someone has made a routine to do all of this programmatically.
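
For reference, linking a SQL Server table into Access can be done in a few lines of DAO. A sketch - the server, database and table names here are placeholders:

Code:
Public Sub LinkSqlServerTable()
    'Link a SQL Server table into the current Access database via DAO
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef

    Set db = CurrentDb
    Set tdf = db.CreateTableDef("pageViews_linked")
    tdf.Connect = "ODBC;Driver={SQL Server};Server=MyServer;Database=MyTelemetryDb;Trusted_Connection=Yes;"
    tdf.SourceTableName = "dbo.pageViews"
    db.TableDefs.Append tdf
End Sub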
 

isladogs

MVP / VIP
I think I can help you significantly speed up the processing of the data with a better parsing routine.
However, that doesn't answer your immediate question.

Some observations:
a) TimestampDT is already indexed, which helps with query speed
b) Times within 1 second of midnight are converted to datetime format losing the seconds - it seems to be rounding down
c) I'm sure you've already tried these queries, but for info:
Using a dataset for the last 30 days (approx 900k records), this query took about 7 seconds to run and returned 44 records - all with the same timestamp, since Access TOP returns ties:

Code:
SELECT TOP 1 pageViews.*
FROM pageViews
ORDER BY pageViews.timestampDT DESC;

This one took a fraction of a second and returned 1 record only (but, from what you said, it may not have the latest timestamp value):

Code:
SELECT TOP 1 pageViews.*
FROM pageViews
ORDER BY pageViews.ID DESC;
 

dageci

New member
Hm,
because I'm looping through every value before adding the record to a table, maybe I could create a Long or Double field, convert this timestamp into a milliseconds value and store that - what do you think?
It would be a very big number; would it speed things up, or is this not a good approach?
The catch is that when I convert the timestamp to a Date/Time field I lose the milliseconds, so converting to datetime and then to a number is, I think, not the solution.
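
A minimal sketch of what that conversion could look like, assuming the Azure timestamps arrive as ISO 8601 strings such as 2018-03-13T06:47:12.345Z (the function name is illustrative):

Code:
' Convert an ISO 8601 timestamp string to milliseconds (as a Double)
' since the Access date epoch (30 Dec 1899), keeping the milliseconds.
Public Function IsoToMs(ByVal isoTs As String) As Double
    Dim dt As Date
    Dim ms As Double
    Dim p As Long

    'Date and time portions, parsed locale-independently
    dt = DateSerial(Mid$(isoTs, 1, 4), Mid$(isoTs, 6, 2), Mid$(isoTs, 9, 2)) _
       + TimeSerial(Mid$(isoTs, 12, 2), Mid$(isoTs, 15, 2), Mid$(isoTs, 18, 2))

    'Fractional seconds, if present
    p = InStr(isoTs, ".")
    If p > 0 Then ms = Val("0." & Mid$(isoTs, p + 1, 3)) * 1000

    IsoToMs = CDbl(dt) * 86400000# + ms
End Function

Stored in an indexed Double field, that value sorts correctly and keeps the milliseconds.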
 

isladogs

MVP / VIP
dageci said:
Hm,
because I'm looping through every value before adding the record to a table, maybe I could create a Long or Double field, convert this timestamp into a milliseconds value and store that - what do you think?
It would be a very big number; would it speed things up, or is this not a good approach?
The catch is that when I convert the timestamp to a Date/Time field I lose the milliseconds, so converting to datetime and then to a number is, I think, not the solution.

Having tried this, it seems to loop through the data more than once.
I don't understand why you do that.
I suggest you download the data and save it as a JSON file, then process that file, importing each record in turn - no loops.
It should be easy enough to convert the timestamp value on the fly as you do so.
This is the approach I use in my app.

A Double number datatype will handle that without issues.

It is possible to get times in milliseconds in Access.
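
A minimal sketch of one way to do this, using the timeGetTime API from winmm.dll (an illustration, not necessarily the method JATFA uses):

Code:
'Millisecond timer via the Windows multimedia API
#If VBA7 Then
    Private Declare PtrSafe Function timeGetTime Lib "winmm.dll" () As Long
#Else
    Private Declare Function timeGetTime Lib "winmm.dll" () As Long
#End If

Public Sub TimeSomething()
    Dim msStart As Long

    msStart = timeGetTime()
    ' ... the work being timed goes here ...
    Debug.Print "Elapsed: " & (timeGetTime() - msStart) & " ms"
End Sub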

For example, this shows the time to load some of the larger JSON files in my JATFA app:

[screenshot: Capture.PNG - see attachment]
My reaction times example app uses the same idea to measure times in milliseconds, albeit with a precision of about 16 milliseconds due to system timer limitations. Click on the link below to find out more.
 

Attachments

  • Capture.PNG
    34.5 KB

dageci

New member
The looping is because I'm going through the data day by day - if the user wants all the data (90 days), that would be a huge file. 90 days of this sample is around 3 million records, and in real situations there would be a lot more data.
Do you think I should get it all at once? I haven't done anything like this in the past. Won't the file be too large?
And how do I show some progress while downloading the file? Right now the app freezes.
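
On the freezing: one standard Access pattern is the built-in status bar progress meter combined with DoEvents inside the loop. A sketch, with the day-by-day loop structure assumed from the description above:

Code:
Public Sub DownloadWithProgress()
    Dim dayNum As Long
    Const TOTAL_DAYS As Long = 90

    SysCmd acSysCmdInitMeter, "Downloading telemetry...", TOTAL_DAYS
    For dayNum = 1 To TOTAL_DAYS
        ' ... retrieve and store one day's data here ...
        SysCmd acSysCmdUpdateMeter, dayNum
        DoEvents    'let Access repaint and stay responsive
    Next dayNum
    SysCmd acSysCmdRemoveMeter
End Sub

Note that a single synchronous HTTP call still blocks for its own duration; the meter and DoEvents only take effect between chunks.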
 

isladogs

MVP / VIP
EDIT - our posts crossed, but the below seems even more relevant after your response.

Just had another quick look at this.
I added a short section of code to GetDataForPageViews to let me save the JSON output to a file, 'Telemetry.json':

Code:
...

    Response = httpReq.ResponseText
    JSONText = Response

    '==========================================
    'CR - save the JSON output to file

    Dim stm As Object
    Dim strFilePath As String
    strFilePath = CurrentProject.Path & "\Telemetry.json"

    'save to file via a late-bound ADO stream
    Set stm = CreateObject("ADODB.Stream")
    stm.Type = 2    '2 = adTypeText
    stm.Open

    stm.WriteText httpReq.ResponseText
    stm.SaveToFile strFilePath, 2    '2 = adSaveCreateOverWrite

    stm.Close
    Set stm = Nothing
    '=============================================

    Set Parsed = JsonConverter.ParseJson(JSONText)
...

The file is inevitably large - approx 21MB for 5 days, but only slightly larger for 90 days!

I can read it into my JATFA app very quickly. Below is the 5-day version - it took just over a second!

[screenshot: TelemetryJSON.jpg - see attachment]

However, I haven't tried parsing it and saving the data.
I may do so later, if you have no objection to me using this.
Perhaps continue this discussion via PM or email?
 

Attachments

  • TelemetryJSON.jpg
    104.8 KB

isladogs

MVP / VIP
Just read in the 90-day version - first record on 13 Dec 2017.
File size 21.92MB, total characters = 224,446,472.
It took slightly less than 1.5 seconds.

The reason it's so fast is that I no longer use Tim Hall's code for viewing files.
Although that code has many strengths, it struggles with anything bigger than a few MB.
It would have frozen if I'd tried to use it to view this file.

It also can't cope with certain file structures, such as those used in Google Maps.

I still use that code for a small but important part of my app, but it's causing a bottleneck for certain files.
For that reason, I'm planning to revise those parts of my app to do without Tim's code entirely, if at all possible.
 

Attachments

  • TelemtryJSON_90days.jpg
    105.2 KB

dageci

New member
Colin, sure you can use it.
And of course, we can continue this discussion via email.
My email is davor.geci AT gmail.com
Thanks
 
