I've been working on a project involving random events and calculating time intervals to millisecond accuracy.
So I decided to combine both ideas in a simple application for measuring reaction times. I thought it might be fun for others to play with.
The idea is to respond to an event that occurs after a random delay and in a random position on the screen.
It uses the system clock to determine times in milliseconds.
There are 3 levels of difficulty: easy, moderate and hard.
Typically I can manage about 0.7 seconds on the easy level - see if you can do better!
The other two levels should take longer: somewhere in the range of 2-12 seconds.
The app opens in a floating window on the desktop with the navigation pane and ribbon hidden.
To view the code, hold down the SHIFT key as you open the app.
The app works in both 32-bit and 64-bit Access and can also be run on a tablet.
NOTE: the system time is updated about 60 times per second, so the quoted accuracy is about 16 milliseconds.
Good enough for this example in my view.
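For anyone curious how the system clock can be read to millisecond resolution in VBA, here is a minimal sketch. This is not the app's actual code - just one common approach, using the Windows GetSystemTime API (the conditional compilation keeps it working in both 32-bit and 64-bit Access):

```vba
' Sketch: read the Windows system clock to millisecond resolution.
' Remember the clock itself only ticks ~60 times/second (~16 ms steps).
Private Type SYSTEMTIME
    wYear As Integer
    wMonth As Integer
    wDayOfWeek As Integer
    wDay As Integer
    wHour As Integer
    wMinute As Integer
    wSecond As Integer
    wMilliseconds As Integer
End Type

#If VBA7 Then
    Private Declare PtrSafe Sub GetSystemTime Lib "kernel32" (lpSystemTime As SYSTEMTIME)
#Else
    Private Declare Sub GetSystemTime Lib "kernel32" (lpSystemTime As SYSTEMTIME)
#End If

' Milliseconds since midnight (UTC) - fine for measuring a short interval
Public Function MsSinceMidnight() As Double
    Dim st As SYSTEMTIME
    GetSystemTime st
    MsSinceMidnight = ((st.wHour * 60# + st.wMinute) * 60# + st.wSecond) * 1000# _
        + st.wMilliseconds
End Function
```

Take the difference between two calls to get an elapsed time in milliseconds.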
There are other ways of measuring time with precision.
For example, you can use the GetTickCount API function, but that is also based on the system timer.
The built-in Timer function gives times to centisecond accuracy, though with the same limitation.
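As a rough illustration (an assumption about usage, not code from the app), GetTickCount returns the number of milliseconds since Windows started, so an interval is just the difference between two readings:

```vba
' Sketch: elapsed time via GetTickCount (same ~16 ms granularity,
' and the Long value wraps around after about 25 days of uptime).
#If VBA7 Then
    Private Declare PtrSafe Function GetTickCount Lib "kernel32" () As Long
#Else
    Private Declare Function GetTickCount Lib "kernel32" () As Long
#End If

Public Sub DemoTickCount()
    Dim t0 As Long
    t0 = GetTickCount
    ' ... wait for the event being timed ...
    Debug.Print (GetTickCount - t0) & " ms elapsed"
End Sub
```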
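A hedged sketch of the Timer approach (again, illustrative only): Timer returns the seconds since midnight as a Single, so subtracting a stored start value gives the elapsed time to roughly 0.01 s:

```vba
' Sketch: reaction timing with the built-in Timer function.
' Timer = seconds since midnight, ~centisecond resolution.
Public Sub DemoTimer()
    Dim tStart As Single, tElapsed As Single
    tStart = Timer
    ' ... wait for the user to respond ...
    tElapsed = Timer - tStart          ' elapsed seconds
    MsgBox Format(tElapsed, "0.00") & " s"
End Sub
```

Note that a run spanning midnight would give a negative result, so this suits short intervals only.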
For mission-critical time differences, you could use the StopClock class.
I believe that is accurate to nanoseconds, though I've not tested it.
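Independently of the StopClock class (whose internals I'm not describing here), the usual route to high-resolution timing on Windows is the QueryPerformanceCounter API, which is not tied to the ~60 Hz system clock. A minimal sketch, using the Currency type to hold the 64-bit counter:

```vba
' Sketch: high-resolution elapsed time via QueryPerformanceCounter.
' The Currency type's fixed scaling cancels out in the division.
#If VBA7 Then
    Private Declare PtrSafe Function QueryPerformanceCounter Lib "kernel32" _
        (lpPerformanceCount As Currency) As Long
    Private Declare PtrSafe Function QueryPerformanceFrequency Lib "kernel32" _
        (lpFrequency As Currency) As Long
#Else
    Private Declare Function QueryPerformanceCounter Lib "kernel32" _
        (lpPerformanceCount As Currency) As Long
    Private Declare Function QueryPerformanceFrequency Lib "kernel32" _
        (lpFrequency As Currency) As Long
#End If

' Current counter reading in seconds; subtract two readings for an interval
Public Function HiResSeconds() As Double
    Dim counts As Currency, freq As Currency
    QueryPerformanceFrequency freq
    QueryPerformanceCounter counts
    HiResSeconds = counts / freq
End Function
```

Resolution is typically well under a microsecond, far finer than any human reaction time needs.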
For any feedback, please send me a PM or email me using the link below.