Hello. In this short article I will show you how to build a timer with acceptable accuracy in a Corona SDK application, and also look at the problems of the standard timer.
The article solves the following problem: make an application that shows the time elapsed since the application was started, updated once per second.
1. Obvious standard solution.
Create a standard timer and display its ticks.
local timeout = 1000
timer.performWithDelay( timeout, function(event)
    print(event.count)
end, 0 )
Everything seems obvious, but this solution has a few peculiarities:
- First, the standard timer works correctly and accurately only with timeouts of about 300-400 ms and above; everything below that lags very significantly. Our case is fine, since the timeout is above these limits. The minimum possible tick period (if you specify 1 ms) is 1/fps, i.e. 16.(6) ms at 60 fps or 33.(3) ms at 30 fps.
- Second, even at this relatively accurate period, every tick carries a random error of about 5-10 ms, so over an hour an error of 15-30 seconds accumulates. This can be partially compensated by subtracting 5 ms from the delay passed when creating the timer, i.e. specifying 995 instead of 1000.
- Third, if your application has small freezes caused by other running code or unstable device behavior, those freezes also add to the timer's accumulated lag.
- Fourth, if you minimize the application for a while and then bring it back, all the time the application spent minimized is excluded from the count of seconds it has been running.
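The partial compensation mentioned in the second point can be sketched like this (995 is the empirical value from the text, not a universal constant):

```lua
-- subtract ~5 ms from the nominal period to offset the average per-tick error
local timeout = 1000 - 5
timer.performWithDelay( timeout, function(event)
    print(event.count)  -- still drifts, just more slowly
end, 0 )
```

This only reduces the average drift; the per-tick randomness and freeze-induced lag remain.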
2. A better solution.
To fix the problems of the previous method, we use the following construction: a timer running at the maximum possible rate, while the actual tick is determined by a calculation based on the system's precise time source.
local timeout = 1000
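The original listing appears truncated here, so below is a minimal sketch of the technique just described, not the author's exact code. It assumes Corona's `system.getTimer()` as the precise time source; the handler name `onSecond` is illustrative:

```lua
local timeout = 1000                     -- desired tick period, ms
local start_time = system.getTimer()     -- precise system time at launch, ms
local tick_count = 0                     -- whole periods elapsed so far

-- illustrative handler for each "real" one-second tick
local function onSecond(count)
    print("seconds elapsed: " .. count)
end

-- run the timer as fast as possible; the real tick is derived from elapsed time
timer.performWithDelay(1, function()
    local elapsed = system.getTimer() - start_time
    -- fire a tick each time the elapsed time crosses the next period boundary
    while elapsed >= (tick_count + 1) * timeout do
        tick_count = tick_count + 1
        onSecond(tick_count)
    end
end, 0)
```

Because each tick is computed from the absolute elapsed time rather than counted, the per-tick jitter does not accumulate.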
Let us analyze this method. Although we specify a tick frequency of 1 ms, as described above the real ticks occur every 16 (or 33) to 50 ms, and this determines the maximum possible error of the method: it varies in the range of 0..50 ms from tick to tick. The tick intervals are therefore less stable than in the first method, but this error does not accumulate over any distance (even years): a year from now, the next tick will still have an error within the same limits relative to the very first tick.
3. Verification of results
Here is an example of how to verify all of the above. The source below prints, once per second, the current time elapsed since the application started for each of the two timers (separately) and shows the error accumulated during operation.
local timeout = 1000
local socket = require("socket")
local start_time = socket.gettime() * 1000
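The rest of this listing also appears truncated; here is a self-contained sketch of a comparison along the lines described (it repeats the setup above and assumes LuaSocket's `socket.gettime()` as the reference clock; names like `std_count` are illustrative):

```lua
local timeout = 1000
local socket = require("socket")
local start_time = socket.gettime() * 1000   -- reference wall-clock time, ms

-- 1) standard timer: simply counts its own ticks
local std_count = 0
timer.performWithDelay(timeout, function()
    std_count = std_count + 1
    local real = socket.gettime() * 1000 - start_time
    print(string.format("standard:  %d s, accumulated error: %.0f ms",
        std_count, real - std_count * timeout))
end, 0)

-- 2) corrected timer: derives ticks from the reference clock
local acc_count = 0
timer.performWithDelay(1, function()
    local real = socket.gettime() * 1000 - start_time
    while real >= (acc_count + 1) * timeout do
        acc_count = acc_count + 1
        print(string.format("corrected: %d s, error this tick: %.0f ms",
            acc_count, real - acc_count * timeout))
    end
end, 0)
```

Run long enough, the first timer's error grows without bound, while the second timer's error stays bounded by one frame interval plus any momentary freeze.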
Good luck to all!