I'm guessing that declaring dt globally doesn't work because of something to do with the order of how classes and their properties are initialized under the hood. Probably the same reason a global keyconfig crashes but a handle doesn't.
I can imagine random computer shenanigans causing enough lag to reduce the effective framerate to 50, but I am just as lost as to where the extra frames come from.
If the wait function rounds to the nearest millisecond, or truncates anything past the decimal, then it's going to give different results than we'd expect. However, your first test was close enough that I'm not sure how much that could explain it.
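To put numbers on the truncation idea: if the wait call drops everything past the millisecond, a 60fps frame becomes a flat 16ms wait, and more frames fit into the same 10 seconds. A rough illustration (Python standing in for whatever the engine actually uses; the truncation behavior is the assumption being tested):

```python
# Hypothetical: what happens if wait() truncates its argument to whole ms.
target_dt = 1 / 60                        # ~16.667ms per frame at 60fps
truncated_ms = int(target_dt * 1000)      # 16ms if the fraction is dropped

frames_expected = round(10 / target_dt)           # frames in 10s at the true dt
frames_if_truncated = int(10_000 / truncated_ms)  # frames in 10s at a flat 16ms

print(frames_expected, frames_if_truncated)  # 600 625
```

That's a 25-frame ceiling on the error from truncation alone, which is much bigger than the 2-10 extras observed, so truncation can't be the whole story either.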
One thing I found when I made the clock class is that there was consistently a millisecond of difference between what the timers reported and what was sent to wait, even when there were no other sources of lag. Because of this, the tick method tries to compensate, which could account for some of the discrepancy.
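For reference, that compensation amounts to something like the following sketch (Python approximation; the actual clock class, its timer, and the wait call are the engine's, and the flat 1ms adjustment is the assumption described above):

```python
import time

class Clock:
    """Minimal sketch of a fixed-timestep clock with a 1ms wait compensation."""

    def __init__(self, dt):
        self.dt = dt                      # target frame length in seconds
        # Note: the timer starts at creation, not at the start of the loop,
        # which is exactly the extra-first-frame issue discussed later.
        self.last = time.perf_counter()

    def tick(self):
        elapsed = time.perf_counter() - self.last
        # Subtract 1ms to compensate for the wait consistently overshooting.
        remaining = self.dt - elapsed - 0.001
        if remaining > 0:
            time.sleep(remaining)
        self.last = time.perf_counter()
```

The downside of a fixed 1ms fudge factor is that when the real overshoot isn't exactly 1ms, each frame ends slightly short or long, and the error compounds over hundreds of frames.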
I copied the code you provided and ran it, and I got similar results, with 2-10 extra frames. Thinking it might be a rounding issue, I tried again at 50fps (dt = 0.02). There were a few huge lag spikes in the log (10.1 to 10.6 seconds of realtime, instead of 10 plus a couple centiseconds), presumably because I was reading the previous log at the time, trying to get an idea of where the discrepancy was coming from. The results never went higher than 502, but sometimes dropped as low as 495. The low values can be explained by lag. I can account for one of the extra frames because the tick comes before the log, but I'm less sure about the other.
I should also mention that 502 frames was the most common result, with all the others being 495-499, presumably due to random lag. Note also that the amount of lag that made it into the realtime results has no correlation with the frame count, since the offending lag has to land in the final frame to show up in the realtime result.
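For anyone who wants to reproduce this, the test boils down to something like the loop below (a Python approximation; the original used the engine's clock, wait, and log rather than `time.sleep`):

```python
import time

def count_frames(dt, duration=10.0):
    """Naive fixed-timestep loop: sleep dt each frame and count how many
    frames fit into `duration` seconds of realtime."""
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration:
        frames += 1          # tick comes before the log/check, as noted above
        time.sleep(dt)
    return frames

# count_frames(0.02) at 50fps should be ~500; the thread's results were 495-502.
```

Because the sleep here doesn't account for the time the loop body itself takes, this version tends to run slightly slow rather than fast, which is part of why the extra frames are surprising.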
Even if the 1ms compensation and truncating aren't to blame for the high results for 60fps, it's worth noting that there will be rounding errors regardless, because floating point arithmetic is imprecise for most fractions. It still seems odd that this would result in extra frames, rather than fewer.
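The imprecision is easy to demonstrate: 1/60 of a second has no exact binary floating point representation (its denominator isn't a power of two), and repeated addition of inexact values drifts. Python shown, but any IEEE-754 doubles behave the same:

```python
from fractions import Fraction

# The double nearest to 1/60 is not exactly 1/60 (15 divides the denominator,
# so it can't be a dyadic rational):
print(Fraction(1, 60) == Fraction(1 / 60))   # False

# Classic demonstration that the per-step errors accumulate:
print(sum([0.1] * 10) == 1.0)                # False
print(sum([0.1] * 10))                       # 0.9999999999999999
```

So even a perfect wait function fed `dt = 1/60` is being asked to wait a slightly wrong amount every frame; the open question is why the drift lands on the "extra frames" side.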
Simply put, without being able to rely on all players having predictable amounts of lag, there will always be some small discrepancies. Given the same initial state, you'll have the same final state, even if one machine takes a fraction of a second longer to get there. The problem comes when those discrepancies add up, and it becomes necessary to push the players back toward synchronicity. Sending the frame count along with player input might help, but it isn't going to solve the problem by itself.
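The frame-stamped-input idea looks roughly like this (a toy sketch; the message format and simulation are made up for illustration):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class InputEvent:
    frame: int    # simulation frame this input applies to
    player: int
    action: str   # e.g. "left" or "right"

def simulate(inputs, frames):
    """Deterministic toy simulation. Every client runs this with the same
    stamped inputs and must reach the same state, no matter how long each
    frame took in realtime on that machine."""
    pos = {1: 0, 2: 0}                     # player positions
    by_frame = {}
    for e in inputs:
        by_frame.setdefault(e.frame, []).append(e)
    for f in range(frames):
        for e in by_frame.get(f, []):      # apply inputs stamped for frame f
            pos[e.player] += 1 if e.action == "right" else -1
    return pos

inputs = [InputEvent(0, 1, "right"), InputEvent(3, 2, "left")]
print(simulate(inputs, 10))    # {1: 1, 2: -1}
```

The frame stamp keeps the clients deterministic, but it doesn't tell you what to do when one client's stamped input arrives after the other clients have already simulated past that frame; that's the part that still needs a real sync strategy.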
The discrepancies you're finding in frame count, though, are mostly due to things beyond our control: floating point limitations, whatever Windows is doing, the state of the hardware, and so on. There will be lag over the network, too, which is probably going to be far more troublesome.
I've never had a solid test case for the most common solutions to the lag-induced client discrepancy issue, so I can't promise what the best way to deal with it is. If you force the clients to agree after every network event, there's a good chance it will result in jank and other such glitches. If the lag is extreme enough, that might happen anyway (it happens in mainstream multiplayer games), but it can probably be minimized.
[edit] I think I've figured out the other extra frame in my 50fps test. The realtime timer restarts before the loop, but the clock's timer does not, so any delay between creating the clock and starting the loop counts toward the first frame. So 502 frames in 10 seconds makes sense. But you really shouldn't be getting more than 2 extra frames at any framerate, so I think it's probably mostly floating point imprecision for certain numbers. If it's waiting 16ms instead of 16.6[...], the maximum frame count in 10s should be 10000/16 = 625, or 627 counting the two extra frames from the timers. The effective dt for 610 frames is 16.3934ms, but since we've established two extra frames due to the timers doing different things, it's really more like 10000/608 = 16.447ms. I'm not sure what causes that, but given how frustrating lag can be when playing, and how few the extra frames are (not even up to 61fps), it's probably fine to err on the side of convenience, since it doesn't affect the need to account for lag either way.[/edit]
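For anyone checking the arithmetic in the edit:

```python
# Frame-count arithmetic from the edit above.
max_frames_16ms = 10_000 / 16              # 625.0 frames fit in 10s at a flat 16ms wait
with_timer_frames = max_frames_16ms + 2    # 627.0, adding the two timer-related frames

dt_610 = 10_000 / 610   # ~16.3934ms effective dt if 610 frames ran in 10s
dt_608 = 10_000 / 608   # ~16.447ms after discounting the two timer-related frames

print(max_frames_16ms, with_timer_frames)  # 625.0 627.0
```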