I'm working on an application that captures a monitor's screen in real time, encodes it, sends it over Ethernet, decodes it, and then displays the captured monitor in a viewer application.
To measure latency, I put the decoder application on the same monitor that is being captured and open a timer application next to it. When I start the timer, I can compare the reading on the main instance of the timer with the (delayed) copy shown inside the decoder application; the difference between the two readings is the end-to-end latency.
What's weird is that the result depends on how I freeze the two readings for comparison: if I take a picture of the monitor with a camera, I get one latency measurement (almost always ~100 ms), but if I take a Print Screen of the monitor, the measured latency is much lower (~30-60 ms).
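For context, the Print Screen-style measurement can be automated with something like the sketch below. This is only illustrative: it assumes the `mss` screenshot library, the primary monitor, and an output filename I made up; any software screen-capture API would do the same job.

```python
# Rough sketch: grab a software screenshot containing both timer readouts,
# so the two values can be compared afterwards (by eye or with OCR).
# Assumes `pip install mss`; the monitor index and filename are illustrative.
import time

import mss
import mss.tools

with mss.mss() as sct:
    monitor = sct.monitors[1]          # sct.monitors[0] is the full virtual screen
    t0 = time.perf_counter()
    shot = sct.grab(monitor)           # software capture, like Print Screen
    grab_ms = (time.perf_counter() - t0) * 1000.0

    # Save the frame; the reference timer and the decoded copy are both in it.
    mss.tools.to_png(shot.rgb, shot.size, output="latency_frame.png")
    print(f"software grab took {grab_ms:.1f} ms")
```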
Why is that? How does Print Screen work? Why would it result in a 40+ ms difference? Which latency measurement should I trust?