For rendered stuff, a higher frame rate typically does make for smoother motion, even at rates well beyond what the eye can resolve as individual frames, and the reason is motion blur.
So, recorded video works fine at relatively low frame rates…but the camera is also set up to record a relatively long exposure, something like a thirtieth of a second, so you see the scene averaged over that time. Your brain sees that motion blur and interprets it usefully, as a cue that motion is happening.
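To make the exposure-averaging point concrete, here's a toy sketch in Python. Everything in it is made up for illustration: the strip width, the dot speed, and the number of sub-samples are assumed values, and the "scene" is just one bright pixel sliding sideways.

    # Toy model: a 1-pixel-tall strip with one bright dot moving left to right.
    # A long exposure is approximated by averaging many instantaneous samples
    # taken while the shutter is open, so the dot smears into a streak.
    import numpy as np

    WIDTH = 32          # pixels across the strip (assumed)
    SHUTTER = 1 / 30    # seconds the shutter stays open (the "thirtieth of a second")
    SPEED = 240.0       # dot speed in pixels per second (assumed)
    SUBSAMPLES = 64     # instants averaged within one exposure

    def sample_instant(t):
        """Render the scene at a single instant: one perfectly sharp pixel."""
        frame = np.zeros(WIDTH)
        frame[int(SPEED * t) % WIDTH] = 1.0
        return frame

    def long_exposure(t_open):
        """Average many instants across the shutter interval, like a sensor integrating light."""
        times = t_open + np.linspace(0.0, SHUTTER, SUBSAMPLES, endpoint=False)
        return np.mean([sample_instant(t) for t in times], axis=0)

    sharp = sample_instant(0.0)   # what a game-style frame looks like: 1 lit pixel
    blurred = long_exposure(0.0)  # what the camera records: an 8-pixel streak
    print("sharp frame lights", np.count_nonzero(sharp), "pixel(s)")
    print("long exposure lights", np.count_nonzero(blurred), "pixel(s)")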
Rendered 3D game images typically do not work like that. You see a series of perfectly sharp images, each captured at a single instant in time. So your brain doesn’t get that nice smooth motion blur to work with.
But if your computer renders and displays the intermediate images, then your eye can work with that nice smooth blur.
It’s probably possible to compute motion blur more cheaply than rendering a lot of intermediate frames, getting at least some approximation of true motion blur, and some games do that. But brute-force rendering of more frames is simple for a developer, and it’s accurate. Plus, any game that can support a high frame rate gets the benefit, even if it doesn’t have some kind of faux motion blur approximation.
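Here's a rough sketch of the two options, reusing the made-up moving-dot scene from above: the brute-force path actually renders a pile of intermediate frames and averages them, while the cheap path renders once and smears the result along the known velocity. The names and numbers are mine, not from any particular engine.

    # Two ways to get blur: brute force (render many sub-frames, average them)
    # versus a cheap approximation (render once, smear along the known velocity).
    import numpy as np

    WIDTH, SPEED, FRAME_TIME = 32, 240.0, 1 / 30   # assumed toy parameters

    def render(t):
        frame = np.zeros(WIDTH)
        frame[int(SPEED * t) % WIDTH] = 1.0
        return frame

    def brute_force_blur(t, subframes=16):
        """Accurate but expensive: actually render the intermediate frames and average."""
        ts = t + np.linspace(0.0, FRAME_TIME, subframes, endpoint=False)
        return np.mean([render(x) for x in ts], axis=0)

    def velocity_blur(t):
        """Cheap approximation: render one frame, then smear it along the motion vector."""
        taps = int(SPEED * FRAME_TIME)   # how far the dot travels during one frame
        frame = render(t)
        return np.mean([np.roll(frame, i) for i in range(taps)], axis=0)

    # Prints True here only because the toy motion is perfectly linear; with
    # rotation or changing speed the cheap smear is just an approximation.
    print(np.allclose(brute_force_blur(0.0), velocity_blur(0.0)))

In this toy case the two come out the same because the motion is a straight, constant-speed slide; that's exactly the situation where the cheap smear works best.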
I have a 165Hz monitor. When moving my mouse cursor around, I can definitely see independent images of the cursor.
EDIT: That being said, you could probably get a pretty good approximation by rendering and combining multiple frames on the card and only pushing a lower frame rate out to the monitor – that is, you only really need beefy rendering hardware, not a fancy monitor or cable, to get pretty close. I suppose that in theory, a compositor could do that. I don’t know if someone’s already done that or not.
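Purely as a sketch of what that could look like, with no real compositor API involved: render at a high internal rate, blend each group of frames, and only hand the blended frame to the display. The 240/60 split and the function names are assumptions for illustration.

    # Render internally at a high rate, blend each group of sub-frames, and only
    # present the blended result at the monitor's lower refresh rate.
    import numpy as np

    INTERNAL_HZ = 240    # how fast the card renders (assumed)
    DISPLAY_HZ = 60      # what the monitor actually refreshes at (assumed)
    GROUP = INTERNAL_HZ // DISPLAY_HZ   # 4 rendered frames per displayed frame
    WIDTH, SPEED = 32, 240.0            # same toy moving-dot scene as above

    def render(t):
        frame = np.zeros(WIDTH)
        frame[int(SPEED * t) % WIDTH] = 1.0
        return frame

    def present(frame):
        """Stand-in for handing a frame to the display."""
        print("displayed frame covers", np.count_nonzero(frame), "pixels of motion")

    t = 0.0
    for _ in range(3):                         # three displayed frames
        group = [render(t + i / INTERNAL_HZ) for i in range(GROUP)]
        present(np.mean(group, axis=0))        # the monitor only sees the blend
        t += 1 / DISPLAY_HZ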