M4 Mac Announcements

How much does interpolation cost compared to straight frame rendering? Seems to me game video does not have a lot of cut-like full image changes, so it seems like interpolation could be a big gain if it is cheap enough.
 
It depends: are you talking about generating frame A, then frame B, then frames in between? That has a pretty hard limit on how much it helps, because you still have to wait until frame B is done. That’s why the new AI frame-generation techniques instead generate frame A, then generate 2-4 frames forward in time, based on a guess about what the scene should look like, while the real frame B is being generated.

This works well, except that input lag becomes very noticeable, especially at low non-frame-generation frame rates. In other words, if you can’t play the game well “natively” (or with just standard DLSS-type optimizations), this kind of frame generation won’t deliver a playable game. If your computer can already play the game well and you then turn on frame generation, you get even smoother visuals (but your input latency still won’t improve, which matters for some games). This AI frame generation is thus basically a “win more” for your graphics card: even smoother graphics for those already able to play smoothly.
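To make the timing difference concrete, here's a minimal sketch of the two approaches described above. The millisecond figures are assumptions chosen for illustration, not measurements of any particular GPU or technique:

```python
# Minimal timing sketch of the two approaches above. The numbers are assumed
# for illustration (~30 FPS native rendering, small per-frame generation cost),
# not measurements of any real GPU or technique.

RENDER_MS = 33.3   # time to render one "real" frame (~30 FPS native)
GEN_MS = 2.0       # assumed cost to synthesize one extra frame

def interpolation_timeline():
    """Render A, render B, then build a frame halfway between them.
    The in-between frame cannot exist until B is finished, so showing the
    frames in order means holding B back by roughly a whole frame."""
    a_ready = RENDER_MS
    b_ready = a_ready + RENDER_MS
    mid_ready = b_ready + GEN_MS    # earliest the A/B in-between can exist
    return f"A ready {a_ready:.0f} ms, B ready {b_ready:.0f} ms, in-between ready {mid_ready:.0f} ms"

def extrapolation_timeline():
    """Render A, then guess frames forward in time from A while the real
    frame B is still rendering; the guessed frames need nothing from B."""
    a_ready = RENDER_MS
    guess_ready = a_ready + GEN_MS  # predicted frame, available long before B
    b_ready = a_ready + RENDER_MS
    return f"A ready {a_ready:.0f} ms, guess ready {guess_ready:.0f} ms, B ready {b_ready:.0f} ms"

print("interpolate :", interpolation_timeline())
print("extrapolate :", extrapolation_timeline())
```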
 
The above answer by dada_dave is very good; I'd just like to add a little to it.

The cost of doing the frame interpolation/generation is fairly low. You generally get very, very close to doubling your frame rate by turning it on, so the cost of frame interpolation itself isn't much of a consideration, I think. It is still a matter of trade-offs: as dada says, frame interpolation increases smoothness, but your input latency isn't helped by it. If you natively run the game at 30FPS, frame interpolation may basically get you to 60FPS, but your input latency may now be the equivalent of running at 28FPS. In total this may actually be a perceivably worse experience than just VSYNC-ing to 30FPS, which gives less smooth visuals but slightly better input latency, since you don't pay the cost of frame interpolation.
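To put rough numbers on that 30FPS example, here's a small back-of-the-envelope calculation. The per-frame generation overhead is an assumed figure, not a measured one:

```python
# Back-of-the-envelope version of the 30 FPS example above. The per-frame
# generation overhead is an assumed figure picked to land near the ~28 FPS
# latency equivalent mentioned; real overheads vary by game and implementation.

def frame_time_ms(fps: float) -> float:
    """Convert a frame rate into milliseconds per frame."""
    return 1000.0 / fps

native_fps = 30.0
gen_overhead_ms = 2.4                               # assumed cost of the generation pass

real_frame_ms = frame_time_ms(native_fps) + gen_overhead_ms  # real frames now take a bit longer
displayed_fps = 2 * (1000.0 / real_frame_ms)        # one generated frame per real frame
latency_equiv_fps = 1000.0 / real_frame_ms          # input is still sampled per real frame

print(f"displayed: ~{displayed_fps:.0f} FPS, input latency like ~{latency_equiv_fps:.0f} FPS")
# -> displayed: ~56 FPS, input latency like ~28 FPS
```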

The interpolated frames also aren't perfect every time; you can get some visual artefacts with them.
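As a toy illustration of where those artefacts can come from, the sketch below compares the most naive possible in-between frame, a plain 50/50 blend of frames A and B, with where a moving object should actually end up. Real frame generation uses motion information and ML models rather than simple blending, but fast motion remains the hard case:

```python
# Toy 1-D illustration of where artefacts come from. A plain 50/50 blend of
# frames A and B turns a moving bright pixel into two half-bright ghosts
# instead of placing it halfway along its motion path.

WIDTH = 12

frame_a = [0.0] * WIDTH
frame_a[2] = 1.0            # bright object at x=2 in frame A

frame_b = [0.0] * WIDTH
frame_b[8] = 1.0            # the same object has moved to x=8 in frame B

# Naive blend: ghosting (half-bright copies at both x=2 and x=8).
naive_mid = [0.5 * a + 0.5 * b for a, b in zip(frame_a, frame_b)]

# What a motion-aware in-between frame should look like: the object at x=5.
motion_aware_mid = [0.0] * WIDTH
motion_aware_mid[5] = 1.0

print("naive blend  :", naive_mid)
print("motion aware :", motion_aware_mid)
```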

In my opinion, though once again this is quite subjective, frame interpolation/generation is worth it if you have a VRR/high-refresh-rate monitor and you're playing natively at ~40-45FPS or above. Getting the smoothness of 80+ FPS while still having input latency around the 40FPS* mark is usually pretty good to me.

*Note: 40FPS obviously isn't a measure of input latency, but since the speed at which frames are generated is tied to input latency as well, what I mean is that the usual input latency for most games running at about 40FPS is typically fine to me: noticeably better than at 30FPS, but not meaningfully different from 60. I play almost exclusively with controllers, not mouse and keyboard.
 