diamond.g
"Also, AC Shadows update; 115GB for me too"

That is lame, is it Ubisoft's fault that you have to download the whole thing again?
Frankly, I'm not fully sure on the specifics of delta updates on the App Store. If, for example, it fully replaces every file that has changed, and they ran an automated update across all their resource files, it's not unreasonable that every file has been touched and the App Store then cannot just patch them.
"I had a quick look at the metal capture provided by @casperes1996 (the small one, not the large one). There is some variance in frame durations: most are excellent at 6.25 ms, but a couple drop down to 30-40 ms, which is under 30 FPS. Is that the performance issue you are talking about?"

Most of the time it is actually running at 30-40 FPS. In a standard run of the benchmark there are 1-4 stutters below 30 FPS; otherwise it's in the 30-40 FPS range on my full M4 Max with basically mid-to-low settings at 1080p (that is, 4K MetalFX output at a 25% render resolution). But settings don't seem to scale much: all low vs. all high barely budges the needle, and turning off chromatic aberration and setting ray tracing to high instead of medium *improved* performance by 4 FPS, which is bizarre. All of this is using the built-in benchmark.
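(For reference, frame time converts to frame rate as 1000 ms divided by the frame time, so 6.25 ms is roughly 160 FPS, while 30-40 ms works out to roughly 25-33 FPS.)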
Zooming into the shader timeline, it seems like a lot of interdependencies and not much parallelism. Most of the time seems to be spent in the compute shader. I don’t really see anything too conspicuous here, but it’s not like I have too much experience reading these graphs either.
I am beginning to suspect the weather system could be the culprit. There are no settings for it, and it is resolution independent, which could explain why reducing settings or resolution doesn't appear to improve things much.
Definitely not seeing frame durations of 6.25 ms. As I said, I feel like there were a lot of oddities in the data I got from the traces versus what I actually experienced, and also in the larger trace itself. I am not sure if what we're seeing are intermediate renders or if it's just a massive wait where the GPU is done and waiting on a synchronisation point of some kind. CPU utilisation is through the roof the whole time too, using something like 1,200% to 1,600% of my M4 Max throughout the benchmark. Given this game also runs on a PS5, which is roughly 8 Zen 2 cores, it's pushing the CPU a lot harder than I would expect. I would expect it to easily scale up to as big a GPU budget as you can give it, with higher resolutions and more ray bounces, etc., but I would think it somewhat limited how much more CPU it could utilise than its minimum supported target. And I don't see a way to really scale that down much, other than perhaps reducing the quantity of physics objects in the world at any given time by reducing the load-in range for objects; I really would expect it to be purely GPU bound.
"Just saw a review of Razer's new 5090-based laptop.
1600p, medium settings, DLSS on = 50 fps.
Yikes. Clearly some optimization needed."

Ooof, that isn't a great result at all. I wonder if they have Specular turned on. Last I looked, that setting by itself drops the framerate a lot compared to just diffuse everywhere.
I have never bought an Assassin's Creed game and have no time to play, but I feel the sudden urge to buy the newest game ...
(Elon is complaining about the game being "woke" btw)
AC Shadows uses dynamic resolution scaling. I’m curious if this has an effect on the use of MetalFX scaling? Or do they mean literally the display is switching resolution, and not the output resolution of an app?
Upscale a rendering by following these steps for every render pass:
- Set the temporal scaler’s colorTexture property to the input texture.
- Set the scaler’s inputContentWidth and inputContentHeight properties.
- Set the scaler’s outputTexture property to your destination texture.
- Encode the upscale commands to a command buffer by calling the temporal scaler's encode(commandBuffer:) method.
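Hedging a bit, those steps would map to roughly the following Swift sketch. The MTLFXTemporalScalerDescriptor / MTLFXTemporalScaler names and the per-frame properties come from the MetalFX API as quoted; the Upscaler wrapper, the texture formats, and the min/max content-scale values are illustrative assumptions, and the jitter and motion-vector setup a real integration needs is left out.

```swift
import Metal
import MetalFX

// Rough sketch of the per-render-pass flow from the quoted docs.
// The wrapper type, formats, and scale bounds are assumptions for illustration.
final class Upscaler {
    private let scaler: MTLFXTemporalScaler

    init?(device: MTLDevice, outputWidth: Int, outputHeight: Int) {
        let desc = MTLFXTemporalScalerDescriptor()
        desc.inputWidth = outputWidth / 2            // nominal render size (assumed)
        desc.inputHeight = outputHeight / 2
        desc.outputWidth = outputWidth               // fixed for the scaler's lifetime
        desc.outputHeight = outputHeight
        desc.colorTextureFormat = .rgba16Float       // formats are app-specific assumptions
        desc.depthTextureFormat = .depth32Float
        desc.motionTextureFormat = .rg16Float
        desc.outputTextureFormat = .rgba16Float
        // Assumption: these bound the per-frame inputContentWidth/Height range
        // when the render resolution varies dynamically.
        desc.inputContentMinScale = 0.5
        desc.inputContentMaxScale = 1.0
        guard let scaler = desc.makeTemporalScaler(device: device) else { return nil }
        self.scaler = scaler
    }

    // Called once per frame, after rendering at this frame's (possibly reduced) resolution.
    func encodeUpscale(commandBuffer: MTLCommandBuffer,
                       color: MTLTexture, depth: MTLTexture, motion: MTLTexture,
                       output: MTLTexture, renderWidth: Int, renderHeight: Int) {
        scaler.colorTexture = color                  // input texture
        scaler.depthTexture = depth
        scaler.motionTexture = motion
        scaler.inputContentWidth = renderWidth       // this frame's render size
        scaler.inputContentHeight = renderHeight
        scaler.outputTexture = output                // destination texture
        scaler.encode(commandBuffer: commandBuffer)  // encode the upscale
    }
}
```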
"The way I read this is that you shouldn't be calling `makeTemporalScaler(device:)` repeatedly. So it depends on what you can define in the descriptor in terms of input/output settings."

When you say output, do you mean the final image sent to the monitor, or do you mean the output that is to be upscaled by MetalFX (so the render resolution)?
But according to the docs (the steps quoted above), it should support dynamic scaling just fine, because you can update the scaler's properties each render pass to let it know how big the input texture is.
The catch is that the output size appears to be fixed and cannot be adjusted on the fly, only from the descriptor. So you'd need to update the descriptor and call makeTemporalScaler() again if the screen resolution you are targeting changes.
Output, as I read it, is what the MetalFX upscale writes to its output texture. Input is the render resolution sent to the upscaler. The output isn't necessarily equal to what is actually sent to the monitor; it'll usually be rendered to a texture that then has UI elements composited on top at native resolution before being sent to the display.
Shouldn't metalfx's output be static then? In the case of dynamic resolution I was under the impression that the input (render) resolution changes, not the output. Is that not the case?
That also fits with what @Nycturne said. You can dynamically adjust the render resolution per frame on a single MetalFX scaler object, but if you move the game to a different screen, change the screen resolution, or otherwise alter the output texture, you need to create a new scaler object.
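As a rough sketch of that point (the cache type and the choice to key off the output texture's size are my assumptions, not something from the thread), reuse could look like:

```swift
import Metal
import MetalFX

// Sketch: reuse one temporal scaler while only the per-frame render (input) size changes,
// and rebuild it only when the output target changes. Formats are placeholder assumptions.
struct TemporalScalerCache {
    private var scaler: MTLFXTemporalScaler?
    private var outputWidth = 0
    private var outputHeight = 0

    mutating func makeOrReuseScaler(device: MTLDevice,
                                    output: MTLTexture,
                                    maxInputWidth: Int,
                                    maxInputHeight: Int) -> MTLFXTemporalScaler? {
        // Per-frame render-resolution changes do not require a new scaler;
        // a different output size (new screen / new window size) does.
        if let scaler, output.width == outputWidth, output.height == outputHeight {
            return scaler
        }
        let desc = MTLFXTemporalScalerDescriptor()
        desc.inputWidth = maxInputWidth
        desc.inputHeight = maxInputHeight
        desc.outputWidth = output.width
        desc.outputHeight = output.height
        desc.colorTextureFormat = .rgba16Float
        desc.depthTextureFormat = .depth32Float
        desc.motionTextureFormat = .rg16Float
        desc.outputTextureFormat = output.pixelFormat
        scaler = desc.makeTemporalScaler(device: device)
        outputWidth = output.width
        outputHeight = output.height
        return scaler
    }
}
```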