A17 has ray tracing!

tomO2013

Power User
Posts
111
Reaction score
195
The Apple keynote just wrapped.

I’m incredibly excited to see ray tracing support in A17 and what this will mean for desktop computing when it makes its way into M3, M3 Pro, M3 Max, M3 Ultra, M3 Bodacious.

I suspect over at the other place, disappointment will prevail because it's unlikely to match an Nvidia 4090 in performance (I'm half joking with that statement, because I personally think that expectation is ridiculous).

It also speaks to Apple's approach to ray tracing, even on the desktop - it is highly unlikely to be targeting 4090 performance metrics at the expense of efficiency and performance per watt.

I've long speculated that this ray tracing implementation is based on PowerVR Photon intellectual property licensed from Imagination Technologies. If so, it's incredibly power-efficient as well as performant across scalable use cases (small devices up to large desktops).

Can't wait to see an AnandTech breakdown of Apple's A17 implementation :)

Figured this was worthy of its own thread - mods, feel free to delete if you feel that a more generic A17 discussion is warranted!
 

Andropov

Site Champ
Posts
674
Reaction score
890
Location
Spain
It's a bit funny, because of all the devices they offer, the iPhone is likely the one where you're least likely to notice the subtle lighting differences of ray tracing. Yet because they reuse the core designs in all chips, and since the iPhone one always goes first, the initial hardware ray tracing implementation had to debut here. But it's going to be very exciting when this reaches the M3 line.
 

Yoused

up
Posts
5,884
Reaction score
9,491
Location
knee deep in the road apples of the 4 horsemen
I’ve long speculated that this Ray Tracing implementation is based on PowerVR Photon intellectual property licensed from Imagination Technologies
I suspect that Apple and Imagination have been at least in part collaborating on hardware design, giving Apple a favorable license.

Also, I think I read that RT uses the neural engine, presumably to optimize the process by reducing dead-ends.
 

theorist9

Site Champ
Posts
656
Reaction score
619
Broadly speaking, what types of applications will be able to take advantage of hardware RT (HRT), and what kind of effect will it have? On the general iPhone 15 prediction thread, @Altaic mentioned HRT should work in Blender, and that the naive approximation for its expected effect is a 4x speed increase.

However, its effect on games seems to be very different. I know little about games, but from what I've read here ( https://www.rockpapershotgun.com/wa...ray-tracing-is-almost-unplayable-without-dlss ), at least in games, RT isn't something that allows you to render the same thing faster (which is what Altaic was suggesting for Blender), but rather something that allows you to render a scene differently—with much better quality—at the expense of speed, even if your GPU has HRT. For instance, compare these two scenes from Watch Dogs Legion. Using ray tracing at the highest setting markedly increases quality. But even with an HRT-enabled GPU like the RTX 3070, it reduces the frame rate by nearly half (at these specific settings; see screenshots, taken from the linked article).

That, according to the article, is why NVIDIA developed DLSS: It significantly reduces RT's performance hit while maintaining visual quality.

But maybe Apple's implementation of HRT is different from NVIDIA's, such that it can implement it without needing an add-on like DLSS to avoid the performance hit.

Another general question: If Apple's implementation of HRT is very different from that of NVIDIA and AMD, does that mean existing RT games (like those on this list: https://www.rockpapershotgun.com/confirmed-ray-tracing-and-dlss-games ) would need their RT code to be substantially rewritten to take advantage of HRT on AS?

No RT, with general ultra settings: the RTX 3070 gives a frame rate of 71 fps at 1440p.
[screenshot]


With RT at the highest (ultra) setting, and general ultra settings otherwise: the RTX 3070 gives a frame rate of 37 fps at 1440p.
[screenshot]
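The fps figures are easier to compare as per-frame time budgets; a quick back-of-the-envelope conversion of the 71 fps and 37 fps numbers quoted above:

```python
# Converting the quoted Watch Dogs Legion numbers (71 fps without RT,
# 37 fps with RT ultra, RTX 3070 at 1440p) into per-frame time budgets.

def frame_time_ms(fps: float) -> float:
    """Milliseconds available per frame at a given frame rate."""
    return 1000.0 / fps

no_rt = frame_time_ms(71)    # ~14.1 ms per frame
with_rt = frame_time_ms(37)  # ~27.0 ms per frame

# RT (and its knock-on costs) adds roughly 13 ms per frame here,
# nearly doubling the frame time.
extra_ms = with_rt - no_rt
```

Seen this way, "reduces frame rate by nearly half" means the GPU is spending almost as much time on the RT work as on everything else combined.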
 

dada_dave

Elite Member
Posts
2,448
Reaction score
2,474
Broadly speaking, what types of applications will be able to take advantage of hardware RT (HRT), and what kind of effect will it have? On the general iPhone 15 prediction thread, @Altaic mentioned HRT should work in Blender, and that the naive approximation for its expected effect is a 4x speed increase.

However, its effect on games seems to be very different. I know little about games, but from what I've read here ( https://www.rockpapershotgun.com/wa...ray-tracing-is-almost-unplayable-without-dlss ), at least in games, RT isn't something that allows you to render the same thing faster (which is what Altaic was suggesting for Blender), but rather something that allows you to render a scene differently—with much better quality—at the expense of speed, even if your GPU has HRT. For instance, compare these two scenes from Watch Dogs Legion. Using ray tracing at the highest setting markedly increases quality. But even with an HRT-enabled GPU like the RTX 3070, it reduces the frame rate by nearly half (at these specific settings; see screenshots, taken from the linked article).

That, according to the article, is why NVIDIA developed DLSS: It significantly reduces RT's performance hit while maintaining visual quality.

No RT, with general ultra settings: the RTX 3070 gives a frame rate of 71 fps at 1440p.
[screenshot]

With RT at the highest (ultra) setting, and general ultra settings otherwise: the RTX 3070 gives a frame rate of 37 fps at 1440p.
[screenshot]
With Blender and other renderers, the assumption is that you want the most accurate lighting model you can reasonably get; ray tracing has long been the gold standard there, so accelerating it in hardware yields straightforward performance improvements. Games, though, still treat it as an optional feature, and often it can be only subtly better than the standard techniques for lighting and shadows. As the technology, and developer familiarity with the new lighting techniques, improves and matures, that should get better. But as @Andropov said, small phone scenes, combined with the limits of computational power on the phone (as impressive as it is!), will probably mean the phone isn't going to be any kind of ray tracing powerhouse where suddenly all your games look that much better.
 

leman

Site Champ
Posts
722
Reaction score
1,374
I suspect that Apple and Imagination have been at least in part collaborating on hardware design, giving Apple a favorable license.

There is some overlap, at least judging by patents. IMG invented the idea of using low-precision intersection testing for energy-efficient RT; Apple took it further and added some async shader magic on top.
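The low-precision intersection idea can be sketched in a few lines. This is a toy illustration of the general principle only, not either company's actual hardware design: box tests during BVH traversal can run at reduced precision as long as rounding is conservative, so a true hit is never rejected and occasional false hits are filtered later by the full-precision triangle test.

```python
# Toy sketch of conservative low-precision ray/AABB testing (illustrative
# only; not IMG's or Apple's actual design). Box bounds are snapped outward
# onto a coarse grid, modelling reduced-precision arithmetic: the quantized
# box can only get *larger*, so a real hit is never missed.

def quantize_down(x: float, step: float = 1 / 16) -> float:
    """Round toward -inf on a coarse grid (models dropped mantissa bits)."""
    return step * (x // step)

def slab_hit_lowp(origin, inv_dir, box_min, box_max, step=1 / 16):
    """Conservative slab test: ray vs axis-aligned box with snapped bounds."""
    tmin, tmax = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        lo = quantize_down(lo, step)          # snap min outward (down)
        hi = quantize_down(hi, step) + step   # snap max outward (up)
        t1, t2 = (lo - o) * inv, (hi - o) * inv
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax
```

Because the snapped bounds only ever enlarge the box, the cheap test stays safe: it may occasionally say "hit" for a near miss, but it never culls geometry the ray actually touches.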

Also, I think I read that RT uses the neural engine, presumably to optimize the process by reducing dead-ends.

You can potentially use the NPU for image denoising to reduce the amount of RT work, but the NPU does not participate in RT as such. I do not know if Apple's current denoisers utilize the NPU.
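A toy numeric illustration of the denoising point (a 3-tap box blur standing in for an ML denoiser; nothing here is Apple's actual pipeline): a low-sample-count path-traced image is noisy, and a filter trades a little blur for a much smaller ray budget.

```python
# A 1-D "render" with a flat ground truth of 1.0: the low-sample-count
# estimate is noisy, and a simple filter pulls it back toward the truth
# without casting any additional rays.

def box_blur(signal):
    """3-tap box filter with clamped edges."""
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - 1):i + 2]
        out.append(sum(window) / len(window))
    return out

def mse(values, truth=1.0):
    """Mean squared error against a flat ground-truth value."""
    return sum((v - truth) ** 2 for v in values) / len(values)

noisy = [1.3, 0.7, 1.2, 0.8, 1.1, 0.9]   # "1 sample per pixel" render
denoised = box_blur(noisy)

# The filtered image is much closer to ground truth for the same ray count.
assert mse(denoised) < mse(noisy)
```

That is the whole trade: spend compute on filtering instead of on more rays per pixel, and the RT workload itself shrinks.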
 

leman

Site Champ
Posts
722
Reaction score
1,374
But maybe Apple's implementation of HRT is different from NVIDIA's, such that it can implement it without needing an add-on like DLSS to avoid the performance hit.

Apple's implementation is much more optimized for low energy consumption, but RT still requires much more work than traditional pipelines. Upscaling really helps deliver good performance. That's why Apple has its own upscaling tech that leverages its ML processors.
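The arithmetic behind why upscaling helps so much: shading and ray cost scale roughly with the pixel count of the internal render, so a 2x spatial upscale cuts the per-frame work to roughly a quarter. A quick sanity check:

```python
# Per-frame shading/ray work scales roughly with the number of internally
# rendered pixels, so upscaling from a lower internal resolution to the
# display target cuts the work by the square of the scale factor.

def pixels(width: int, height: int) -> int:
    return width * height

native = pixels(2560, 1440)    # 1440p display target
internal = pixels(1280, 720)   # 720p internal render (2x upscale)

ratio = native / internal      # a quarter of the pixels to shade
```

The resolutions here are just example numbers, but the quadratic saving is why DLSS, FSR, and MetalFX all make RT viable on hardware that could not trace at native resolution.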

I think the big selling point of Apple Silicon will be that it enables usable RT on low-end hardware. With Nvidia you still need a fairly beefy GPU if you want to use RT. But I wouldn't be surprised if the base M3 Air already does decently (with upscaling, of course).


Another general question: If Apple's implementation of HRT is very different from that of NVIDIA and AMD, does that mean existing RT games (like those on this list: https://www.rockpapershotgun.com/confirmed-ray-tracing-and-dlss-games ) would need their RT code to be substantially rewritten to take advantage of HRT on AS?

No. Metal’s RT APIs are practically identical to DX12 RT.
 

dada_dave

Elite Member
Posts
2,448
Reaction score
2,474
Apple also said the shader architecture was redesigned, so it'll be interesting to see what they meant by that, presuming it means more than adding the RT cores they explicitly mentioned.
 

Citysnaps

Elite Member
Staff Member
Site Donor
Posts
3,901
Reaction score
9,526
Main Camera
iPhone
I suspect over at the other place, disappointment will prevail because it‘s unlikely to match an nvidia 4090 in performance (I’m half joking with that statement because I think that expectation is personally ridiculous).

Disappointment will always prevail at the other place for the tiniest mice-nuts reasons.

Today there's apparently a competition of the bored about who can muster up the largest eye-roll regarding Apple's product announcements.
 

tomO2013

Power User
Posts
111
Reaction score
195
During the keynote they mentioned a radical overhaul of their shaders….

This got me thinking about this demo from Imagination Technologies, where they show, on their latest IMG DXT tech stack (an evolution of Photon), their new fragment shading rate (FSR) feature, which reduces the number of times the fragment shader runs on a scene without significant loss of visual fidelity.

I wonder if Apple's licensing deal with IMG would have allowed IP from IMG DXT to be included in the A17 GPU….

I'm surmising that Apple's licensing deal would give them significantly advance awareness of IMG's announcement - certainly much earlier access than the public announcement 8 months ago. Would this have been early enough for the A17 tape-out, or is it something we'd see in the A18/A19 timeframe… no idea!

I have to say, the more that I learn about IMG's approach to ray tracing versus Nvidia's and AMD's, the more excited and hopeful I am that a derivative is indeed included in A17.
 

dada_dave

Elite Member
Posts
2,448
Reaction score
2,474
During the keynote they mentioned a radical overhaul of their shaders….

This got me thinking about this demo from Imagination Technologies, where they show, on their latest IMG DXT tech stack (an evolution of Photon), their new fragment shading rate (FSR) feature, which reduces the number of times the fragment shader runs on a scene without significant loss of visual fidelity.

I wonder if Apple's licensing deal with IMG would have allowed IP from IMG DXT to be included in the A17 GPU….

I'm surmising that Apple's licensing deal would give them significantly advance awareness of IMG's announcement - certainly much earlier access than the public announcement 8 months ago. Would this have been early enough for the A17 tape-out, or is it something we'd see in the A18/A19 timeframe… no idea!

I have to say, the more that I learn about IMG's approach to ray tracing versus Nvidia's and AMD's, the more excited and hopeful I am that a derivative is indeed included in A17.

Perhaps, though, they did say "Apple designed shaders", which could cover a multitude of meanings.
 

leman

Site Champ
Posts
722
Reaction score
1,374
I wonder if Apples licensing deal with IMG would have allowed IP from IMG DXT to be included in the A17 GPU….

Apple's RT patents are at least partially based on IMG RT patents, but Apple brings a lot of its own sauce to the game. Apple GPUs, up to A16, are hybrids of some IMG technology (like rasterization etc.) and Apple-designed SIMD shader units. People who reverse-engineered Apple GPUs to write Linux drivers mentioned that the low-level driver interface resembles Rogue GPUs from IMG, but the shader architecture is custom Apple.

We will have to wait and see what the new GPU features on A17 are to make educated guesses about the architecture (documentation should become available over the next few days).

As far as FSR goes, Apple has offered a similar technology since A12 if I remember correctly.
 

Nycturne

Elite Member
Posts
1,184
Reaction score
1,595
Broadly speaking, what types of applications will be able to take advantage of hardware RT (HRT) , and what kind of effect will it have? On the general iPhone 15 prediction thread, @Altaic mentioned HRT should work in Blender, and that the naive approximation for its expected effect is a 4x speed increase.

As @dada_dave pointed out, there are two scenarios. Apple explicitly compared software RT vs hardware RT, hence the 4x speed increase comment.

No RT, with general ultra settings: the RTX 3070 gives a frame rate of 71 fps at 1440p.
[screenshot]

With RT at the highest (ultra) setting, and general ultra settings otherwise: the RTX 3070 gives a frame rate of 37 fps at 1440p.
[screenshot]

Take note of these two screenshots. Look closely enough and you can catch the trick used to generate the upper screenshot: it's not even a real reflection. It's clever, but ultimately a trick of rendering something at low resolution and adding opacity and masking to the render pass to mimic a reflection. Screen space reflections are another trick, but they have other visual artifacts, as they can only reflect what's already been projected into screen space (and is visible to the user). Adding in path tracing enables more faithful reflections and lighting that are also fully dynamic in a way that other techniques may not be able to support. But it's more work, and gives you lower framerates. DLSS / FSR / MetalFX upscaling help hide this performance hit by letting the game render at lower resolutions than the screen target and using ML upscaling to keep details sharper than they would be with something like bilinear scaling.
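The screen-space limitation can be boiled down to a toy example (purely illustrative, not engine code): an SSR lookup can only return colours already present in the frame buffer, so reflections of off-screen geometry simply come back empty.

```python
# Toy illustration of the screen-space reflection limitation: SSR can only
# reuse colours already rendered into the frame, so reflections of
# off-screen geometry have no data to sample.

# A 1-D "frame buffer": pixel index -> colour rendered this frame.
screen = {0: "sky", 1: "wall", 2: "car", 3: "road"}

def ssr_sample(reflected_pixel: int):
    """Look up a reflected colour; None models the missing-reflection
    artifact when the reflected point was never projected on screen."""
    return screen.get(reflected_pixel)

on_screen = ssr_sample(2)    # the car is visible, so it can be reflected
off_screen = ssr_sample(-5)  # behind the camera: no data for SSR, though
                             # a ray tracer would still find the geometry
```

A path tracer has no such gap because it queries the scene geometry directly rather than the rendered frame.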

Ultimately, doing something dynamically always impacts framerate when talking about games. Precomputed lighting, tricks to mimic reflections, etc. enable faster framerates because they do less. And a good team can hide a lot of the rough edges to avoid breaking immersion - very similar to a film crew building sets and props that hide the fact that the set was built on a studio lot and the swords are made of foam or plastic.
 

exoticspice1

Site Champ
Posts
328
Reaction score
132
I still don't get it: even with the GPU redesign, performance only improved by 20%. They added an extra core, so were the other improvements not for performance?

Is that why they added the extra core?
 

leman

Site Champ
Posts
722
Reaction score
1,374
I still don't get it: even with the GPU redesign, performance only improved by 20%. They added an extra core, so were the other improvements not for performance?

Is that why they added the extra core?

A15 and A16 have significant problems with sustained GPU performance: their speed drops by 30% when you use the GPU for a longer time. My hope is that the GPU is conservatively clocked to fix this.

Edit: at any rate, clocks will be the key to understanding the performance potential of the new architecture. If the CPU/GPU is clocked similarly to A16, that's not good news. If it's clocked similarly to A15, that's great news. If it's clocked similarly to A14, that's amazing news.
 

dada_dave

Elite Member
Posts
2,448
Reaction score
2,474
A15 and A16 have significant problems with sustained GPU performance: their speed drops by 30% when you use the GPU for a longer time. My hope is that the GPU is conservatively clocked to fix this.

Edit: at any rate, clocks will be the key to understanding the performance potential of the new architecture. If the CPU/GPU is clocked similarly to A16, that's not good news. If it's clocked similarly to A15, that's great news. If it's clocked similarly to A14, that's amazing news.
A priori I agree, but of course it would be best to wait for performance vs energy cost analyses. Sadly, I don't know if we'll get any, as Andrei is gone from AnandTech and it's hard to find any site that does it as well as he did for them, particularly for phones, where it is especially difficult to do right.
 

leman

Site Champ
Posts
722
Reaction score
1,374
A priori I agree, but of course it would be best to wait for performance vs energy cost analyses. Sadly, I don't know if we'll get any, as Andrei is gone from AnandTech and it's hard to find any site that does it as well as he did for them, particularly for phones, where it is especially difficult to do right.

It's unlikely that we will get a review at the level of what Andrei used to write, but one can infer quite a lot from indirect observations (e.g. performance over time, chassis temperature, etc.).
 

Altaic

Power User
Posts
191
Reaction score
239
A priori I agree, but of course it would be best to wait for performance vs energy cost analyses. Sadly, I don't know if we'll get any, as Andrei is gone from AnandTech and it's hard to find any site that does it as well as he did for them, particularly for phones, where it is especially difficult to do right.
In lieu of Andrei, Maynard Handley and Philip Turner are excellent, self-motivated analysts for these sorts of things. Also the Asahi folks. Maybe someone could reach out and we could make it worth their time?
 

dada_dave

Elite Member
Posts
2,448
Reaction score
2,474
It's unlikely that we will get a review at the level of what Andrei used to write, but one can infer quite a lot from indirect observations (e.g. performance over time, chassis temperature, etc.).

In lieu of Andrei, Maynard Handley and Philip Turner are excellent, self-motivated analysts for these sorts of things. Also the Asahi folks. Maybe someone could reach out and we could make it worth their time?

Yeah … there are some people/sites that do appear to be able to do really good analysis - I'd add chipsandcheese, as well as a couple of PC reviewers, to your list - but they don't necessarily do this particular kind, or they're scattershot in terms of products so you don't know when or what they'll publish on, or they simply don't do mobile/Apple - not necessarily out of any personal chauvinism (sometimes), but simply because it's not their business. I dunno. We'll see.

Like, the Asahi folks have no interest in Apple's mobile products, and while Handley published his M1 omnibus, I don't think he has done this kind of analysis, not for mobile. You never know what chipsandcheese will publish on or what hardware they'll have on hand. And the PC folks are, well, PC-focused. I don't know Turner, I'll be honest. I looked him up; seems interesting, but I didn't see any of this kind of stuff - then again, I didn't look too long.
 