But this was all relatively minor compared with everything else presented at WWDC22 related to Metal (MetalFX Upscaling, new workflows to pre-compile shaders, mesh shaders, a C++ API for Metal...), most of it unrelated to ray tracing. Nothing groundbreaking in the ray tracing field that would suggest they're laying the groundwork for a grand reveal. Maybe at WWDC23?
My impression is that the ray tracing API in Metal is already rather mature. It is fairly similar to what DX12 Ultimate or the Vulkan extensions offer, and I think it is even more capable in some key areas (I don't think DX12 offers motion blur APIs or in-node storage, and Metal RT has larger data structure limits, more suitable for production ray tracing). So the lack of big upgrades might simply mean that the API is "done".
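For context, Metal's motion blur support lives right in the acceleration structure API: geometry is described with keyframed vertex buffers and a motion time interval. A minimal sketch of the host-side setup in Swift (the buffers, the two-keyframe setup, and the vertex layout are placeholder assumptions, not code from any shipping project):

```swift
import Metal

// Sketch of Metal's ray-tracing motion blur descriptors (macOS 12+ / iOS 15+).
// `vertexBuffer0`, `vertexBuffer1` and `triangleCount` are assumed to exist
// already; the point is just the shape of the motion-blur API.
func makeMotionAccelerationStructureDescriptor(
    vertexBuffer0: MTLBuffer,   // vertex positions at t = 0
    vertexBuffer1: MTLBuffer,   // vertex positions at t = 1
    triangleCount: Int
) -> MTLPrimitiveAccelerationStructureDescriptor {
    // One MTLMotionKeyframeData per keyframe of the animated geometry.
    let key0 = MTLMotionKeyframeData()
    key0.buffer = vertexBuffer0
    let key1 = MTLMotionKeyframeData()
    key1.buffer = vertexBuffer1

    // The motion variant of the triangle geometry descriptor takes an array of
    // keyframed vertex buffers instead of a single vertex buffer.
    let geometry = MTLAccelerationStructureMotionTriangleGeometryDescriptor()
    geometry.vertexBuffers = [key0, key1]
    geometry.vertexStride = MemoryLayout<SIMD3<Float>>.stride  // assumes 16-byte vertices
    geometry.triangleCount = triangleCount

    // The acceleration structure descriptor carries the shared motion interval.
    let descriptor = MTLPrimitiveAccelerationStructureDescriptor()
    descriptor.geometryDescriptors = [geometry]
    descriptor.motionKeyframeCount = 2
    descriptor.motionStartTime = 0.0
    descriptor.motionEndTime = 1.0
    return descriptor
}
```

The intersector then interpolates the geometry at the ray's time, which is the kind of feature you'd expect from an API aimed at production rendering rather than just games.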
Anyhow, while the exact details of the Apple Silicon architecture weren't discussed at WWDC20, Apple did announce it was coming. I guess they could do the same here: announce that hardware ray tracing is coming (and how the new APIs work, or how to tailor existing code to the new hardware), and release the first commercial products that use it later. But it would be a bit awkward without something like the Developer Transition Kit they announced at WWDC20 for the Apple Silicon transition.
I would be surprised if their RT API hadn't been developed with the upcoming hardware in mind. Most likely, existing code will just work on new GPUs with hardware acceleration, with no changes or tweaks required.