I decided to use Cycles instead of Eevee (slow-ass raytraced vs. realtime rasterized, if that makes any sense to you) to see how it looked, and I was impressed by the difference it made. It just looks and feels fuller and more interesting now.
Raytracing has been a research topic for years and is finally making its way into realtime applications. You have to consider the sheer amount of work that goes into raytracing, though.
Your camera has a particular field of view, and the renderer shoots outgoing "rays" from the camera's perspective to figure out what the camera sees. For each ray that intersects a triangle on some object, if the surface shader applied to that object can "reflect" light, we need to check what it reflects.
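To make that concrete, here's a minimal sketch of how primary rays come out of a pinhole camera. The names (`Vec3`, `primary_ray_dir`) and conventions are mine, generic illustration rather than anything Cycles actually does:

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Sketch of primary-ray generation for a pinhole camera. The field of
// view decides how wide the fan of rays is.
Vec3 primary_ray_dir(int px, int py, int width, int height, float vfov_radians) {
    // Half-height of the image plane at unit distance from the camera.
    float half_h = std::tan(vfov_radians * 0.5f);
    float half_w = half_h * (float)width / (float)height;

    // Map the pixel center into [-1, 1] on each axis, then scale.
    float u = (2.0f * (px + 0.5f) / width - 1.0f) * half_w;
    float v = (1.0f - 2.0f * (py + 0.5f) / height) * half_h;

    // Camera-space convention: looking down -z. Not yet normalized.
    return Vec3{u, v, -1.0f};
}

int main() {
    Vec3 d = primary_ray_dir(320, 240, 640, 480, 1.0472f); // ~60 degree fov
    std::printf("center ray dir = (%.3f, %.3f, %.3f)\n", d.x, d.y, d.z);
}
```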
At some point, we either hit a light or have to assume that the sample is shrouded in darkness.
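That termination logic is basically a bounded recursion. Here's a skeleton of it; the scene query and bounce direction are stubbed placeholders I invented so the sketch compiles and runs on its own, where a real renderer would traverse a BVH and sample shaders:

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

struct Hit {
    bool found;      // did the ray intersect anything?
    bool is_light;   // was the thing it hit an emitter?
    Vec3 point, normal, emission;
};

// Placeholder scene query: a real renderer would traverse a BVH here.
// Stubbed to "miss everything" so the skeleton stands alone.
Hit intersect_scene(Vec3 origin, Vec3 dir) {
    (void)origin; (void)dir;
    return Hit{false, false, {}, {}, {}};
}

// Placeholder bounce: a real surface shader decides this direction.
Vec3 bounce_direction(Vec3 dir, Vec3 normal) {
    (void)normal;
    return dir;
}

// Keep following reflections until we hit a light, escape the scene,
// or run out of bounce budget, at which point we assume darkness.
Vec3 trace(Vec3 origin, Vec3 dir, int bounces_left) {
    if (bounces_left <= 0)
        return Vec3{0, 0, 0};               // budget spent: assume darkness

    Hit h = intersect_scene(origin, dir);
    if (!h.found)
        return Vec3{0, 0, 0};               // escaped the scene: darkness

    if (h.is_light)
        return h.emission;                  // hit a light: recursion ends

    return trace(h.point, bounce_direction(dir, h.normal), bounces_left - 1);
}

int main() {
    Vec3 c = trace(Vec3{0, 0, 0}, Vec3{0, 0, -1}, 4);
    std::printf("sample = (%.1f, %.1f, %.1f)\n", c.x, c.y, c.z);
}
```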
Now, at each of these intersection points, we also have to do a lot of expensive math. For any smooth-shaded object, or any object that can reflect anything, you need the angle between the ray and the surface normal of that object so you know where to look for whatever is being reflected. Along the way, you also have to use that information to compute the contribution of each shader in the object's shader stack.
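In the mirror case, the "where to look" part is the classic reflection formula r = d - 2(d·n)n, and for unit vectors the dot product already gives you cos(θ), so the expensive arccos is only needed if you want the angle itself. A self-contained toy (the shader-stack evaluation is left out):

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Mirror reflection about the surface normal: r = d - 2(d.n)n.
// This is the standard formula, nothing Blender-specific.
Vec3 reflect(Vec3 d, Vec3 n) {
    float k = 2.0f * dot(d, n);
    return Vec3{d.x - k*n.x, d.y - k*n.y, d.z - k*n.z};
}

int main() {
    // Incoming unit ray heading down and to the right; flat ground normal.
    Vec3 d{0.7071f, -0.7071f, 0.0f};
    Vec3 n{0.0f, 1.0f, 0.0f};

    // For unit vectors, dot(d, n) is already cos(theta); you only pay
    // for acos if you genuinely need the angle itself.
    float cos_theta = dot(d, n);
    float theta = std::acos(cos_theta); // the expensive call mentioned below

    Vec3 r = reflect(d, n);
    std::printf("cos(theta) = %.4f, theta = %.4f rad\n", cos_theta, theta);
    std::printf("reflected  = (%.4f, %.4f, %.4f)\n", r.x, r.y, r.z);
}
```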
Each step of this is pretty damn expensive. Consider a function like arccos: on a system that follows the transcendental guidelines of IEEE 754-2008 (which roughly say results should be accurate to about 16 decimal places in double precision), a single call might take 100 cycles or more on its own. I checked some instruction tables, and a lot of the vendor-tuned implementations from Intel/AMD seem to take 100-200 cycles. If any divisions or square roots are used for ray normalization, that can easily add 100 more cycles for a very small piece of logic in a single intersection. Even ignoring that, you have huge potential latency when fetching the necessary geometry, since those accesses don't follow patterns your underlying hardware would know how to prefetch.
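For a feel of the normalization cost, here's the naive version next to the usual cheapening (one reciprocal instead of three divides). Generic code, not pulled from any real renderer:

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Naive normalization: one sqrt plus three divides, exactly the kind
// of small-but-pricey snippet that runs on every ray and every normal.
Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return Vec3{v.x / len, v.y / len, v.z / len};
}

// Common cheapening: compute the reciprocal length once and multiply,
// trading three divides for one divide and three multiplies.
Vec3 normalize_recip(Vec3 v) {
    float inv = 1.0f / std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return Vec3{v.x * inv, v.y * inv, v.z * inv};
}

int main() {
    Vec3 v{3.0f, 4.0f, 0.0f};
    Vec3 a = normalize(v), b = normalize_recip(v);
    std::printf("(%.3f, %.3f, %.3f) vs (%.3f, %.3f, %.3f)\n",
                a.x, a.y, a.z, b.x, b.y, b.z);
}
```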
So yeah... raytracing costs a lot.
