If you really want to reach back in history, you could say they've been waiting much longer, since the i740 was released 24 years ago.
Listening to the more reliable leakers on the PC side, the problem seems to be less with the hardware and more with immature drivers. I'm sure Intel would have liked to launch much sooner, but one of the reasons the performance desktop parts are being held back is that game performance and stability just aren't there yet. Intel only gets one shot at making a good impression, and they don't want a repeat of what happened to Matrox with the Parhelia, a stumble Matrox never recovered from.
Regardless of what caused the delay, the timing is terrible. Graphics cards are just now coming down in price, so the window to exploit shortages and gain market share is closing. Top-end performance is likely to land, at best, somewhere between a 3060 and a 3070. By the time Intel does launch discrete cards, Nvidia's Lovelace and AMD's RDNA3 will be close to release, not to mention refreshes of the current lines.
I appreciate Intel not wanting to rush product to market, but it seems that they are playing catch-up, as is tradition.
I was skeptical that we'd see performance GPUs inside the M1 series. We have spent years being conditioned that integrated graphics = bad. A lot of folks just refused to believe that Apple would dump Intel, let alone leave AMD behind and deliver desktop-level GPU performance on the same package. When asked about it, I remember Lisa Su meekly stating, "We are the graphics partner of Apple," which is technically still true with add-in cards for the Mac Pro.
The latest 3090 Ti is pushing 450W, and Nvidia allegedly has plans to go up to 600W with Lovelace, with some specialty cards hitting 800W. That's simply unsustainable, and something is going to have to give. Maybe there will always be standalone GPUs, just as you can still buy a sound card, but everything is becoming more integrated, not less.