Intel finally releasing discrete GPU

Cmaier

This is very, very late in the game, and is something that Intel has been working on for a very long time. Finally releasing these now is thinking small. Yeah, Intel will likely make enough money on this to recoup the 15 years of R&D that they’ve put into it, but this is not very forward-looking. What Intel should have learned from Apple is that the future is likely to be integrated CPU/GPU packages, probably with a unified memory architecture. Discrete GPUs are already somewhat niche, and that’s going to be even more the case in 5 years.
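
To make the unified-memory point concrete, here's a minimal Swift/Metal sketch (my own illustration, nothing official from Apple or Intel) of why shared memory on an integrated package changes the programming model: the CPU and GPU touch the same buffer, whereas a discrete card generally needs its data staged and copied over PCIe first.

import Metal

// Minimal sketch: on a unified-memory system (e.g. Apple Silicon), a buffer
// created with .storageModeShared is visible to both the CPU and the GPU,
// so there is no separate VRAM copy to manage.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal device available")
}

let samples: [Float] = (0..<1024).map { Float($0) }

// One allocation that both the CPU and the GPU address directly.
let shared = device.makeBuffer(bytes: samples,
                               length: samples.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// The CPU can keep mutating the same memory a later GPU pass will read.
// With a discrete card, the fast path is a .storageModePrivate buffer in VRAM,
// which means an extra blit across the PCIe bus whenever the data changes.
let ptr = shared.contents().bindMemory(to: Float.self, capacity: samples.count)
ptr[0] = 42.0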
 

Citysnaps

Seems Intel should have pivoted back when they learned what Apple was doing.
 

Yoused

One of the stories that links off of that story, from a year ago, says that the Intel Arc cards will not work with an AMD based system. And I can understand that Intel wants to protect/promote their own brand, but unless Arc is stunningly better than the alternatives, they seem to be doing themselves a bit of a disservice with such a strategery.
 

Cmaier

Yoused said:
One of the stories that links off of that story, from a year ago, says that the Intel Arc cards will not work with an AMD based system. And I can understand that Intel wants to protect/promote their own brand, but unless Arc is stunningly better than the alternatives, they seem to be doing themselves a bit of a disservice with such a strategery.

Seems confusing how that could even be the case.
 

Yoused

Cmaier said:
Seems confusing how that could even be the case.
here:
“The Iris Xe discrete add-in card will be paired with 9th gen (Coffee Lake-S) and 10th gen (Comet Lake-S) Intel® Core™ desktop processors and Intel(R) B460, H410, B365, and H310C chipset-based motherboards and sold as part of pre-built systems,” says an Intel spokesperson in a statement to Legit Reviews. “These motherboards require a special BIOS that supports Intel Iris Xe, so the cards won’t be compatible with other systems.”
which may be different with this year's models
 

Colstan

Cmaier said:
Intel will likely make enough money on this to recoup the 15 years of R&D that they’ve put into it, but this is not very forward-looking.
If we really want to reach back in history, you could say they've been waiting much longer, since the i740 was released 24 years ago.

Listening to the more reliable leakers on the PC side, the problem seems to be less with the hardware and more with immature drivers. I'm sure Intel would have liked to launch much sooner, but one of the reasons the performance desktop parts are being held back is that game performance and stability just aren't there yet. Intel is only getting one shot at making a good impression, and they don't want a repeat of what Matrox did with the Parhelia, which they never recovered from.

Regardless of what caused the delay, the timing is terrible. Graphics cards are just now coming down in price, so the window to take advantage of shortages to gain market share is closing. The performance is likely to be, at best, between a 3060 and 3070 at the top end. By the time Intel does launch discrete cards, Nvidia Lovelace and AMD RDNA3 will be close to release, not to mention refreshes of the current lines.

I appreciate Intel not wanting to rush product to market, but it seems that they are playing catch-up, as is tradition.

Cmaier said:
What Intel should have learned from Apple is that the future is likely to be integrated CPU/GPU packages, probably with a unified memory architecture. Discrete GPUs are already somewhat niche, and that’s going to be even more the case in 5 years.
I was skeptical that we'd see performance GPUs inside the M1 series. We have spent years being conditioned that integrated graphics = bad. A lot of folks just refused to believe that Apple would dump Intel, let alone leave AMD behind and put desktop-level GPU performance on the same package. When asked about it, I remember Lisa Su meekly stating, "We are the graphics partner of Apple," which is technically still true with add-in cards for the Mac Pro.

The latest 3090 Ti is pushing 450W, and Nvidia allegedly has plans to go up to 600W with Lovelace, with some specialty cards hitting 800W. That's simply unsustainable and something is going to have to give. Maybe there will always be standalone GPUs, like you can still get a sound card, but everything is becoming more integrated, not less.
 

diamond.g

Yoused said:
One of the stories that links off of that story, from a year ago, says that the Intel Arc cards will not work with an AMD based system. And I can understand that Intel wants to protect/promote their own brand, but unless Arc is stunningly better than the alternatives, they seem to be doing themselves a bit of a disservice with such a strategery.
That was for DG1 (which was never released to the general public). DG2 (Desktop) won't have that limitation. I'm still skeptical about the drivers, but I guess we will see how that goes.
 

diamond.g

Commenters say this is good :)
I didn't even look at the comments. But now that I have, it looks like they did, until pricing came into the picture, lol.

I am not sure how I feel about the results, given this is the top of the bottom end. I have to keep reminding myself that Alchemist is only supposed to compete against the "Enthusiast" tier at best. We won't see anything from Intel that can touch the 3080 series cards until either Battlemage or Celestial.

[Attached image: Intel Arc Alchemist DG2 desktop graphics card lineup]
 

Cmaier

I think it’s “fine,” but I don’t get why they want to compete in the market at the level of “fine.”
 

diamond.g

Cmaier said:
I think it’s “fine,” but I don’t get why they want to compete in the market at the level of “fine.”
Captive audience. Intel can make pricing deals with OEMs (like they did for CPUs) so that they choose Intel's dGPUs instead of AMD's or Nvidia's.

And for most people there won't be any difference, because most people don't play games, and for those who do it is close enough to get the job done, I guess.
 

Huntn

Colstan said:
If we really want to reach back in history, you could say they've been waiting much longer, since the i740 was released 24 years ago.

Listening to the more reliable leakers on the PC side, the problem seems to be less with the hardware and more with immature drivers. I'm sure Intel would have liked to launch much sooner, but one of the reasons the performance desktop parts are being held back is that game performance and stability just aren't there yet. Intel is only getting one shot at making a good impression, and they don't want a repeat of what Matrox did with the Parhelia, which they never recovered from.

Regardless of what caused the delay, the timing is terrible. Graphics cards are just now coming down in price, so the window to take advantage of shortages to gain market share is closing. The performance is likely to be, at best, between a 3060 and 3070 at the top end. By the time Intel does launch discrete cards, Nvidia Lovelace and AMD RDNA3 will be close to release, not to mention refreshes of the current lines.

I appreciate Intel not wanting to rush product to market, but it seems that they are playing catch-up, as is tradition.


I was skeptical that we'd see performance GPUs inside the M1 series. We have spent years being conditioned that integrated graphics = bad. A lot of folks just refused to believe that Apple would dump Intel, let alone leave AMD behind and put desktop-level GPU performance on the same package. When asked about it, I remember Lisa Su meekly stating, "We are the graphics partner of Apple," which is technically still true with add-in cards for the Mac Pro.

The latest 3090 Ti is pushing 450W, and Nvidia allegedly has plans to go up to 600W with Lovelace, with some specialty cards hitting 800W. That's simply unsustainable and something is going to have to give. Maybe there will always be standalone GPUs, like you can still get a sound card, but everything is becoming more integrated, not less.
As someone who is not learned on the topic, but who has a gaming PC with an i5 and an RTX 2070, doesn't this all boil down to technology and physics? In other words, achieving the desired graphics effects requires the same circuitry and power whether it is in a standalone card or integrated? All that will alter this are breakthroughs in technology.

Back 20 years ago, I never felt the need to have the top-of-the-line graphics card, because I never could justify the expense in my head, and that was when cards ran about $200! I suspect this is a marketing distortion, but back then it seemed that you needed the new expensive cards to run the latest games you wanted to play without struggling. The most I’ve spent on a card is about $500, which is too much imo.
 

diamond.g

Huntn said:
Back 20 years ago, I never felt the need to have the top-of-the-line graphics card, because I never could justify the expense in my head, and that was when cards ran about $200!
Yeah, it really depends on what kinds of games you want to play. Well, that and how pretty you want said games to look (and how many frames you want).
 

Huntn

diamond.g said:
Yeah, it really depends on what kinds of games you want to play. Well, that and how pretty you want said games to look (and how many frames you want).
When playing with lesser cards, the games seemed to play well enough. :) It was just a few of the over-the-top brand-new games that were melting GPUs. And look at New World: my understanding is there are some pissed players out there with a glob that used to be their graphics card. :)
 

diamond.g

Huntn said:
When playing with lesser cards, the games seemed to play well enough. :) It was just a few of the over-the-top brand-new games that were melting GPUs. And look at New World: my understanding is there are some pissed players out there with a glob that used to be their graphics card. :)
The joys of scalability!
 