Intel to launch pay-as-you-go CPU features

Colstan

Site Champ
Posts
822
Reaction score
1,124
Intel was already doing something similar with their overclocking K-series chips, where enthusiasts need Intel's blessing to bypass imposed artificial limitations; this is just an extension of where they were headed. We are increasingly living in a digital-subscription world. Anything that can be regularly monetized, will be. That is why I appreciate Apple's business model. A Mac is a premium product, but the up-front cost covers not just the computer, but the engineering behind Apple Silicon, and of course macOS and its related software. The handful of subscription services that Apple does offer are optional and don't impact the functionality of my Mac. Apple isn't monetizing me behind my back like Microsoft does with Windows, stealing my personal details like Google does with its "free" services, or demanding a premium for features already baked into the silicon, as Intel now does.

I left behind the Windows world back in 2005, when I switched over to OS X. I'm looking forward to getting my first Apple Silicon Mac, so I can leave x86 behind, as well. I maintain an interest in the PC industry, partly out of habit, and partly out of morbid curiosity at the chaos. However, other than missing out on a handful of Windows-only games, I'm fine with never having to use their products.
 

Eric

Mama's lil stinker
Posts
11,520
Reaction score
22,239
Location
California
Instagram
Main Camera
Sony
This model has been around for years to some degree. I have built out several server environments in Azure (MS) and always have to factor the cost of CPU, memory, utilization, and downtime into the overall cost, typically using their calculator for more accurate estimates. AWS is this way as well.

It's the world we're moving into with cloud computing and Infrastructure as a Service (IaaS), but it also has its benefits: at the drop of a hat you can boost the CPU, memory, and disk space to whatever you want, and you can save during off-peak times by lowering power consumption or shutting it down. I can't speak to how this will be for average home users, but for corporations it has completely shifted the landscape in just a few years.
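A back-of-the-envelope version of that cost math, using entirely made-up hourly rates (real estimates should come from the provider's pricing calculator), might look like:

```python
# Rough monthly cost sketch for an IaaS VM, using invented hourly rates.
# These numbers are illustrative only, not Azure's or AWS's actual pricing.

VCPU_RATE = 0.04     # hypothetical $ per vCPU per hour
GB_RAM_RATE = 0.005  # hypothetical $ per GB of RAM per hour

def monthly_cost(vcpus, ram_gb, hours_on=730):
    """Cost for one month (~730 hours), billed only while the VM runs."""
    return (vcpus * VCPU_RATE + ram_gb * GB_RAM_RATE) * hours_on

# Always-on vs. shut down outside business hours
# (roughly 12 h/day * 22 workdays = 264 hours)
always_on = monthly_cost(8, 32)
business_hours = monthly_cost(8, 32, hours_on=264)

print(f"always on:      ${always_on:,.2f}")
print(f"business hours: ${business_hours:,.2f}")
```

The second figure is the "save during off-peak times" point: the same VM, billed only while it actually runs.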
 

Runs For Fun

Masochist
Site Donor
Posts
2,057
Reaction score
3,034
Location
Ohio
Fair. I could see that being useful in an enterprise environment. This just seems like a scummy money grab for home use.
 
U

User.45

Guest
I might be totally wrong here, but my impression has been that for the past 5-10 years, Intel's business model has relied very heavily on them simply being the industry leader. Usually that attitude translates into becoming not-the-industry-leader in short order.
 

Runs For Fun

Masochist
Site Donor
Posts
2,057
Reaction score
3,034
Location
Ohio
That would be correct. Though AMD has definitely put the pressure on them since they released their Ryzen CPUs.
 

DT

I am so Smart! S-M-R-T!
Posts
6,405
Reaction score
10,455
Location
Moe's
Main Camera
iPhone
When we (as in the company that acquired us) were very engaged with Sun, they had a model for pay-on-demand systems architecture. Like actual hardware allocation on the fly.
 

throAU

Site Champ
Posts
266
Reaction score
279
Location
Perth, Western Australia
Oh sure, please can I pay for hardware that has been deliberately crippled so I can pay more for it in the future.

I just can’t wait for the next Raspberry Pi processor. The Pi400 is ALMOST comfortable as a generic light-use web browsing/media playback machine; something like a Pi500 I can see being absolutely huge for home users who just want the basics.

I can get by with a Pi400 right now, but it does seem sluggish compared to modern platforms if you’re running full-fat Linux on it instead of Raspbian.

Edit:
I know that’s not directly related to Intel’s shenanigans, but the sooner regular consumers can have viable alternatives to x86 for little money, the better, IMHO.
 

mr_roboto

Site Champ
Posts
307
Reaction score
511
If you read the article, this doesn't sound like something they're going to do on the desktop. It mentions Xeon, and if you've ever shopped for high-spec Xeons and paid close attention to the technical specs and datasheets, it all makes sense.

To expand on that, it's reasonably obvious that Intel frequently designs one Xeon die which ships as potentially dozens of different SKUs. Each has its own unique combination of features fused on/off, cores enabled or disabled, and so on. Intel mostly uses this to charge deep-pocket low-volume customers lots more money. As an example, if you want one CPU to address 4TB of RAM, you'll pay a lot more for the CPU since Intel only enables that fuse bit on very high end models. They all technically have it, but nobody outside the extremely rare handful of customers already prepared to pay hilarious sums for 4TB of server-grade DRAM has to pay extra.

So that's what's been happening for many generations of Xeons - tons of fuse bits, lots of market segmentation. This sounds like Intel just wants to transition that sort of thing over to field-upgradable unlock codes. The customers for these kinds of chips probably aren't going to pay any more or less than before, and they get the ability to pay the difference to upgrade later on in the field with no more than a reboot's worth of downtime. Everyone's likely to see that as a net win.
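As a toy sketch of how that kind of segmentation could work mechanically (a single die whose features are gated by a bitmask, with a field unlock code that just flips bits), using invented feature names and masks rather than anything resembling Intel's actual fuse map or unlock protocol:

```python
# Toy illustration of fuse-bit style segmentation: one die, many SKUs.
# Feature names and bit positions are invented for illustration only.

FEATURES = {
    "max_mem_4tb": 1 << 0,  # e.g. the high-memory-addressing fuse
    "avx512":      1 << 1,
    "extra_cores": 1 << 2,
}

def has_feature(fuse_word: int, name: str) -> bool:
    """Every die physically has every feature; the fuse word gates them."""
    return bool(fuse_word & FEATURES[name])

def apply_unlock_code(fuse_word: int, unlock_mask: int) -> int:
    """A field upgrade just ORs new bits in, no new silicon required."""
    return fuse_word | unlock_mask

budget_sku = FEATURES["avx512"]  # mid-range SKU: only AVX-512 enabled
upgraded = apply_unlock_code(budget_sku, FEATURES["max_mem_4tb"])

print(has_feature(budget_sku, "max_mem_4tb"))  # False
print(has_feature(upgraded, "max_mem_4tb"))    # True
```

The point of the sketch: the "upgrade" changes nothing physical, which is exactly why it can happen in the field with only a reboot's worth of downtime.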
 

throAU

Site Champ
Posts
266
Reaction score
279
Location
Perth, Western Australia
So that's what's been happening for many generations of Xeons - tons of fuse bits, lots of market segmentation.
Yeah and that's crap too.

They already made the processor in its entirety, and they still make money on it even if you don't pay extra to un-cripple the parts they arbitrarily cut off.

Right now Intel needs to be figuring out how to compete, not how to nickel-and-dime the customers they still have.
 

mr_roboto

Site Champ
Posts
307
Reaction score
511
I agree that Intel has a problem with nickel-and-diming their customers. But I have seen a reasonably convincing argument that sometimes the fuse-bit game can be a net benefit to customers.

It's based on the fact that tapeouts and validation are both quite expensive. For any given chip tapeout, architects spend lots of time up front (lots of it before designers even begin work) trying to chisel out the feature list which they think will appeal to as many customers as possible. They need to amortize projected non-recurring engineering and tooling costs across as many units as possible. This is why M1 is both a low end Mac chip and a high end iPad chip, and A14 was an iPhone and low-end iPad chip.

Often a handful of potential customers want esoteric features almost nobody else does. Architects might want to address those requirements by adding them into a broadly similar chip proposal with a wider built-in market, but if the price the special-features customers pay has to be the same as everyone else pays, the economics don't work out.

That's where fuse bits come in. They let the manufacturer charge a higher price for limited-appeal features, and that's what allows them to exist at all. More people are able to buy chips which meet their needs. Everyone's relatively happy, except for some of the people buying the expensive version who haven't internalized that the alternative isn't actually as good as being "cheated".

I don't want to imply that I think this is always a good practice, just saying it's not 100% bad either. You can certainly find lots of places where Intel has used fuse bits in an unreasonable and predatory way, I won't argue with that.
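To make the amortization argument concrete with entirely made-up figures (none of these are real Intel costs or volumes):

```python
# Invented numbers showing why a limited-appeal feature needs a premium.
# None of these figures are real costs or sales volumes.

nre_cost = 2_000_000    # hypothetical extra design + validation cost
niche_buyers = 5_000    # units sold to customers who want the feature
all_buyers = 1_000_000  # total units of the shared die

# Fuse-bit model: only the customers who want the feature fund it.
premium_per_niche_unit = nre_cost / niche_buyers  # $400 each

# No-segmentation model: the cost is spread over every unit, raising the
# price for the vast majority of buyers who will never use the feature.
surcharge_per_unit = nre_cost / all_buyers        # $2 each

print(f"premium charged to niche buyers: ${premium_per_niche_unit:.0f}")
print(f"surcharge if spread over all:    ${surcharge_per_unit:.2f}")
```

Whether a $400 premium or a $2 across-the-board surcharge is the better trade is a market question, but this is the arithmetic behind "the economics don't work out" if everyone must pay the same price.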
 

AG_PhamD

Elite Member
Posts
1,060
Reaction score
987
I don’t know anything about processor architecture, but generally speaking this seems to be how a lot of technology is heading: software-locked/unlocked features.

For quite some time, the navigation in many vehicles was just a very expensive software unlock. BMW had CarPlay as a subscription (up until they came to their senses in 2019). Tesla’s Autopilot programs are of course a software unlock. Their short-range “$35,000” Model 3, which no longer exists, had the same battery as the standard range, just software-limited. I believe there is/was an option to unlock the full capacity after the fact. One of the most ridiculous examples is the Mercedes EQS’s rear-wheel steering as a subscription: $575/year, at least in Germany. All the hardware is in every car, but you have to pay to enable it. I’m not sure if this is still the plan for Mercedes, but it’s incredibly stupid.
 

Pumbaa

Verified Warthog
Posts
2,564
Reaction score
4,220
Location
Kingdom of Sweden
not really. they literally made the chip already. the costs are sunk. if no one pays to unlock they’re still going to make money presumably.
What about the next chips? Will they add limited-appeal features to them if they can’t charge for them to cover development costs and maintain a healthy profit?
 

mr_roboto

Site Champ
Posts
307
Reaction score
511
not really. they literally made the chip already. the costs are sunk. if no one pays to unlock they’re still going to make money presumably.
You're looking at it from the wrong end of things. If Feature X seems expensive to design and validate, the expected increase in sales is small, and architects know they can't charge extra for it, guess what happens? Feature X gets cut before designers even start work. You can simply choose not to invest in things you know won't pay for themselves.

No matter how much it grinds your gears that dark silicon could theoretically be used, it's part of how the chip business works. Most organizations designing complex ASICs do it to at least some extent.
 

AG_PhamD

Elite Member
Posts
1,060
Reaction score
987
doesn’t make it right, and nobody should be accepting this

I absolutely agree.

So long as people agree to these stupid subscription models companies will continue to utilize them.

The other day I noticed Adobe DC requires a subscription to ROTATE a PDF. That is just insane to me.
 

Eric

Mama's lil stinker
Posts
11,520
Reaction score
22,239
Location
California
Instagram
Main Camera
Sony
Almost as bad as Tesla wanting $200 a month for the ability to change lanes with your blinker on, but I digress. :mrgreen:

I'll say I do love my Adobe PS suite subscription, worth every penny to me but I use it extensively. For a casual user it's too much IMO.
 

AG_PhamD

Elite Member
Posts
1,060
Reaction score
987

I have DC Pro through my work, and Preview handles most of what I need for personal use. But if Adobe thinks I would pay $13 a month for DC Basic, or whatever they call it, which can do pretty much everything Preview can do, they’re dreaming.

I’d love for Apple to make a true competitor to Adobe’s Scan + Document Cloud apps.
 