Another BIG Intel-specific FUBAR: "SPOILER" speculative memory subsystem vulnerability

CISC vs RISC is an interesting discussion and maybe subject for another thread.

1 Like

Sure! All the responses and discussion we’re having are really interesting. I’m also thinking about making it the topic of my thesis for my degree in computer science and automation, so I’m looking into how to make that work.

2 Likes

You did ask so…

ACHKUALLY! Office 2019 is the latest.

Sorry I could not resist.

2 Likes

All the Windows tablets dried up once Intel stopped doing contra deals with manufacturers for the Atom CPUs.

They basically only had to pay for each one sold rather than pay upfront for all the CPUs.

The problem is that, thanks to all those clever “Intel Inside” stickers on every single box, Intel has become synonymous with the PC. It also didn’t help that AMD wasn’t worth a shit until VERY recently. Part of that is due to Intel shitting all over them in a myriad of ways, but the common layperson can’t be bothered. They just see Intel and go “oh, that’s Intel, it must be good, Intel is the bomb,” even as they’re getting hosed six ways from Sunday. It’s going to take time for AMD to make inroads, probably at least a decade… but they are doing it.

I guess we all have a neckbeard deep inside that we try to keep chained. I can’t blame you for letting him out this time.

It made me laugh too haha

1 Like

[quote=“MetalizeYourBrain, post:22, topic:139735, full:true”]

Well, IIRC, Apple is rumored to be switching to ARM next year. If that turns out to be true, I guess we’ll find out if their “ship an emulator in the OS” approach is still viable.
[/quote]

It has been rumored for quite a while, I think, even before the launch of the iPhone X. Then they came out with the A12 and its impressive benchmark scores, and people blew up the possibility of them switching to ARM. I don’t think it’s going to happen until they unify the iOS and macOS kernels, in my opinion. Also, making that switch “today” would mean migrating the iPad Pro line to fully fledged PCs, or upsetting users. So maybe they’re still working out how to differentiate the product stack too, making one SoC for computers and one for phones and tablets. It’s pretty hard to see that far into the future, but surely it’s not something they’ll do soon, in my opinion. What do you think?

1 Like

iOS and macOS (as well as watchOS and tvOS) already share a kernel. The higher-level APIs are different, but I believe the really low-level stuff is all in a common codebase.

Someone generally reputable (Wall Street Journal? I don’t remember) said a year or two ago that they’d switch in 2020. I can’t decide if I wish they’d start the process “early” with this year’s Mac Pro, or keep putting it off. I kinda wish they’d do a generation or two of Ryzen-based systems, simply because then they could offer ECC RAM across the board.

I’m still not even sure I think it’s a good idea. Being able to run Windows is worth something to people who aren’t 100% sure they want to use macOS, and switching to ARM probably will kill that option.

I wasn’t aware of this, so what I said about the kernel stuff really doesn’t make sense.

I don’t know if they’ll switch to AMD. I’m sure they made a deal with Intel, so they’re bound to buy a certain number of CPUs per year. And if they didn’t, I think it would cost more to get out of the deal than they’d “save” with AMD.
Also, I think AMD uses thicker wafers and substrates, so their MacBooks would be thicker; no way they’ll do that haha

2020 still looks too early to me, if I had to guess based on what I think of the matter. Maybe I can see the MacBook (the one-port thing) being the first “victim” of the switch.

Oh I have no doubt they made a deal with Intel, it’s just that the details aren’t public.

I would view thicker laptops as a feature, not a bug. OTOH, aside from cellphones, Apple has made it pretty clear that I’m not in their target demographic anymore WRT hardware (aside from the as-yet unrevealed new Mac Pro, seeing as how there’s essentially no “actionable” public info on it).

The 2020 timing does seem a bit off. The “problem” (to the degree that it is one) is Intel will probably (hopefully?) have their 10nm process sorted out by then, which means faster, more power-efficient CPUs, which means less of a reason to switch. Going to AMD/ARM about a year ago would’ve made sense (assuming all the tech would’ve been ready on Apple’s end), as does doing it when Intel’s next big stumble comes along. But 2020 looks like it might be when we’ll find out if Intel just had a bad run or really has lost their prowess. Or maybe Apple knows something we don’t, and 2020 will turn out to be the perfect time to leave x86, or at least Intel.

(Personally, I kinda hope that if they leave the x86 arch, they switch to something other than ARM… it’s not often that an industry leader changes something as fundamental as this, and ARM feels like “more of the same” to me, which means that they’d be missing an opportunity to do something revolutionary WRT CPUs. Dunno, maybe it’s just me.)

I can’t recall in what order they switched models in the past… I remember that trouble specifically with mobile PPC chips is why they switched to Intel, so it was probably laptops first for that transition. The 68k to PPC switch was so long ago that I don’t remember much at all WRT hardware (my family wasn’t in the market for a new computer at the time).

Everyone would, seeing how their butterfly switches are performing and the terrible thermals their products have. But Apple has a design policy, and I highly doubt they’ll just go back easily unless they find something that could overshadow the thickness increase (see how they stopped caring about phone thickness at one point, and it’s no longer a marketing point).

ARM is the dominant RISC architecture, and I don’t see anything bad with them using it. Fortunately, it’s not like ARM is the one making SoCs; they just license core designs, so there’s room for competition. There’s Qualcomm, Apple, Samsung, MediaTek, Allwinner, and others in the market, which is really good in my opinion.

I think “faster” will not be a feature of new processors for a long time. I think they’ll try to solve the architecture’s flaws: they’re improving the scheduler (IIRC, and that goes hand in hand with most of the vulnerabilities found recently), and they’ll be more efficient. I agree with you that the 10nm CPUs from Intel will be the next step for sure by the end of 2019/early 2020.

1 Like

Yeah, there’s nothing wrong with it, there just isn’t (AFAIK, anyway) much that’s innovative about it.

Since advances in CPU tech big enough to single-handedly warrant changing archs are pretty rare, since an arch not supported by a major OS is unlikely to get any traction outside the embedded market (assuming it’s appropriate for embedded applications), and since CPU tech is so expensive to develop, it’s important for the field to take advantage of any opportunity to get ISA-level advances into the market.

I mean, say you have some idea for increasing security by treating pointers as distinct from unsigned integers (I know runtimes already do this, but AFAIK none of these manipulations are done with hardware acceleration). You can’t really tack that onto x86 or ARM, because it would break their ISA… they wouldn’t be x86 or ARM anymore. So if you’ve come up with an idea for improvement in that area, the window on getting it widely adopted any time soon (and before the patents expire) has probably already closed. Assuming the rumored time frame is correct, I’d be shocked if Apple would change such a fundamental design decision as having a “pointer unit” in addition to the ALU and FPU. There just isn’t time to get the hardware correctly designed and their software rewritten before this is supposedly going to ship. (And if there’s one area where the industry needs improvement at the moment, it’s security.)

Or to say it another way, and continuing with the security-themed hypothetical, my complaint is not that ARM has worse security than x86 (because AFAIK it doesn’t), it’s that we’ve ostensibly learned a lot about how computers can be attacked since ARM’s fundamental ISA was designed, and switching to ARM now blows an opportunity to make breaking hardware changes to how the ISA works to address these issues.
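To make that pointer-tagging hypothetical concrete, here’s a minimal sketch in C of the kind of manipulation runtimes already do in software today (the tag layout and helper names are made up for illustration); the hypothetical “pointer unit” would do this masking and checking in hardware instead:

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical sketch: stash a type tag in the (normally unused) top
 * byte of a 64-bit pointer, the way some language runtimes do in
 * software. TAG_SHIFT and the tag values are invented for illustration. */

#define TAG_SHIFT 56
#define TAG_MASK  ((uint64_t)0xFF << TAG_SHIFT)
#define TAG_PTR   ((uint64_t)0x01 << TAG_SHIFT)

static uint64_t tag_pointer(void *p) {
    return ((uint64_t)(uintptr_t)p & ~TAG_MASK) | TAG_PTR;
}

static void *untag_pointer(uint64_t v) {
    /* A hardware "pointer unit" could trap here instead of returning NULL. */
    if ((v & TAG_MASK) != TAG_PTR)
        return NULL; /* not tagged as a pointer: refuse to dereference */
    return (void *)(uintptr_t)(v & ~TAG_MASK);
}

int main(void) {
    int x = 42;
    uint64_t tagged = tag_pointer(&x);

    int *p = untag_pointer(tagged);
    if (p)
        printf("%d\n", *p); /* prints 42 */

    /* A plain integer that merely looks like an address is rejected. */
    if (!untag_pointer((uint64_t)0xdeadbeef))
        printf("rejected non-pointer value\n");
    return 0;
}
```

The point being: done in software, every one of those masks and checks costs real instructions on the hot path, which is exactly the sort of thing a clean-slate ISA could make free.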

(I’m reasonably certain the same argument could be made WRT performance, too… Everyone’s trying to figure out how to get x86/ARM’s wheels to spin faster instead of figuring out how to fly.)

Clean-slate approaches may have a better outcome, but the risk they come with makes them a financially questionable decision.
I agree that x86 needs to get replaced; AMD won’t be able to keep pushing cores per area forever.

Which is why it’ll probably take something like an Apple pushing it for such an arch to have a chance.

Otherwise, a single design that can out-perform the equivalent x86/ARM CPU is probably already a tall order… but I’m not sure an ISA that can continue to do so, and with maybe 1% of Intel’s R&D budget (if they’re insanely lucky), is even possible. Yet it’s what we need for the industry to advance beyond design decisions made in the 70s & 80s.

This is why, while I might not mind Apple moving away from x86 (if the perf is there and they can get the transition tech right), I’m disappointed that they appear to be embracing ARM. It’s a missed opportunity for the entire industry, and one that probably won’t come around again for a long time.

Apple is running a walled garden.
That would be like Sony switching their console CPUs to OpenPOWER.

Eh, sorta? Backwards compatibility isn’t expected as much on consoles, and you can’t hide lower performance (due to emulation) by saying the PS5 plays PS4 games as well as a “2014 PS4” (or whenever they first came out), because all the PS4s have the same CPU.

OTOH, consoles have a long enough life cycle that it might make sense to assume a new arch can emulate the old one at native speed.
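(As an aside on why emulation costs performance in the first place, here’s a toy sketch in C with a made-up three-instruction guest ISA, nothing like any real console emulator: a pure interpreter pays a fetch/decode/dispatch cost on every guest instruction, on top of the instruction’s actual work, which is why “native speed” usually means either a much faster host CPU or translating guest code to host code instead of interpreting it.)

```c
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Toy interpreter for an invented 3-instruction guest ISA, just to
 * show the per-instruction dispatch overhead an emulator pays. */

enum { OP_ADDI, OP_PRINT, OP_HALT };

typedef struct { uint8_t op; uint8_t reg; int32_t imm; } insn_t;

int main(void) {
    int32_t regs[4] = {0};
    const insn_t program[] = {
        { OP_ADDI,  0, 40 },  /* r0 += 40 */
        { OP_ADDI,  0,  2 },  /* r0 += 2  */
        { OP_PRINT, 0,  0 },  /* print r0 */
        { OP_HALT,  0,  0 },
    };

    /* The fetch -> decode -> dispatch dance below runs for every single
     * guest instruction, in addition to the instruction's real work. */
    for (size_t pc = 0; ; pc++) {
        const insn_t *i = &program[pc];
        switch (i->op) {
        case OP_ADDI:  regs[i->reg] += i->imm; break;
        case OP_PRINT: printf("r%u = %d\n", (unsigned)i->reg, regs[i->reg]); break;
        case OP_HALT:  return 0;
        }
    }
}
```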

Maybe not forever, but it’s going to take a decent bit of time for coders to catch up with the core wars.

In fact, Intel’s “answer” looks much the same: you paid a fortune for our shareholders’ CPUs, our engineers were dumb enough not to thoroughly test a speed-up solution, now your performance can be impacted by up to 33%, but we don’t give a damn (and you’ll gleefully open your wallet for the next generation, which will have even worse flaws).

I’m far from representing a very large account, but in my little corner of the world, I’ll do my best to make my next purchases go to AMD (which has been crystal clear about those flaws). ’Cos these people only understand one thing: money (OUR money!)

1 Like