Intel Arc: The meme card of 2022

Drivers can be fixed though; that hardware flaw seems a bit more worrying.

Sure do hope they push through; having it actually cancelled would really suck for us customers.


I kind of argue the opposite, but that's for another thread.
With or without arc, the end customer will benefit.

Why do you think customers will benefit from Arc failing?


To be completely honest, Arc coming out renders a lot of machines e-waste and unusable because of the ReBAR requirement. And BECAUSE of the ReBAR requirement, the driver and card have to have more access to the system, much like an iGPU or APU has. As near as I can tell, ARC itself tries to make your system think it has an iGPU in it, and that's why ReBAR is needed. The card has a lot of built-in systems for direct-access storage like AMD STORMi, and it has a pretty radical ray-tracing engine, which is cool at least, and I have seen die-shot leak comparisons. But the underlying structure of the card makes it kind of useless to me.

I can't think of another GPU that has come to market that legitimately could only play League of Legends on launch, but ARC seems to nail that. I could even run GRAVIS targets on a GT 1030 when that launched, and THAT thing was HORRIBLE until they got the voltage corrected. Intel can't even keep FreeSync or VSync from artifacting, let alone improve Intel Quick Sync, something that has been around since the Gen team got a new injection of people with the HD 4000 series.

But for ARC to fail would mean that Intel will either:

A) Stop making bad APUs
B) Go harder on GPU design
C) Stop making shitty GPU drivers
D) Stop making GPUs altogether and focus harder on the APU game if they want GPUs at all

All of these are good things. I think ARC will be cancelled because, market-wise, it's a massive hole for them to engage with the consumer market.

Nvidia is already on its way out of the consumer market, and will be gone by the 5 or 6 series if their projections stay solid, and it's literally for the same reason that Intel can't even get into the market. There are too many things going on on the desktop side that they could simply replace with the server market and make even more money.

Edit: The reason AMD can hold on better is that they have FOSS'd everything. So if some crazy person finds a broken thing, such as me testing AMDGPU on an R7 370 back in the day (I half suspect my data was used for the Valve APU driver), it feeds back in: much of their stack has been taken by other manufacturers, edited, and sent back, and AMD took those edits and re-init'd them, so they are on a steady building course for success. They took note of market changes and adapted, where others are oddly still stuck in the '90s.

If you want a past example of this, IBM ditched the consumer market just to stay open as a business. They lost a lot of money on the AIM exercise, but GAINED access to NXP. NXP took their desktop designs and manufactured them for whoever wanted them, but when was the last time you had an NXP desktop? Never.

This is the same movement on a different market.

To be honest I can’t work out the purpose of this thread. Are you actually announcing that ARC is being discontinued already (I can’t find any source backing this up), or are you just gloating that the first generation of a low end product isn’t as good as more expensive products from companies that have been in this business for 30 years?


As if they aren’t doing that already?


The funny thing is that’s way more true of Intel than it is of AMD. Intel’s Linux driver is far more open and integrated into the Linux kernel than AMD’s is.


A combination of both. Reading through the OP, it's a rumor/reaction post gauging the community's reaction.

Overall, the general consensus seems to be that we don't want Intel to pull out of Arc.

Original source

Rumors mostly, but with it circulating around the press, it seems to have lit a fire throughout the company


Since when did we start taking a word Moore’s Law is Dead said seriously?

Intel are not going to pull out of Arc. The entire reason Arc exists is that Nvidia have a monopoly on datacentre GPU compute and are now producing entire systems of high-performance ARM CPUs and Tesla GPUs. Intel have been asleep at the wheel for years now, and Nvidia are not going to stop at eating Intel's lunch; they are moving on to their breakfast and dinner too. Intel are not going to go back on this just because their lowest-end, first-generation gaming GPU has teething problems.


There was the Asus Zenfone 2 and I think some Orange-branded Alcatel or something that ran Intel Atom and an x86 version of Android. In all actuality, both were good phones. I wish they weren’t axed.

Wrong. They only FOSS'ed their GPU drivers, and those still require blobs that operate with Mesa. As for other things like GPU partitioning, AMD claims they use open standards like SR-IOV, which is open, but when you dig deeper, the only people who have access to the documentation and drivers are big enterprise customers.
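For what it's worth, the SR-IOV part of this is checkable on Linux without any vendor documentation: the PCI core exposes a `sriov_totalvfs` file in sysfs for devices that advertise the SR-IOV extended capability. A minimal sketch, assuming the standard sysfs layout (the scan root is a parameter so it can be pointed at a test tree; whether a consumer GPU actually exposes the capability is exactly the lockout being described):

```python
# Minimal sketch: list PCI devices advertising SR-IOV via sysfs.
# Assumes the standard Linux sysfs layout under /sys/bus/pci/devices;
# the root is parameterised so the scan can be run against a fake tree.
from pathlib import Path

def sriov_capable_devices(pci_root="/sys/bus/pci/devices"):
    """Return {pci_address: total_vfs} for devices exposing SR-IOV.

    The PCI core creates `sriov_totalvfs` only for devices that
    advertise the SR-IOV extended capability, so its absence on a
    consumer GPU means no partitioning from the host's point of view.
    """
    capable = {}
    root = Path(pci_root)
    if not root.is_dir():
        return capable
    for dev in sorted(root.iterdir()):
        vf_file = dev / "sriov_totalvfs"
        if vf_file.is_file():
            capable[dev.name] = int(vf_file.read_text().strip())
    return capable
```

On most desktops with a consumer GPU this returns an empty dict for the VGA device, which is the poster's complaint in one line.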

Moral of the story: don’t ever fanboy, none of the RGB companies are your friend.

Also, Intel's drivers are probably the best on Linux, funnily enough; it's just that their hardware sucks, because… duh, iGPUs. I really hope Intel comes back later with consumer dGPUs again. It would make a ton more sense to have integrated solutions though, like the Hades Canyon NUCs that had Intel 8th-gen CPUs and AMD Vega GPUs with HBM on the same package. If they can pull off an integrated solution like that that can sip power, although that is a bit hard to believe given their past few CPU generations, they would make a good buck on those.


Gamers Nexus covered it in their news lineup.

I will grant you this: I favor AMD for my work, since I have fewer issues with their hardware than with Intel's.


Well no shit, they actually make the hardware we’re talking about and sell it outside of China…


No, I mean straight up I have had i7s shit the bed where a 955BE kept going and just didn't care all that much. It set my expectations higher.

Eh, I've used both when they've been compelling. I had an i7 3960X from 2011 until 2019, then got a Ryzen 9 3950X that I've had ever since. They've both had aspects I liked and didn't like. I didn't like that in 2019, for the same price as the i7 and ROG IV Extreme, I took a platform downgrade, but the CPU itself is fine.

IMO, with what I saw ARC do, what I have seen Gen-team GPUs do with beta drivers, and thinking of Larrabee (literally put a Xeon on a PCB and said "Look, a GPU!"), yeah, both are compelling for reasons in their own time… This just looks… bad.

Not really. From what I’ve seen it’s a decent low end GPU with a compelling approach to machine learning and ray tracing, and shit drivers that should have been held back and finished for a couple of months. I think it would have been perfectly acceptable to ship it with the bare bones functional drivers (the bit that currently works), and hold off on the additional features until they’re ready. I don’t care about the ReBAR requirement because that’s the way the industry is going. You can’t buy a GPU that doesn’t use ReBAR now. If you’re using a PCIe 2 CPU then just buy something else, otherwise it’s not really an issue.
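As an aside, whether a card advertises the capability in question is easy to verify on Linux: `lspci -vv` prints a Resizable BAR capability block for devices that support it. A minimal sketch that scans such a dump (the sample text below is illustrative, not captured from real hardware; depending on the lspci version the line reads "Resizable BAR" or "Physical Resizable BAR"):

```python
# Minimal sketch: detect the Resizable BAR capability in `lspci -vv`
# output. SAMPLE below is illustrative, not from real hardware.

def has_resizable_bar(lspci_vv_output: str) -> bool:
    """True if any capability line mentions Resizable BAR."""
    return any(
        "Resizable BAR" in line
        for line in lspci_vv_output.splitlines()
        if line.lstrip().startswith("Capabilities:")
    )

SAMPLE = """\
03:00.0 VGA compatible controller: Example GPU (rev c1)
\tCapabilities: [200 v1] Physical Resizable BAR
\t\tBAR 0: current size: 256MB, supported: 256MB 512MB 1GB 2GB 4GB 8GB
"""

print(has_resizable_bar(SAMPLE))  # -> True
```

Note this only checks the device side; whether ReBAR is actually active also depends on the motherboard firmware exposing the setting.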

It’s worth remembering that you’re basing all of this on the lowest of the low end GTX 1650 equivalent.

Mmmm, not really. If I was basing it only off the A380 then I'd say so. It's BOTH the A380 and the A770.

Also, I'm not even thinking of PCIe 2. I wouldn't even bother with PPC machines, where you can tweak the PCIe controller because it's a big-ass FPGA. I mean I couldn't get one and put it in my X79 desktop today and have it work. At all. And up until 6th gen, Intel didn't even put ReBAR in the firmware.

Also, I haven't heard about ReBAR being REQUIRED for the card to work on other systems. I used a 6600 XT in a Core 2 machine at the start of the year; no requirements there.

Yeah, maybe I'm expecting more, but if it's a launch, wouldn't anyone? It just kinda shows my point about e-waste and stupid investments. It's a waste of the world's silicon supply if they spin up the Battlemage chips; unless they can make them work like an ACTUAL GPU and not masquerade as an iGPU, I won't give a shit.

I restrict my judgement to products that exist and have been tested.

I’m not seeing the ewaste argument at all. There are many other things you can get in place of this if you are on such an ancient system. I also don’t really care if you give a shit or not. Preliminary testing from GN shows disproportionate performance increases as resolution increases. If this is the result of the obnoxious ReBAR requirement and scales to the high end, I’m not going to cry over it. The irony is that this seems to be what AMD promised with SAM and didn’t really deliver.

… No. If you have a system that does not support this feature, the card still functions, just not as well as it should. That system will be so old that by the time ReBAR is mandatory for high-end gaming, your old 8-core CPU will be obsolete for that in any case.

If you are a medium-settings pleb, the feature won't matter anyway.

Congrats on the clickbait title making me click I guess.