X99/Haswell-E is a scam, and here's why

Sorry, I want to rant for a moment.

 

  • Haswell-E desktop CPUs

Here's the problem: no one needs them. Gamers do not need them, most professionals do not need them, and your average consumer certainly doesn't need them. Let's pretend that you're in the 0.01% of people who do things that are actually bottlenecked by the CPU:

$400 gets you the entry-level 6-core part, and $1000 gets you the 8-core part.

AMD's Opteron offerings in the $400-$500 range will blow away all Haswell-E desktop CPUs. I can pick up a 16-core, 2.7GHz Opteron right now for $425. (bit.ly/VZQQpg) That's a no-brainer.

  • DDR4

Literally useless to anyone who isn't managing a datacenter. All you get are incremental clock-speed increases and lower voltages. Unless you're running an APU/GPU-on-CPU-die, RAM speed (and by speed I mean overall throughput, not the clock rate) has been completely irrelevant for a long time now. There hasn't been a real-world improvement from a new RAM generation since DDR2.

  • X99

Really nothing special. They're expensive boards with features that you'll never use and that will most likely never gain market share. (Read: Thunderbolt 2)

 

We're plateauing in the HEDT sector. Our desktops are powerful enough to do anything we want to do with them, and R&D for new, slightly better hardware is too expensive to justify for the tiny market segment that would utilize a little extra power. It was a fun ride, but it's over.

The thing to take away from this is: don't be a mindless consumer. That seems to be the only thing hardware geeks are nowadays. Surfing the internet and watching cat videos aren't going to be any better or faster on $2000 desktops, guys. There are other shiny toys that are much better buys.

If you want to be a tech enthusiast, channel your energy to something that matters. Get into Linux, learn a programming language, etc.

I'm sure I sound like a hater or whatever, but truth be told, I'm actually in the tiny market share that does benefit from this stuff. I encode videos with x264 on a weekly basis, so I can use all the CPU power I can get. However, I'm not stupid enough to spend $1800 so my encodes finish 10% faster. I really, really wanted to like X99/Haswell-E. I was hyped for this platform for so long, only to have my hopes and dreams crushed and ruined.

I hope the platform flops so Intel will at least price their next HEDT offerings at a more plausible level. That's a pipe dream though, as I think Intel knows that the only people buying this stuff anymore are sheep who will always buy the latest and greatest regardless of price or personal benefit, so they're probably better off just milking that cow. Oh well, maybe in a few years people will learn.

That is all.

Personally, I am excited to have more than 16 PCI-E lanes (how is AMD coming along there?). Was this written by AMD, or just fanboys? There are a lot of upgraded things on this new chip, and it's the first one I have been excited about in several years. The third and fourth i-series refreshes weren't that big, and not really justifiable upgrades in my opinion; I am still using Sandy Bridge. BUT the X99 offers a LOT to EVERYONE, and as I said, to me, the extra PCI-E lanes are going to be HUGE.

You claim that X99 offers "a lot to everyone", but didn't even give one example of that being true.

More than 16 PCI-E lanes. That was ONE example...

 

I agree, the CPU itself, a few extra threads, etc. is blah blah blah... no HUGE improvement in and of itself, but the addition of more PCI-E lanes offers more expandability. You won't drop the bandwidth of your expensive GPU just because you added in a sound processor, h.264 encoder, RAID card, extra NIC, or whatever; on a 16-lane CPU, adding even a second GPU HALVES the bandwidth available to each card. The X99 platform allows up to 40 lanes, enough for 2 full x16 slots plus eight more lanes for other things. It also has built-in native USB 3 controllers and more SATA 6Gb/s ports (10 rather than 2), meaning you can run more drives without a secondary Marvell controller, or run a larger RAID array without a secondary controller that could cause conflicts. At the moment, DDR4 isn't much of an improvement, but the higher density and the way the modules communicate has the POTENTIAL to make a huge difference once it starts being utilized by software developers. This is the first generation of chipsets and CPUs from Intel in about 4 years that WILL make a difference in real-world applications, unlike the last 3 generations (which I am in total agreement with you about).
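
To put rough numbers on that, here's a quick back-of-the-envelope sketch. PCIe 3.0 moves roughly 985 MB/s per lane per direction; the slot widths are just the configurations discussed above:

```python
# Rough PCIe 3.0 bandwidth math: ~985 MB/s per lane, per direction.
MBPS_PER_LANE = 985

def slot_gbps(lanes):
    """Approximate one-direction slot bandwidth in GB/s."""
    return lanes * MBPS_PER_LANE / 1000

print(slot_gbps(16))  # ~15.8 GB/s: a single GPU at full x16
print(slot_gbps(8))   # ~7.9 GB/s: per GPU when a 16-lane CPU splits to x8/x8
print(40 - 2 * 16)    # 8 lanes left over on a 40-lane part after two x16 slots
```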

I think the X99 platform is too spendy for most people ... maybe after prices drop.

Ohh, the ignorance is strong with this one. Do you know what the term efficiency means? I don't think you have any grasp of how it applies to X99. X99 is a platform for people who NEED lots of power. How would that AMD Opteron compare to a 5960X that's been overclocked to 4.3GHz (which would be a bad overclock)? The Intel architecture is much faster than the Opteron, which is locked at 2.7GHz. DDR4 memory is huge for people who want to be efficient. The higher densities allow for more work to be done faster (and thus paying less for workers and allowing a worker to be more productive), and the faster speeds make rendering much faster. You realize there are lots of people who edit and render 4K video? SURPRISE, there's a crap ton. Movie makers, people who work with 3D software, scientific researchers, etc.; all people who will greatly benefit from X99. I don't know what h.264 encoding you do, but obviously it's not at the scale a professional does. I don't know what system you have, but your "10%" number is pretty far off from what I see. 2 extra cores and 4 extra threads over the old X79 stuff, plus a 5% faster architecture, is going to be a lot more than 10% faster. Also, you realize these are not gamer-targeted chips, right? The strict gamer doesn't need more than a quad core and 8GB of RAM. A gamer who does make videos or do work-related things with his system is going to benefit from the extra cores.
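
As a sanity check on that claim, the idealized math works out like this (assuming near-linear scaling for a well-threaded workload, which real encoders only approximate):

```python
# Idealized speedup going from a 6-core X79 chip to the 8-core 5960X,
# plus a ~5% architectural (IPC) gain. Real workloads scale sublinearly.
old_cores, new_cores = 6, 8
ipc_gain = 1.05

speedup = (new_cores / old_cores) * ipc_gain
print(f"{speedup:.2f}x")  # ~1.40x, i.e. ~40% faster, well above 10%
```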

Oh, and the chipset improvements over X79 are big. Have you seen the amount of USB 3.0 some of these boards have? It's insane. There are professionals who legitimately need every one of those ports. There are professionals who will run extensive RAID arrays on the chipset and legitimately need that much SATA. You can argue that they should use a RAID card, and you'd be right, but some don't want to shell out another $1000 for a card that is going to use up valuable PCI-E space. SATA Express and M.2 are going to make loading programs and large files much, much faster. You say you do h.264 encoding. Well, if you're taking files from a hard drive, think of them loading 10x faster. Now scale that to 5-10GB files. The time saved is a noticeable long-term difference. Say you save 2 minutes every day between loading files and booting faster. That's probably the low end of the equation, to be honest, but it works for my example. 2 minutes every day in a 5-day work week scales to 10 minutes saved every week. Scale that to a month and you save 40 minutes. Scale it to a year and you get 480 minutes saved. Do you know how much can be done in 8 hours of work? That is way more profit than the up-front cost of the more expensive M.2 or SATA Express SSD. We can then go ahead and scale it to 2-3 years if not more; professionals do run their machines until they have to upgrade. And that is the low end of the calculation, considering most professionals in the editing or 3D field work 6-7 days a week.
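
Spelled out, that arithmetic looks like this (the 2 minutes/day is the assumed low-end figure from above):

```python
# The time-savings arithmetic above. The 2 min/day input is an assumption.
minutes_per_day = 2
per_week  = minutes_per_day * 5     # 10 min over a 5-day work week
per_month = per_week * 4            # 40 min
per_year  = per_month * 12          # 480 min
print(per_year / 60, "hours/year")  # 8.0 hours, a full working day
```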

Except... The equivalent 8-core Xeon is ~$2000.

You seem to be claiming that because it's very expensive and it has a lot of features most people won't use/need, it's a scam.

I'd agree that it's generally not worth it for your 'average' user, but that doesn't make it a scam.

Thanks for that, that added SO MUCH to what I was trying to say.  :)

That really depends on how you define your "average" user as well. If you are referring to those who use a PC for web browsing, the occasional word processor, and music, then yes, this new chipset doesn't really offer much, and they would be perfectly fine with AMD offerings because they wouldn't notice much difference between AMD and Intel. The problem is, the "average" user is starting to become more than that, and is doing more and more on their computer. There are more and more people doing 3D rendering, video encoding, and streaming, and using several devices, both internal (add-in cards and secondary processing units) and external, all needing the extra available bandwidth. As I stated earlier, just the addition of more PCI-E lanes is a massive improvement in the available bandwidth in the system, and will very possibly be used by your "average user" (depending on how you define the more and more demanding average user).

I think the usage of the word *scam* by OP carries too many negative connotations. What I think OP is trying to say is: "Intel touting 8 cores as a revolution is overhyped, and of no benefit to the average consumer." But as everyone has stated, Intel is marketing this (and rightfully so, IMO) to video/3D encoders who need that extra speed.

But then again, Intel's marketing team has always been on point. "Scam" wasn't the right word, IMHO, but I think OP made excellent points. I'm currently trying to learn Java and Linux.

People have money, let them spend it how they see fit. Corporations/businesses have to market, make money.

I think Intel's marketing team has always been something else. I have to give it to them; how many people upgraded through all four generations of the i7? At the same time, it's not a scam; there were some improvements (though marginally debatable). This time even Wendell agreed that this X99 platform is the next 5-year platform. But it is still expensive, and I feel the common frustration in the air.

True, I was using "average" user to refer to gamers who don't buy super high-end and then just people who do general web browsing and such.

I suppose the 40 PCI-E lanes really help with four-way SLI/CFX... and if you're spending $1k on a CPU, you can probably afford four high-end graphics cards.

How many PCI-E lanes did X79 have?

Thanks for your detailed response. As for Haswell-E versus Opteron, all I know is that for my case (x264 encoding), the Opteron is a better buy.

I'm oversimplifying your argument here, but you are basically saying "There are people out there who can utilize this!"

I'm aware. My argument is that 99.99% of people WON'T, and even the majority of people who do buy into X99 will not utilize it either. Most people, even in the computer-enthusiast world, don't need >2 graphics cards, an extra NIC/[insert high-speed bus here], etc. That's what I am trying to convey.

 

This is off-topic, but if you want to get into the intricacies of h.264: I use settings around --preset placebo plus a myriad of AviSynth filters on 1080p content. Also, unfortunately, there aren't lots of people creating and editing 4K video. I wish there were, but the equipment to capture 4K levels of detail (not just 4K resolution) is extremely expensive. I have yet to see any 4K content besides test footage that actually utilizes the resolution, with the exception of TimeScapes.
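
For anyone curious, an invocation along those lines looks roughly like this. It's only a sketch: the input script name and CRF value are placeholders rather than my exact settings, and it assumes an x264 build with AviSynth input support:

```python
import subprocess

# Sketch of an x264 call in the spirit described above.
# "input.avs" (an AviSynth filter script) and --crf 18 are placeholders.
subprocess.run([
    "x264",
    "--preset", "placebo",  # slowest preset: best compression, glacial speed
    "--crf", "18",          # constant-quality target (placeholder value)
    "-o", "output.mkv",
    "input.avs",
], check=True)
```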

Anyway, you're right about there being a real-world advantage for us video editors/encoders. If you're in that category, like I said, there are better high-end CPUs at that price point. In my case, video rendering is generally an overnight/off-hours thing, so whether it completes at 4am or 4:08am doesn't matter to me.

 

As for the title, it's definitely more nefarious-sounding than what I'm actually trying to convey. It's just an attention grabber =P

Intel really outdid themselves with Nehalem. IMO, Nehalem is probably "the 10-year platform" more than Haswell-E is "the next 5-year platform".

X79 CLAIMED to have 40... BUT they weren't actually usable, because the CPUs for that chipset didn't support it; they were still limited to 16. The X99 chipset and 2011-3 CPUs DO support it. I have found that it's not that unusual to run multiple GPUs AND a hardware h.264 encoder (and, in my case, a sound processor), and the available bandwidth in such a setup was a bottleneck that the new chipset and CPUs will eliminate.

Well, you can't assume 99.9% of people won't use this technology to the fullest, because you do not know everybody, and you cannot speak for everyone. That goes for your argument about Thunderbolt, DDR4, and the existence of Haswell-E chips, and for your comment that not everyone needs more than 2 GPUs (dude, there are people who game in 4K out here; 1 bloody GPU is not enough for 4K). Everyone has different uses for different cases. Also, I hope you are aware that Opterons are server parts, and their motherboards are meant for servers, while X99 and X79 are built for the desktop? (There are server-grade motherboards on X79, though, and X99 server boards will be coming too.) Also, there is a reason Intel has dominance in the server space and in the workstation space: they get the bloody job done in a short amount of time, and they get the job done well. AMD has proven they don't care about the enthusiast market, with the lack of a true performance CPU in years, the re-release of the exact same CPUs with different clock speeds and different TDPs, and the lack of a true workstation platform.

It was really the Nehalem Xeons, though. I am still running some dual-socket, high-clock-speed Nehalems that cost a pretty penny back in the day (for server stuff). They run hot enough that it really makes me cringe; the cost to move to 2011 E5-2630s or 2540s would even out in probably 1 year, taking into account the electricity cost savings. We got some E5-2620 Xeons (6 cores!) for about $350 each that you'll see in an upcoming video. Man, those are sweet. Even in the context of Haswell-E they're sweet, except for the gimped clock speed; Intel has been super careful not to take anything close to 3GHz unless you're paying a premium. A 12-core/24-thread dual socket on those "super deals" is probably a better deal for the same money as Haswell-E, but it's an older platform, and Haswell-E's strength is in low-thread-count optimizations, e.g. it screams when you're only using a few cores. It's not like that on the first- or second-gen Xeons (the E5 XXX V2 parts you see). It'll be interesting to see what happens with the V3 Xeons. Probably just two more cores and maybe a slight clock bump for the same money. Meanwhile, it's unlikely Intel will discount the old V2 chips, BUT there will be vendors looking to liquidate. That's what happened with our gen-1 2011 Xeons: an OEM was liquidating and we got in early.
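
That roughly one-year payback sketches out plausibly if you put illustrative numbers on it. Everything below is an assumption for the sake of the math, not a measurement; the ~$350 chip price is the only figure from the post:

```python
# Illustrative break-even estimate for swapping hot Nehalem boxes for
# newer E5 Xeons. All inputs are assumptions except the ~$350 chip price.
delta_watts  = 300        # assumed drop in dual-socket system draw
kwh_price    = 0.15       # assumed electricity price, $/kWh, 24/7 duty
pue          = 2.0        # assume cooling roughly doubles energy cost
upgrade_cost = 2 * 350    # two ~$350 E5-2620s

yearly_kwh     = delta_watts * 24 * 365 / 1000   # ~2628 kWh saved
yearly_savings = yearly_kwh * kwh_price * pue    # ~$788
print(f"payback in {upgrade_cost / yearly_savings:.1f} years")  # ~0.9
```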

 

Linus had that video on base-clock overclocking his fancy 12-core Xeon. Could you maybe do something like that for those 2620s? I don't really know that much about Xeon base-clock shenanigans xD

I will state up front that I have not read the replies; this is meant as my response to the OP.

Spending thousands of bucks on super-hyper-omega hardware is just throwing money into the mud; from a personal perspective, that is true. Since the price will come down to 15-20% of what it is today within the next 2-3 years, and even further after that, there is no point in buying the newest stuff unless your pockets are deep enough, and there is no point in building "future-proof" PCs either. New stuff is always filled with bugs, mistakes, etc. that no one knows about yet; it's like a minefield.

Now, X99/Haswell-E is not a scam... though it's obviously hard to understand why you would use the Haswell platform to create enthusiast-grade CPUs when Haswell was designed as a mobile-oriented architecture. People pay for goods whatever they are willing to pay for them. Yes, you can get AMD with 12 or more cores cheaper, but we all know those cores are weaker... and yes, software like 3ds Max will take them all, and more = better. But still, when you buy an enthusiast-grade PC, most of the time it's not for games or movies, but for pure power to use in calculations: BOINC, Bitcoin, 3D rendering, you name it. If it uses less power and is more powerful than the last one, then it's worth the money, since it can do more, faster, and cheaper. Yes, some of the cost will take a while to pay for itself (like a year), but it's a good approach. While Xeons are good CPUs, they are not better than the i7 or i5 series in GFLOPS in real-life computation; most Xeons were built for databases, websites, and server-based services. Previous generations of Xeons had more variants (X, E, and a few others), each designated for a specific purpose.

Haswell-E certainly is not a scam. More, faster, 2x cheaper (why 2x? Faster computations plus less energy used = two ways it's cheaper than the previous gen). We should thank the people who are going to buy it, since it will drive down the costs for the rest of us in the future.