Zen date announced

What's wrong with that? You can't blame Intel for trying to make a profit. They're a business; their job is to make money for the company, not to pour 100% of their revenue into R&D that isn't required for them to stay dominant.

It doesn't make sense for Intel to pour money into desktop R&D when there is little competition and no incentive for people to upgrade. It makes sense for Intel to put money into things like Project Alloy, because a success on one of those new projects means a new revenue stream for Intel.

This is true...

Because Intel doesn't

Some people, myself included, believe in progress for progress' sake. Being the market leader is good/fine/whatever from a business POV, but it's terrible for innovation.

I believe CPU tech has severely stagnated over the last 5 years due to the lack of competition. Intel doesn't have to try, so their customers should just pay them more year after year for not trying?

You need Intel to try just as much as the next person. New tech that performs better should drive down the cost of older tech; the bigger the performance gap, the greater the devaluation of the previous generation.
When one generation barely edges out the last, a manufacturer can just raise prices without cutting the price of the previous generation.
So consumer costs creep up generation after generation. Something truly revolutionary has to come along to shake this up, and Intel has no need to produce it.

I believe this has also stagnated software development. Why invest more time making software multithreaded if Intel is still shipping roughly the same number of cores?
Adobe Premiere Pro is a great example of this; it maxes out at around 8 cores.
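Just to put rough numbers on it (purely illustrative, not anything measured in Premiere), Amdahl's law shows why chasing more cores stops paying off once part of the workload stays serial:

```python
# Rough Amdahl's law illustration: speedup vs. core count for a workload
# that is only partly parallelizable. The 90% parallel figure is an
# assumption for illustration, not a measured number for any real app.
def speedup(parallel_fraction, cores):
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

for cores in (2, 4, 8, 16, 32):
    print(f"{cores:2d} cores -> {speedup(0.90, cores):.2f}x speedup")

# With 90% of the work parallel, 8 cores already give ~4.7x;
# doubling to 16 cores only gets you to ~6.4x.
```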

I suppose I'm just an old fart wishing for the "old days", when tech could improve performance exponentially with each generation while cost only increased linearly.

I don't think that if Intel were making even 10% IPC jumps generation over generation, many more people would be upgrading from CPUs like the 2700K or 3770K. For gaming, only a Titan X(P) or GTX 1080 is really going to be held back by an overclocked 2700K or 3770K; for most people, the CPU is not the limiting factor in their gaming build, which is why gamers and desktop users have no need to upgrade. People who just use Facebook or Twitter need no more than an i3 or a Pentium. It's only professionals who need more horsepower than an i7, and above the i7, Intel hasn't stopped making massive performance jumps generation over generation.

I mean, look at the Xeon 2680: every generation this SKU has been around, it has added two more cores over the previous generation.

The price has stayed relatively close generation over generation, but the performance has continued to improve as you move up in generations. If the trend continues as it has, with Skylake-E we'll be at a point where the core count has doubled in 5 generations, which is nothing to scoff at by any means. Over those 5 generations, this SKU has gone from a 1100-1150 score in Cinebench R15 to just over 2000.
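Back-of-the-envelope on those numbers (using the rough ~1100 to ~2000 Cinebench R15 figures above, so take it as a sketch):

```python
# Per-generation improvement implied by going from ~1100 to ~2000
# Cinebench R15 points over 5 generational steps (rough figures from above).
start, end, generations = 1100, 2000, 5
per_gen = (end / start) ** (1 / generations) - 1
print(f"~{per_gen * 100:.1f}% per generation, ~{(end / start - 1) * 100:.0f}% overall")
# -> roughly 12-13% per generation compounded over the 5 steps
```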

Intel has improved a lot more on the Xeon side when compared to the mainstream parts.

Mainstream chips have remained quad core; IPC has gone up, and so have prices.
Also, IPC gains depend heavily on what you're doing: for gaming, IPC hasn't changed drastically since the Sandy Bridge days, but for other tasks it has gone up significantly.

Which is a very linear improvement.

Feels like we're using the same points to argue both sides. :)

The tech industry is just becoming more like the auto industry, and to me that's sad. Recent CPU progress hasn't felt like change; it's felt like inertia. Contrast that with how quickly cell phones or flash storage have evolved. Things are getting better, but just barely.

The laws of physics are a pain sometimes. We're reaching the point where transistors are only a few dozen atoms wide. It's hard to make monumental improvements when you're working at that scale. Huge amounts of R&D go into maintaining evolutionary improvements at this point, until a totally different technology comes around.

Which is what I'm talking about.

Sadly, understanding and discoveries in quantum mechanics are coming along much more slowly than many had hoped. Lots of great things are happening there, but the sheer amount of energy required to do anything in that field makes it a non-starter until we get smarter. We're certainly improving at a fast rate (compared to the pace of Newtonian science), but it's still far slower than what we got used to once silicon became the standard in chip manufacturing.


I'm guessing you're talking about quantum computing. I know it's a lot better than what we have now, but why, and what are the differences?

A PhD, which I don't have, is required to adequately answer that question. But here's a dumbed-down version of an answer:

Very different kinds of algorithms apply to quantum processors. It's kinda like comparing algorithms for an analog computer to algorithms for a digital computer, or algorithms for a CPU to algorithms for a GPGPU. It's just a different way to look at it. But the two areas where it matters most are:
1) In theory, a quantum processor can perform a single calculation with multiple possible inputs simultaneously (Schrödinger's cat applied to algorithms).
2) Faster-than-light data transfer speeds (leveraging entangled pairs, which currently are consumables that take extraordinarily large amounts of energy to create).

There's a lot more to it than this but you can probably imagine how this can change a lot of things as these concepts, technology, and algorithms mature.
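If you want to poke at the "multiple inputs at once" idea from point 1, here's a toy state-vector sketch in plain numpy. To be clear, this just simulates the math on a normal PC; it's nothing like real quantum hardware.

```python
import numpy as np

# Toy state-vector simulation: put two qubits into superposition with
# Hadamard gates. The resulting state carries amplitudes for all four
# classical inputs (00, 01, 10, 11) at once, which is the "multiple
# possible inputs simultaneously" idea, dumbed way down.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
zero = np.array([1.0, 0.0])                    # the |0> state

state = np.kron(H @ zero, H @ zero)            # apply H to each qubit
probs = np.abs(state) ** 2                     # measurement probabilities

for bits, amp, p in zip(("00", "01", "10", "11"), state, probs):
    print(f"|{bits}>: amplitude {amp:+.3f}, probability {p:.2f}")
```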


Ah, cheers for the info. I've been hearing about them for a while but never seen much about what they actually are :P