Have we reached a generational hardware leap?

I’m writing this in hopes of getting some more insight into the future of hardware, and into what is currently available (maybe the numbers lie?). I’ve always enjoyed configuring potential builds, but lately I’ve felt nothing but frustration over the hardware on the market. Not only is there no logic in the price-to-performance ratio of current CPUs, but they also seem to scale in power in very strange increments - very minuscule or insignificant ones, at that.

For example, I’ve been looking at AMD’s FX-6300 hexa-core. It looks to be a very solid CPU for a mid-ATX build - something I’d personally like to fit into an NZXT Beta EVO, throw some fans on for more cooling capability, overclock, and have some fun with.

But the numbers just don’t seem right. According to PassMark, the 6300 is about 25% faster than the first-generation Nehalem entry model, the i7 920, a chip that came out a few years back. That is a pretty substantial performance boost, but it just doesn’t seem like enough at THIS point in time. I thought the hexa-core FX CPUs would have a little more jam. If the 6300 scored around 7,000 in PassMark at its stock clock, I’d absolutely fall in love with it and wouldn’t hesitate to build a system with it at the heart. I’d like some longevity and CPU headroom in a system in general - that way, if I had to, the system could handle more powerful cards with ease.

Now you might be thinking, “well, why not just go ahead and get one of the eight-core FX chips?” True, that is a solution. However, the only solid increase in performance would come from the 8320, which would leave me choosing between that and the 8350.

Which brings me back to what I said at the beginning: price-to-performance ratio. Logically, if someone were going for an extreme system, the 8350 would be the best choice overall. Take a look at its $200 price tag and compare it to the 3770K; it doesn’t really require much thought. The 3770K is what, $120 more than the 8350? That’s over 50% of the 8350’s price. In this circumstance, you ask yourself: is it GIVING ME 50% or more PERFORMANCE? Well... from what I’ve seen, it doesn’t even come close. Take Crysis as an example, a game the Intels are better optimized for: the 3770K only beats the AMD by a few frames per second. That’s it. Does that equate to 50% or more performance? Not at all; it doesn’t even scratch the surface.
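To put rough numbers on that, here’s a quick back-of-the-envelope check in Python. The prices and PassMark scores are ballpark 2013 figures I’m assuming for illustration, not authoritative benchmarks:

```python
# Rough price-to-performance check. Prices (USD) and PassMark scores are
# assumed ballpark 2013 figures, not authoritative numbers.
chips = {
    "FX-8350":  {"price": 200, "passmark": 9100},
    "i7-3770K": {"price": 320, "passmark": 9600},
}

amd, intel = chips["FX-8350"], chips["i7-3770K"]

extra_cost = (intel["price"] - amd["price"]) / amd["price"]
extra_perf = (intel["passmark"] - amd["passmark"]) / amd["passmark"]

print(f"3770K costs {extra_cost:.0%} more")     # ~60% more money
print(f"3770K scores {extra_perf:.0%} higher")  # ~5% more performance
for name, c in chips.items():
    print(f"{name}: {c['passmark'] / c['price']:.1f} PassMark points per dollar")
```

With numbers anywhere in that neighbourhood, you’re paying roughly 60% more money for single-digit percent more benchmark score, which is the whole point.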

I guess what I’m trying to say, on the subject of price and performance and in general, is that we no longer see the incremental, noticeable, logical increases in performance we used to. For some reason... it’s as if we have encountered a very docile, stealthy anomaly in the world of computing hardware. This started, I’d say, around the end of 2011. The early-to-mid 2000s saw a natural progression in computer hardware. Never have I been so confused, lost, disappointed, and just plain uninterested in almost every piece of hardware on the market now, especially when it comes to central processing units. Honestly... the only appealing thing is something I’m not sure I even really need or want right now; the time just doesn’t seem right for me. But I really do think that building an all-out, extreme 8350 Vishera machine right now is a solid decision for anyone building a system.

Please tell me AMD is going to do something great later this year? Perhaps introducing more mid-range hexa-cores that are a bit more powerful than the current ones, at similar prices? I hope AMD keeps developing their APU tech, with full modularity between dedicated GPUs and APU GPU cores. I’d ask the same of Intel... but when the hell have they offered something that performs and carries a decent price? I’ll go right ahead and say I favour neither. I have better things to do than start a fanboy war; I’ve already dealt with plenty of gamers arguing about which consoles (SMH) are better. I’m simply saying it how it is.

If some of you are still scratching your heads, wondering why I’ve titled the thread the way I did: basically, the only way I can perceive this is as an anomaly, a natural anomaly preceding a generational leap in hardware - an evolutionary step of some sort in which progress has to slow to a crawl before the ridiculous amounts of usable power we will have in the near future. Perhaps Intel will start this with Haswell? I have no idea... In short, all I’ll say is: hardware just isn’t making sense to me anymore. Browsing and researching computer hardware is not the same, for me at least. It’s very, very tiring.

Anyway. Those are my dilemmas, that’s what’s been killing me, and I hope someone has read this to the end and can clear my mind, because I am not liking what the computer market has to offer today.

The rumor mill has AMD promising a 5GHz stock clock. No info on core count or anything else.

Just 5GHz and an $800 price tag.

The Intels are more optimized for Crysis? Take a look here :)

http://uk.gamespot.com/forums/topic/29356784/crysis-3-is-an-example-of-how-well-amd-cpus-perform-with-well-optimized-engines

Another key factor is the software. The problem is that the hardware is far ahead of the software these days. You get a hex-core, but the software is only using 2-4 cores. The generational leap we are waiting for is a software generation. The only time we usually see 100% CPU usage is when first loading an area; the rest of the time the CPU sits comfortably at about 30-60% usage, depending on the chip. My i5-3570K sits at 37% usage when playing games. Skyrim gets a bit higher, around 45%, when it’s fully modded.
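Here’s a rough Python sketch of why that happens; the core counts and the parallel fraction are assumptions for illustration, not measurements:

```python
# If a game only keeps `used_cores` of the CPU's `total_cores` busy, overall
# utilization is capped around used/total, matching the 30-60% readings above.
def max_utilization(used_cores: int, total_cores: int) -> float:
    return used_cores / total_cores

# Amdahl's law: overall speedup when only a fraction p of the work
# can be parallelized across n cores.
def amdahl_speedup(p: float, n_cores: int) -> float:
    return 1.0 / ((1.0 - p) + p / n_cores)

print(f"{max_utilization(2, 6):.0%}")    # ~33% on a hex-core running 2 threads
print(f"{amdahl_speedup(0.5, 8):.2f}x")  # ~1.78x: 8 cores barely help half-serial code
```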

A great example of this is Crysis 1 versus Crysis 2. At first glance, it looks like Crysis 1 is more demanding on a system, but Crysis 2 actually uses more graphical features and newer, more demanding technologies; the software is just so much better optimized that it ends up using fewer resources than the first Crysis.

Now, as far as price goes, I agree: things are screwy. I think it is AMD and Intel creating products and nudging out more performance simply to keep people buying. If they just came out and said, “if you are a gamer, just get the mid-range i5 and don’t overclock it, because you will get the same FPS in your games as with an i7,” then people would only buy the cheaper product. Instead they create lots of products, set a price scale, and just shut up. This gets people talking, people get competitive, and people buy the best CPU they can afford regardless of their actual needs.

Then AMD and Intel make their money while we are all waiting for software to catch up to use all the potential.

Sadly, you are right. There just aren’t any major leaps in performance, which most likely results from the lack of competition. If AMD posed any kind of threat to Intel, they’d be increasing the power of their chips by a lot! But right now AMD is no worry for Intel, and Intel would rather focus on low-power CPUs for the growing ultrabook market, since it’s much more profitable.

(sry for weird typings n stuff, hate tablet typing)

There are several technology-limiting factors at play in the x86 processor field these days:

1. Windows is the most-used platform on consumer x86 processors, and Windows reached its peak in terms of required system performance with Windows Vista. Vista was so heavy and full-featured that the computers on the market were not ready for it. They would be now, though: run Vista on a brand-new computer and you’ll like it for sure, but that’s another story. Basically, since Windows 7, MS has been cutting features to make Windoze lighter and thus faster; Windows 8 has even fewer features and is therefore again a bit faster, but that’s regression, not improvement. The result is that the hardware requirements for Windows have actually been dropping since Vista, so people don’t upgrade their platforms as rapidly. Also, except in North America, Windows is losing ground fast to GNU/Linux. Microsoft says that a lot of Chinese users run Windows, but how many of those are legal copies? The fact is that almost all branded systems sold in the Far East ship with Linpus, a GNU/Linux distro, not with Windoze, and the Chinese government, for instance, has a deal with Red Hat (the American open-source software giant) for a customised official Chinese version of Red Hat called Red Flag Linux. That’s a lot of quality to get for free, so why would anyone spend money on Windoze, right? In Europe, it’s the same story: the former Eastern Bloc countries never experienced Microsoft’s golden era of the eighties, so most run GNU/Linux, partly because they don’t all have huge budgets and would rather spend on tangible things that “do something”, like hardware, than on some legacy operating system. And torrents are not really a problem in Europe or Asia, so everyone just pulls a cracked Windows off a torrent to play equally cracked (or even bought) games with. Most games in Europe are cheaper on DVD than they are on Steam or online, because online prices are US prices; frankly, the game distributors would not sell a great deal of games if they only sold at those prices outside of the US. In Germany, for instance, games are expensive because people have more disposable income, but in Belgium or France games are much, much cheaper, because otherwise they wouldn’t sell any.

2. The same principle goes for hardware: because Windows doesn’t require a hardware upgrade in Windows countries, and GNU/Linux certainly doesn’t require one, people just don’t buy hardware upgrades. This results in a situation whereby hardware prices have never been lower, because the market is so tough. No one really needs the upgrade, except video renderers and gamers who want to play the few titles that cannot run at max settings on 5-6 year old hardware (there are not a lot of games like that; most run just fine on an Intel Core 2 Duo E8400 with an nVidia 9800GT 512MB). At the same time, people want portability, and the popularity of RISC platforms running some Linux-based operating system has never been bigger. There has never been a CISC platform selling 1.5 million units per day the way Linux-based RISC computing platforms are doing now. That distorts the market big time, or rather, it opens a new perspective: at the moment, you can buy a full-featured x86 CISC laptop for less than 300 USD, whereas a Galaxy S3 will cost you twice as much. Yet the laptop is much more expensive to make: it has 9 Li-ion cells, the S3 only a single small one; it has a large screen, the S3 a small one; it has a lot of internal memory, the S3 doesn’t; it has an expensive, large-silicon processor, while the S3’s CPU and GPU assembly costs less than 11 USD to produce; it has a keyboard, an HDD, a power supply, and so on, none of which the S3 needs. And the S3 has built-in GPS, a bunch of cameras, and a touch screen, which the laptop doesn’t. Linux makes it possible to have a faster, more fluid user experience on the S3 than you would have on the laptop running Windoze. So obviously, Intel and AMD are questioning their CISC CPU production. Supercomputing saw a revival of CISC CPUs with the AMD Opterons because they were cheap, but that fashion seems to be over too, because of obvious energy constraints and price pressure: the energy needed to run those CPUs actually costs more than the CPUs themselves, so large computing facilities are switching to RISC arrays to save energy. And guess what: you need to buy more of them to get the same performance, but those RISC processors are becoming cheaper as production reaches enormous quantities. So again, Intel and AMD are questioning the huge costs involved in developing new CISC processors, because we’re getting to a situation where the only reason to bother is Windoze PCs, which only seem to be a necessity for the conservative consumer market in North America.
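That energy claim is easy to sanity-check with arithmetic. A minimal Python sketch, where the wattage, electricity price, duty cycle, and CPU price are all assumptions for illustration rather than real datacenter figures:

```python
# Can the electricity to run an always-on CPU outcost the CPU itself?
# All figures below are illustrative assumptions, not quoted prices.
def lifetime_energy_cost(watts: float, years: float, usd_per_kwh: float) -> float:
    kwh = watts / 1000 * 24 * 365 * years  # running 24/7
    return kwh * usd_per_kwh

cpu_price = 250  # assumed price of a server-class CPU, USD
energy = lifetime_energy_cost(watts=115, years=4, usd_per_kwh=0.12)
print(f"~${energy:.0f} in electricity over 4 years vs ${cpu_price} for the chip")
# -> roughly $480: in an always-on array, energy really can cost more than the CPU
```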

I'm involved in high-performance computing, and frankly, when I see the price of a RISC-based high-performance rig, it's comparable to the price of a full-blown CISC-based consumer rig, and for that price you're getting a lot more hardware quality and performance (at least 10 times the performance of an i7-3970 rig). Of course, that's still a long way from what the average consumer wants to pay for a system, but for hardcore gamers it would fit the budget. So once games are no longer locked to the Windoze platform, which is happening right now, because frankly, making products only for the conservative, brand-infested North American market is just not viable any longer, the preferred technology will switch to RISC very quickly. Now, there is no doubt in my mind that consumer-preferred technology under the influence of Windoze will stagnate further in North America; it is a very conservative market where brands are more important than specifications, so Intel and AMD will probably keep making x86 CISC CPUs for quite some time, concentrating on producing them more cheaply, with smaller dies and less silicon, at higher prices. And that is happening already. What game would not run perfectly well, even Crysis 3, on a 5-year-old Intel quad-core or even Core 2 Extreme, paired with a more modern GPU? Since Vista, we've been at a point where CPU performance doesn't matter all that much anymore on a Windoze system.

Now, Android is a Linux pioneer project; it will eventually phase out like iOS will, but then Google hasn't invested quite the same amount, because they've pulled a lot of assets from the community and got them for a really low price in comparison to closed-source systems. In the end, the Linux kernel will be used for other proprietary systems, and soon RISC devices will be running full-blown GNU/Linux distros. That's already possible; I know, because I run my tablet with Debian ARM (in dual boot with CyanogenMod Android for the time being) and it works just fine. A lot of people in Europe and Asia have working GNU/Linux distros for phones that simply cannot be deployed because of ISPs and the commercial arrangements of communications companies, which are often owned or co-owned by North American communications cartels that lock down the platform. But they will realise where the money is in the near future too. In fact, it's started already: the EU, for instance, has put a legal cap on how much communications cartels can charge for mobile data and data roaming, and that will only become stricter, which really helps normal competition. Prices for data plans in the EU are dropping constantly, especially since ever more users switch to prepaid SIMs, which gives them tariff freedom; in the EU, a communications company cannot hold your telephone number for ransom - if you switch providers, you take your number with you. That leads to a situation whereby people buy their devices instead of leasing them with a subscription, which puts pressure on device prices and changes the range of devices on offer. For instance, the sub-150 EUR Android phone category is well represented everywhere but in North America. Phones originally intended for China and India are now best-sellers in continental Europe, like the Huawei Y300. Nokia sells more Series 40 Asha phones in continental Europe (originally intended for the Far East, South-East Asia, and other developing markets) than they sell Windoze phones. The Nokia 900-series phones are being dumped for less than 200 EUR through Aldi stores (a German chain of bottom-price food stores that is very popular in continental Europe), without any contract, branding, odexing, or other lockdown - and no, there is no trace of such dumping in European online stores, where it could be seen by North American or UK consumers who still pay over 500 USD for the same phone, locked down and with a cutthroat contract. I know this may seem unbelievable to North American consumers, but there is a reason why Nokia has problems and Huawei does not. Japan is also very commercial, with an economy that is to a large extent a clone of the North American one, and look at what's happening there: Sharp has taken Samsung on as a shareholder to survive. Sharp makes the screens for Apple products, which have been doing very badly lately, so Sharp is losing heaps of money, and the only thing they can still sell are small screens for low-budget devices - not retina displays, not super-high pixel density, but small screens that just about anyone can make. Of course, such a small display costs Sharp much more to produce than it costs some mainland-China factory, so what will Sharp do? They will incorporate their higher display technology into the small displays and try to sell them as premium products for a low price, because they have the technology to make these high-pixel-density screens and the Chinese don't yet. So in the near future, Samsung will be bringing out 150 USD Android phones with retina-class displays - not in North America, but everywhere else in the world. That's just how it goes.

IBM saw this coming back in the nineties. They learned the most valuable lesson in the eighties, when they paid Microsoft a lot of money just to be undercut and lose the entire PC market; they learned then that it doesn't pay to try to lock a market down. So when Linux came along in 1991, IBM jumped on it. They sold their legacy OS/2 Warp to Microsoft, who were looking for a system to replace the 20-year-old MS-DOS at the time; Microsoft made NT from that and is still using it as the base for Windows, and that's now a 20-year-old platform itself, just as MS-DOS made it through 20 years before it was definitively used up. IBM then continued with RISC and Linux/Unix, which Apple picked up, because the RISC-based PowerPC chip was cheap and efficient and the Unix-based platform was ready, cheap, and performing well. Apple started a new interface revolution with it, but was not strong enough to fight the market lockdown Microsoft had firmly in place at the time, and IBM didn't want to stand on the barricades anymore. So IBM evolved further with ever-better technology - one look at Watson and you know they know how to make a computer - but preferred to stay in the shadows. They are among the largest contributors to Linux, together with Intel, Red Hat, and Novell, and they are doing well. They didn't go with Windows like HP/Compaq/Nokia did, and they sold off Lenovo, which was largely Windows-dependent, so they didn't have to sack thousands of people in the last couple of years or change their CEO a thousand times, etc. Intel and AMD are now under extreme stress, because they see Broadcom, Atmel, Marvell, and nVidia making these cheap RISC chips that are doing incredibly well, and they don't have anything like that yet. Intel is jumping on the peripherals market like a lion and throwing out its support products for the CISC platform, like its motherboards. Intel and AMD are now pulling the plug on development of their existing CISC-based chips to jump on RISC-based processors, and they will succeed in the end. They will not be on top of the heap at first, but with all the experience they have, especially Intel, they will find a technological edge sooner or later that will differentiate them and push the technological advancement of RISC CPUs into a maelstrom, leading to incredible performance increases in a short period of time.

There are a lot more factors than these, but in any case, everything points in the same direction, and that direction is away from the CISC platform, away from Microsoft and other consumer-leeching cartels, towards a more natural situation where technological advancement comes first. I think that three years from now, we'll be rocking a RISC platform with the latest Steam games and will have our legacy CISC desktop in the closet, to pull out when we feel like playing an old game. So in any case, I would not invest in expensive hardware anymore until about the end of 2014, when the shift will happen - except in North America, where consumer-market technology will stagnate until at least 2017 if Microsoft doesn't shift to a subscription model for Windoze, or until 2015 if it does.

Anyway, that's what I think. I could be wrong, but I hope I'm not, because I've already invested in a modular, scalable array system for my high-performance computing needs, and I would love to be able to run games on it natively lol.

You sir win the award for longest post ever! While you have valid points, most will never see the light of day due to the extremity of your wordiness. Regardless, I salute you!

But I agree. I think it's a software problem, and you brought up a good point about the OS itself being a market driver. My question for you: do you think Vista is better feature-wise than Windows 7? I was always under the impression that a lot of the features were the same.

@Nick_Fury

Yep. Heard that. Apparently it’s called the “Centurion”, and it’s supposedly dirty... If the stock clock is 5GHz, I wouldn’t be surprised if we were able to hit the mid-6GHz range without any problems... possibly just shy of 7GHz? Maybe 7GHz is crazy... but... who knows. Remember those AMD TWKR chips a few years back? Good God... AMD have really shown over the years that their CPUs have great overclocking ability. The Phenom II X4s really showed that, too.

The FX 6300 can apparently hit 5GHz, stable, on air cooling. The price point of Centurion seems extreme, but all of that depends on what they have to offer. Unfortunately, with CPUs there’s no surefire way of understanding exactly what a chip is capable of unless you study its inner workings or witness actual benchmarks. The only thing we really have to go by is the PassMark score; it gives me a reference point and a general understanding of what a CPU is capable of. So... if the Centurion reaches something well over a 10,000 PassMark (dare I say... comes close to server-processor territory? 20,000?), then AMD have done something absolutely insane. Also, btw, when I heard its codename, Centurion, they said it would be an octo-core. Unless THAT’s not true... and AMD go apeshit, giving us... a ten- or twelve-core CPU? However... 8 is quite extreme already, and already not fully exploited in terms of memory management, if you ask me. So, there you have it. That’s what I heard.

@Cooperman

Ah, I came across that not too long ago, and I was astonished by the results. However, they’re quite questionable. You’ll naturally want to ask how the hell you’d otherwise see the OPPOSITE in other tests. I remember the Crysis 2 (maybe 3 as well) test that Logan did here on Tek Syndicate. The Intels DID push forward, past the AMDs.

Is that some sort of driver update or something? Some sort of update for the game? No matter... if it’s legitimate, that’s very, very impressive. And that’s without an overclock. Add some nice case cooling and CPU cooling, and you’re looking at some very nice, reliable, rock-steady overclocks. Vishera really is the system to be building right now... Someone else will be paying over a grand for that Intel processor, and it’ll be beaten out by Intel’s next top-tier quad-core i7... lmao.

That FURTHER proves my point.

It’s up there with the 3970X?! Isn’t that thing... ummm... about a thousand dollars? Once you’ve got close to a 10,000 PassMark, that’s basically future-proofed CPU power right there. And the 3970X only has a couple thousand more PassMark points than the Vishera... No thank you, Intel.

@feralshad0w

You’re definitely right. It’s something I’ve noticed for quite some time now, ever since I realized how consoles have STALLED that progression in software. Now, more than ever before, I believe it is the best time to be a PC gamer. Of course... while waiting a little bit for the FREAKIN’ HARDWARE to get in check... But yeah. Both new consoles are now x86-based. AND YOU KNOW WHAT THAT MEANS, BOYS?! WHOOPTY-ASS CONSOLE PORTS... are a thing of the past. NO LONGER will we have to see our predominant, definitive versions of games (PC multiplats) get degraded to diminish the gap between PC and console. We have literally been experiencing a SOFTWARE bottleneck for many years now, and I believe that wait is OVER. Marketing also plays a huge role in PC multiplats differing so much. You know what I say?! Advertise both versions of games on SIMILAR settings. That way, folks (the console market) won’t be so bummed out when they see the PC version of the game they’re playing running on full settings. I love how Crysis’ graphics settings consisted of: Flaccid Dick, Low, Medium, High, Very High, Ultra High, and BBC. Speaking of which... I came across a video of a system running Crysis maxed, on crossfired 7950s... running at 70FPS. That’s just wonky as hell.

I’m not criticizing the original Crysis, just pointing out that the scaling in performance is ridiculous on the software side. BUT... Crysis is still a showcase of PC computing power, and of what the platform has to offer gamers.

@Scia

Yep, as I said to Nick_Fury before, I believe AMD’s new Centurion chip will do JUST that: give us a shocking leap in performance. And what wouldn’t surprise me? Intel responding with a chip that offers not even 10% more performance, for double or MORE the price of the Centurion. Intel’s old ILLUSION marketing. I should get a job at Intel; I think I’d do great brainwashing folks. :D

YAY, INTEL!

@Zoltan

Man, that’s a lot of text. But... it’s quite crazy. I’m reading through everyone’s replies, and I’m basically seeing the same idea applied to different things: usable power becoming overwhelming, and a stalled progression in both software and hardware. And DAMN... what’s even CRAZIER is that you’ve LITERALLY said exactly what I’ve been thinking in regard to timing. I believe 2014 will be a very, very special time. The question is... could I wait that long? :P

As for RISC and CISC processors: I suppose that COULD be the one and only permanent compromise we’ll ever experience in the history of computing - phasing out CISC for RISC processing. That is, before we ditch electronic computers and switch to optical computing (my prediction: this will happen after Intel’s 4nm CPU platforms come to a close... sooooooo, 2025 or so), rendering EVERYTHING that came before COMPLETELY INCOMPATIBLE.

And thanks for the awesome replies, folks!

Can someone dumb this down for me? From what I understood, the rest of the world is moving to Linux for numerous reasons (cheaper, open-source, etc.), and we could eventually be using PCs that use light instead of electricity? And the rest was about the economics of arriving at these technological breakthroughs, from what I gathered.

@jon666

Basically, I'm talking about a very subtle yet noticeable anomaly that has crept up on us in the computer world. The progression of hardware power has dwindled and become very sporadic and disorganized. Usable power in computer systems has reached a very, very high level (people are still using Nehalem i7s and not giving two shits about the hardware that has come out in the last couple of years).

Naturally, software has also been very slow and wonky in responding to all this usable power. It could perhaps be a purposeful, premeditated decision conceived by the big players in the computer industry. Not to sound like a conspiracy theorist or anything :D. But it could very well be possible.

So, yeah. Down to the basic basics. Prices don't make ANY sense anymore for what WE'RE GETTING when we purchase hardware. And the SCALING in power doesn't make any sense either. Not one bit.

As for optical computing: that most definitely WILL happen in the future, but not for AT LEAST the next couple of DECADES, if not more. And there is a very huge downside to the leap to optical computing.

Absolutely ZERO compatibility with everything we have so far. All our precious games... music... photos. Everything....

In the computer abyss.

R.I.P. Oblivion and Fallout 3 game saves.

I might be wrong here, but I'm pretty sure the biggest improvement in the x86 platform over the last few years has been power management. The chips we have today are way more energy efficient than the chips we had 4-6 years ago. I guess that is what they are selling, though. I would personally much prefer a more powerful CPU over a power-efficient, less powerful one. But when you think about the other improvements they have put into the x86 platform that aren't just more Hz, you kind of see that the focus has been completely different from a few years ago. But again, I might be wrong. Not defending the ridiculous prices though, especially Intel's; those CPUs are way too expensive.
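To put rough numbers on that shift, here's a small Python comparison of performance per watt; the PassMark scores and TDPs are ballpark assumptions (the i7 920 is the same chip mentioned earlier in the thread), not official figures:

```python
# Performance per watt: assumed ballpark PassMark scores and TDPs,
# just to illustrate where the recent improvement actually went.
chips = {
    "i7 920 (2008)":   {"passmark": 5000, "tdp_w": 130},
    "i5-3570K (2012)": {"passmark": 7100, "tdp_w": 77},
}
for name, c in chips.items():
    print(f"{name}: {c['passmark'] / c['tdp_w']:.0f} PassMark points per watt")
# ~38 vs ~92: efficiency roughly doubled while the raw score grew far less.
```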

It's down to the game using all 8 cores/threads. I expect more and more games to do this soon, with the 'next-gen' consoles using 8-core CPUs, so AMD vs. Intel gaming is going to get interesting over the next 12 months.

Basically there are different things happening.

1) The current software is not taking full advantage of the current hardware, so there isn't much drive for companies to invest in much more advanced hardware for the average consumer (see the sketch after this list).

ie: Most software won't utilize 4+ cores of a processor.

2) The current hardware available to the average consumer is technologically inferior to what is being developed for niche markets, due to compatibility, economics, and marketing.

ie: Having to replace industry-standard connectors, sizes, and other hardware, AND having to completely redesign software/firmware for new hardware technologies. You can't install Windows on your car's computer.

3) Future technologies such as optical computing, quantum computing, and different digital technologies are inevitable and are slowly coming through to something tangible within the foreseeable future (not all of the technologies I listed, though).

ie: Revolutionary new technologies are coming up, and we can see them being useful on the horizon - the kind that are going to change everything - but we don't know who will develop them first or how they will be controlled/freed for public use. They aren't in the near future, but they are coming, and there are lots of guesses as to how they will change the industry and the world.
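On point 1, here's a minimal Python sketch of why extra cores often sit idle: a plain loop keeps one core busy no matter how many cores exist, and spreading the work takes explicit effort from the developer. The workload is a made-up stand-in, not real game or application code:

```python
# A plain loop runs on one core; engaging the others takes explicit work.
import multiprocessing as mp

def heavy(n: int) -> int:
    # Made-up CPU-bound workload standing in for real application code.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [2_000_000] * 8

    # Serial: one core busy, the rest idle - how most software behaves today.
    serial = [heavy(n) for n in jobs]

    # Parallel: the same work spread across every available core.
    with mp.Pool(processes=mp.cpu_count()) as pool:
        parallel = pool.map(heavy, jobs)

    assert serial == parallel
```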

Don't forget the price-to-performance ratios and, ultimately, what I'm trying to say: the scaling in power has been extremely disorganized. The progression of hardware has been quite messy these past couple of years. Really, that's the best way to put it.

@kriss120

Yeah. Just about the only noticeable thing: power management. But the cost-to-performance situation is just craziness. When you take a look at a Phenom II X4 965 and compare it to an FX 6300, you kind of scratch your head and realize the latter is the obvious choice for the money. But proportionate to how old it is... the pricing makes no sense. You'd think the old Phenom would be cheaper at its age, and the 6300 would be a bit more powerful than it is.

Unless... of course... the big players know full well exactly how much usable power we have, and how long it's meant to last. More than we'll ever know.

Just an opinion, but power management is overrated; look at how hardy the 2600K was for overclocking.

The smaller nanometer process demanded better power efficiency, so it's a mandate, not a feature.

The higher transistor count is the feature, but Intel screwed the pooch on heat management. Let's hope they get it right with Haswell. Then let's talk about processing power left on the table: add a graphics card for gaming and you leave the DirectCompute features of your 3770K sitting idle. How much HD 4000 processing power is wasted? Remember, ten first-gen PS3s running Linux became a supercomputer, so the Pentagon bought 200. Want to run an invisible botnet? Write a hack that only addresses the Intel HD graphics on PCs using discrete graphics cards. Who'd notice?

I see AMD making a big push in the CPU business. Obviously there are very few legitimate rumors about Steamroller (or even Haswell, at the time of writing), but from what I'm seeing in engines optimized for multiple cores, AMD is rolling out ahead. It will be very interesting to see whether the 8-core Steamroller competes more with the 4570K or the 4770K, or if it blows the doors off both. Either way, I think AMD is going to close the gap as software catches up to their architecture. It will definitely be interesting. I'd recommend saving money for upgrades either way.

AHHHHH, these are super long posts. I didn't bother reading them, so don't get angry if I say something that someone already did.

I was actually going to do an FX-8350 build this past December, but I held off, anticipating something new from AMD or Intel. With Haswell coming up soon, I'm going to pay careful attention to the performance of those chips. When I do get around to my build, I'd ideally like quality parts where I know I'm getting the best bang for my buck, and honestly, right now, even with the AMD GPU deals, I don't feel like it's the right time to upgrade. I feel as if something awesome is coming.

Hopefully things ramp up toward the end of the year and we see some really interesting stuff. If you read up to this point, I commend you, considering what I said in the first sentence of this post.

Well... any possibility of AMD rolling out new mobos this year? Apparently the 9000 series is coming out this year, and they're supposed to be NASTY.

It's just a complete paradigm shift. If anything, the last decade or so has taught us that Moore's law is an economic law, not a natural one. The real money in the technology sector has been poured into smartphones and tablets, with consumers paying between $500 and $800 every two years or so for a phone (even if that cost is hidden behind a subsidy, it still exists), and even more for tablets, and this has shown in the extreme revolution of those chips and form factors. We went from barely "usable" single-core systems to near-laptop-parity quad-core chips with 1080p screens in about 5 years. That's pretty insane when you think about it, and I think it opened a bit of a Pandora's box with regard to the changing compute environment.

Couple that with the primary driver of the gaming market being consoles, which have remained static for about 7-8 years (along with the prospect of diminishing returns, which will simply convince a lot of people that current graphical systems and the technology behind them are "good enough" and not worth their money), and the economic drivers for a steady progression in desktop x86 systems simply have not been there. Add the trend that x86 and CISC may be getting a bit long in the tooth and lack a lot of the advantages RISC has, and it's no wonder the progression has slowed.

Furthermore, I think the entire landscape will change to emphasize a dynamic compute environment in the coming years. Think of your "phone" acting as your primary compute device; when it suits you to work in a more traditional way (keyboard/mouse/monitor), you use a "dumb" terminal with no large computing power of its own. The real promise of things like the nVidia game-streaming servers is that they can scale to different uses. If I can run a server on my intranet, in the case of a company, and divide my resources based on who needs them (think CAD stations, rendering artists, and such), I can avoid a lot of the cost of merely having the ability to do those things (like buying a professional graphics card for a few grand when only 15% of my work may require it). I can scale my compute needs better, and I get away from the idea that I have to work from a specific space.

Basically, I think the traditional landscape of large, powerful compute machines is going to be replaced by a mix of small-scale devices and large server-based resources that will let me use compute in much more dynamic ways - something that has already been shown to be preferable to large portions of the population (think of how many people you know who have invested in a smartphone or tablet versus a laptop, let alone a desktop).

AMD is not about per-core performance, since their very affordable flagship atm has 8 cores.

The PS4 and the new Xbox also have 8 cores. The whole gaming world will soon be focused on 8 cores.

Also, their APUs have much better graphical power than Intel's counterparts in the same price range; look at Temash, Kabini, and Richland. So Intel's ultrabook territory isn't out of firing range either.

Don't count AMD out yet. Intel most definitely has something very big to worry about.