Is AMD going to die?

Tides turn. Who can say AMD won't be on top in five or ten years? I believe one of the problems is people getting a bad vibe from reviews and reviewers. I'm pretty convinced the majority of people using computers don't need more power than a half-decent AMD CPU from their current lineup can offer. But there's this notion of wanting the absolute best, and of listening to reviewers who sometimes (I think so, anyway) forget that there are people out there who don't need high-end CPUs. Obviously it's always the consumer's choice; I'm just saying that people who aren't particularly into hardware might be misled into thinking Intel is the only viable option. This is why my next PC will be AMD powered, throwing some love their way.

You know, there are some technologies which might shake things up.

Think of optical processors (photonics), spintronics (using the spin of electrons to carry extra information), or maybe plasmonics (coupling photons and electrons together, which could carry quantum information in ways previously unavailable, without losing it as easily to interference or instability).

Personally, I hope we opt for plasmonics. That would be awesome; spintronics sounds awesome too, but it has its limitations.

I'd also be interested in seeing whether AMD could create PCIe over optical cabling with copper for power (like Thunderbolt, minus the DisplayPort requirement), maybe PCIe over optical interfaces (lasers instead of copper contacts, or fiber carrying 100Gbps per connection using a slotted design), or maybe a USB-like interface using optical cabling internally that could replace SATA, SAS, and USB all at once to maximize flexibility, with enough bandwidth for all of them. =)

I mean, we're reaching copper's limits, and graphene cabling might not get here in time. We've got optical, so why not use it?

I don't think optical integrated circuits are going to make it to the mass market unless a new player other than AMD or Intel comes along. Reason: most patents that have to do with optical technology are non-US.

I also don't think there will be a full switch to non-metal litho designs soon, because before investing in it, the patent strategy has to be constructed in order to prevent open-source designs. That was Intel's problem in the 80286/80386 era: they thought nobody could copy them, so they insufficiently protected their designs, but along came AMD, and the rest is history. They are not going to make that mistake again.

So I think the first applications of pico-lithography we'll see will be in parts of CPUs and GPUs, like FPU cores. I think Intel and nVidia (whether or not they're the same company by then) will introduce it first and try to block AMD, a years-long legal battle will follow, and other manufacturers will grab that opportunity to enter the market with non-x86 high-performance designs based on very cheap, ultra-small metal lithos with new internal cooling and materials that reduce quantum side effects, which will again slow down the development of graphene processing designs. Also, don't forget that IBM already has a lot of patents and research into pico-designs; they might make a non-x86 design that, running QEMU or another virtualization technology, emulates an x86 CPU faster than any native x86 CPU can ever run.

Also, a route that's highly likely to become very important is bio-organic designs, which are not easy to patent and may be cheap to produce. Those might become prevalent well before graphene designs ever make it to the market.

There are still many problems with graphene: in theory it still requires a higher switching voltage than bio-organic designs, so it's going to be harder to work around quantum side effects. A transistor is still a transistor. Intel is still focusing on making smaller lithos with copper, and they're going to have to solve serious problems they've already run into with Haswell. As it turns out, Haswell chip production is nowhere near consistent: the specs of equally binned CPUs (and Intel already added a butt-ton of SKUs to sell them at the highest possible price) vary a lot, and many people who bought an expensive Haswell product are finding they didn't get their money's worth.

So yes, it's definitely going to be interesting to see what happens next. My bets are on open-source many-core designs that are so fast they can emulate any instruction set faster than the respective native CPUs run it. I think that in the long run, that's the only really viable option if the US wants to stay on top of the world market in terms of microprocessor technology.

Wow. Awesome stuff, Zoltan.

I think I'll disagree with you on bio-organic designs. We've barely begun to create our first life forms, so how could we even begin to understand how to design a brain or a processor using DNA? If we understood our own brains, then we might be able to create something like an organic CPU in a lab, one that might even be able to assemble itself.

However, we'd still need other things. If it's a bio-organic processor, it'll need sugar, proteins, and whatnot, so it would need something to act as a "body" of sorts, and it would also need a waste disposal system. Since photosynthesis could be used, though, we could just add an LED into a photosynthetic design to recycle waste, or use modified bacteria that (with the appropriate electrical charges) could convert the acids and waste produced by the cells back into usable fuel.

It'll be several decades (I wouldn't hold my breath for 2050, even!) before we get to bio-organic processors. And we'll have a lot of work to do, both on moral grounds and in terms of legal regulation of such technology.

Although some interesting bio-organic processing discoveries have been made, we're still a long way from a working product that makes commercial or enterprise-level sense.

Computers with rat brain cells have been shown to work, yes. But we haven't yet engineered the DNA of an organism to be a computer, nor have we figured out a way to get enough processing power out of one to make it economically worthwhile. We also don't fully understand the "programming language of the brain" yet, how exactly memories are stored, and so on.

There's still too much to figure out before that.

As for graphene, when you talk about it you refer to the lack of a bandgap, which is necessary for transistor behavior (0 and 1, on or off). But why do we need a bandgap? Couldn't we just make a more analog-like technology and instruction set instead? Maybe make an analog-compatible RISC-like technology based on Graphene, Open-Source design.

That would be epic!

Well, it's not as simple as moving code straight from the Xbox One to PC x86. Mantle may help, because it's "console-like", but it isn't a single API to rule them all. Mantle is for PCs, that's it; not consoles.

The API isn't the same, but it's more similar, so the learning curve is less severe. And since the architecture is similar, AMD's GPUs get a lot of performance help from ported versions of console games. =)

Hopefully all these GameDevs learning multi-core optimizations might make Intel rethink their "4Core4EVER!" philosophy of customer backstabbery. (Is that a real word? Well, if it wasn't before, then now it is... *Goes to Wiki-based dictionary to start trolling...* )

If games end up working better on several cores, it might be a big boost for AMD, and Intel might have to catch up. =) But I'd love to see AMD come up with a better chipset and better single-threaded performance (even if it's just a 15 to 30% single-threaded improvement). I think 8 cores is good, and 12 is more than plenty. 4GHz seems like enough, if we can get more single-threaded performance out of these CPUs.
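Just to show the kind of thing I mean by games "working better on several cores" (a toy sketch I made up, nowhere near a real engine): split the per-frame entity update across however many hardware threads the CPU reports, so an 8-core chip actually gets used.

```cpp
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

struct Entity { float x = 0.0f, vx = 1.0f; };

// Update one slice [begin, end) of the entity list; each worker gets its own
// slice, so no locking is needed for this simple pass.
static void update_slice(std::vector<Entity>& entities, std::size_t begin,
                         std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i)
        entities[i].x += entities[i].vx * dt;
}

// Spread the per-frame update across all hardware threads instead of one core.
void update_entities(std::vector<Entity>& entities, float dt) {
    const std::size_t workers =
        std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = (entities.size() + workers - 1) / workers;

    std::vector<std::thread> pool;
    for (std::size_t w = 0; w < workers; ++w) {
        const std::size_t begin = w * chunk;
        const std::size_t end = std::min(begin + chunk, entities.size());
        if (begin >= end) break;
        pool.emplace_back(update_slice, std::ref(entities), begin, end, dt);
    }
    for (auto& t : pool) t.join();
}
```

A real engine would use a proper job system instead of spawning threads every frame, but the point stands: the work scales with core count instead of hammering one thread.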

Bio-organic technology has been studied in Europe since the '70s. It was already being talked about enough to inspire Dick Maas to make the movie "De Lift" in 1983. In the meantime Google, well, Sergey Brin, has been funding the development of bio-organic 3D printers in Maastricht, in the Netherlands, and just across the border in Germany, at the supercomputing centre in Jülich, there is a system running as an emulated human brain. IBM recently demonstrated a similar experiment, but it was limited more to neural network analysis and reconstruction than to an actual holistic emulation approach.

I don't know what's going to happen, but I do know that research is probably a lot more advanced than the public is allowed to know, because as always with these kinds of projects, governments and big corporations will explore the possibilities for ultimate weaponry and acquiring the power to take over the world before thinking about letting citizens make better toasters out of it.

Hhmmm. I'd love to see links about that sort of info. =)

I don't know if the Xbox 360 has been a reflection or a demonstration, but it proves the point quite well. People (consumers) don't want to upgrade hardware, they want software optimization. If you compare launch titles on the 360 to current games like GTA 5, the difference in quality is night and day. So as long as big companies like Microsoft and Intel can see that they can use the same hardware and still get an increase in quality purely through optimization, they won't need to upgrade. But just imagine if the gain were two-fold...

That same kind of software optimization, which causes a massive difference in image quality over the life of a console, simply won't happen with the next generation of consoles. They use x86, which the majority of developers are already very used to. This means the launch titles will probably look much the same as games released at the very end of the console's life cycle.

Kind of like when a GPU manufacturer relaunches a series of cards using the same architecture as before: any performance gained from new drivers is negligible, because the drivers were already properly optimized in the first place.

In short: no, I don't think AMD will go out of business anytime soon, because:


1.) They have been effectively reducing, and are now reversing, their quarterly losses by cutting non-viable and less important parts of the company.

2.) They currently have a competitive if underrated product range and reasonable market share, and they are introducing a range of new competitive products and technology solutions such as ARM solutions and processors for low-power devices (which is a huge industry). Also, GPGPU-accelerated processors are probably going to be big soon, especially with OpenCL gaining traction, and AMD has a competitive advantage there; this should also help their entry into the low-power and existing server sectors. They also have an x86 license, so they could implement something like what NVIDIA was trying to do with Maxwell to make it less dependent on the CPU and increase non-parallel performance.

3.) There is the serious possibility that if Intel somehow managed to put AMD out of business, Intel would be split into pieces as an anti-monopoly measure and its x86 license opened up more.

________________________________


Also, in regard to what Zoltan is saying, I think the computer industry is about to change a lot with the emergence of SATA Express (using PCIe lanes), Thunderbolt (using PCIe lanes), PCIe 4.0 and DDR4, vertically stacked memory, and the rise of GPGPU and APUs with technology like hUMA and lower-level access. It looks like PCIe lanes are going to become some sort of standard data lane for peripherals and components. There is even talk of 128GB DDR4 RAM modules.

No, they are not going out of business anytime soon. They own the console market and have a great price/performance ratio, plus their new GPUs are really good bang for the buck as well. Personally, 99% of my computers since the K6-2 550MHz chips were released have been built or bought with an AMD processor, and as long as they get their Linux drivers up to spec (the new beta drivers are doing a lot better), I will keep buying and building with AMD. In fact, I just rebuilt my machine yesterday with an AMD A10-5800K, an MSI FM2-A75MA-E35 motherboard, and Team Vulcan 8GB (2 x 4GB) 240-pin DDR3-1866 RAM, and it only cost me $254.97, which is a damn good value for an everyday system that can do some decent gaming until I can afford a decent GPU.

I totally agree with the anti-monopoly thing. Intel doesn't want to put AMD out of business; otherwise it would be split up, its fabs might have to be opened up, and the same problem it had with AMD might happen all over again.

As for SATA Express... check out SATA 3.2, because that's a really interesting thing: 16Gbps using two PCIe 3.0 lanes over a single cable.

And I agree with everything you've said. I also think that we'll soon see everything run over PCIe, from USB to SATA and everything in between. Heck, maybe even video cables as well. Optical cabling for data in everyday connectors might eventually become common too, but that'll take some time. Still, imagine it: 400Gbps over a single fiber-optic cable would be epic! =)
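Just to put rough numbers on that SATA 3.2 figure (my own back-of-the-envelope sketch, assuming 8 GT/s per PCIe 3.0 lane with 128b/130b encoding, and a raw rate that simply doubles each generation):

```cpp
#include <cstdio>

// Approximate usable bandwidth per PCIe lane, in Gbit/s.
// Assumes 128b/130b encoding (PCIe 3.0 and later) and a raw rate that
// doubles each generation: 8 GT/s for 3.0, 16 GT/s for 4.0.
double lane_gbps(int generation) {
    double raw_gtps = 8.0;                 // PCIe 3.0 baseline
    for (int g = 3; g < generation; ++g)   // double per generation
        raw_gtps *= 2.0;
    return raw_gtps * 128.0 / 130.0;       // encoding overhead
}

int main() {
    // SATA 3.2 / SATA Express: two PCIe 3.0 lanes.
    std::printf("2x PCIe 3.0 lanes: %.1f Gbps\n", 2 * lane_gbps(3)); // ~15.8
    // The same two lanes on PCIe 4.0 would double it.
    std::printf("2x PCIe 4.0 lanes: %.1f Gbps\n", 2 * lane_gbps(4)); // ~31.5
}
```

So the "16Gbps" is really about 15.75Gbps of payload bandwidth per pair of lanes, and the same two lanes on PCIe 4.0 would double that.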

That is what I am getting at with the PCIe lanes. From memory, PCIe 4.0 is twice as fast as PCIe 3.0. By tying these data lanes (peripheral connections) into PCIe, they get direct access to the CPU, bypassing add-on controller chips, and their performance increases are tied to the PCIe lane generation. I also think this is a step towards a more universal serial bus, so to speak, even for Ethernet, though possibly in different performance grades. This whole idea is revolutionary. Interestingly, IBM holds patents on implementing fiber optics inside the processors themselves ^.-.

You could have maybe ten 16Gbps SATA ports using one PCIe lane each, so you'd only use ten lanes. Or, more interestingly, the lanes could pass through some sort of controller interface: for example, you could have ten lanes dedicated to ten SATA ports, but route them through a controller that has, say, four lanes going to each port and allocates lanes according to how many SATA ports (or whatever interface) are actually in use. Because I assume you're going to run into lane limitations, you'd want "managed lanes", so to speak. To me this sounds like a much better way of handling data transfer between components than what is used now.
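Something like those "managed lanes" could look roughly like this (a purely hypothetical sketch of the allocation logic, not any real controller): a fixed pool of upstream lanes gets handed out to whichever ports actually have something plugged in, and returned when the device goes away.

```cpp
#include <map>
#include <string>

// Hypothetical "managed lane" controller: a fixed pool of upstream PCIe lanes
// is shared out among whichever downstream ports are actually in use.
class LaneController {
public:
    explicit LaneController(int total_lanes) : free_lanes_(total_lanes) {}

    // A device shows up on a port and asks for up to `wanted` lanes.
    // It gets as many as are still free (possibly fewer, possibly zero).
    int attach(const std::string& port, int wanted) {
        detach(port);  // re-attaching a port frees its old grant first
        int granted = wanted < free_lanes_ ? wanted : free_lanes_;
        free_lanes_ -= granted;
        allocated_[port] = granted;
        return granted;
    }

    // Device removed: its lanes go back into the pool for other ports.
    void detach(const std::string& port) {
        auto it = allocated_.find(port);
        if (it == allocated_.end()) return;
        free_lanes_ += it->second;
        allocated_.erase(it);
    }

    int free_lanes() const { return free_lanes_; }

private:
    int free_lanes_;
    std::map<std::string, int> allocated_;
};

// Example: a 10-lane pool feeding SATA-style ports that want 1 lane each,
// plus a hungrier device asking for 4.
// LaneController ctrl(10);
// ctrl.attach("sata0", 1);  // granted 1, 9 lanes left
// ctrl.attach("nvme0", 4);  // granted 4, 5 lanes left
```

Real PCIe switches negotiate link width during link training rather than on request like this, but that's the general flavour of it.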

Yeah. It does sound pretty awesome. Although why do we still need all those pins for PCIe anyway? If we just used fiber-optic cables or lasers for data communication, and let the motherboard or PSU handle the copper (or graphene) for electrical power, it would be so much better! =)

Yeah, that would be cool, but I think it's because we already have PCIe 3.0 and 4.0, and implementing fiber into the CPU requires new technology. It would remove bandwidth issues (and IBM has patents with this and this). It's pretty cool that we can keep doubling the PCIe bandwidth every generation, though.

There is a lot of incentive for IBM to integrate fiber into their systems like this: 1Tbps per lane is really a lot (imagine what you could do with that...), and it greatly reduces latency and apparently cuts power usage a lot too, so there is a big incentive for them to develop it.

Relative to IBM they did.

They could still be around like Freescale, but not able to compete in PCs or servers.

What if the 8350s were multithreaded, 2-way or 4-way? That would put AMD ahead in multi-threaded stuff.

"Maybe make an analog-compatible RISC-like technology based on Graphene, Open-Source design."


PowerPC?

I am already used to a 70% performance loss when emulating x86. I used to take parts from PowerPC console games and emulate the graphics from PC games. Now that that will not be possible, using my 7970 will be like using an Intel integrated card. So it's back to x86 PCs.