AMD's strategy with Mantle, TrueAudio, SteamOS and more

The one with the most patience wins!

Oh, the seemingly inevitable push toward mobile computing. Certainly a majority of the tech-consuming populace of the earth will salivate as they lap up the latest tablets and phones. They don't care that mobile devices are incapable of authoring and rendering full 3D scenes for, say, an independently produced film. Likewise, they don't mind how inadequate touch-screen devices are for detail-oriented work of any kind. The divide between enterprise users and the general public is not as cut and dried as "people who need keyboards and mice" versus "those who don't."

My point is, the future is not held hostage exclusively by mobile computing. We have a deeply important need to keep embracing desktop computing for the sake of the advancement of technology itself. The next generation of system administrators is not going to learn networking by pairing their tablets together. That is one example, among others (which I struggle to think of while making a feeble attempt at writing this on my Nexus 7 at McDonald's), of how touch screens and voice commands cannot properly educate young tinkerers and hobbyists.

I believe mobile computing is here as an addition to the world of silicon rather than a replacement. The markets may sway with the proportion of people backing certain technologies. That's OK, but as a power user I still get excited about, and will continue to support, advancements in PC technology, because this is where our future tech-savvy geniuses are born.

It's beautiful! I see Mantle as AMD's light to the gaming world in the darkness of Microsoft. Mantle is WAR with Microsoft: by making it open source to everyone, and having it help with next-gen consoles as well as PCs, we will FINALLY see the power of two-year-old technology shine, like it should have two years ago! (I'm looking at you, Microsoft.)

Partly true, but two things:

- In the beginning of x86 "PC" computing, enthusiasts like me didn't get a PC right away, because it wasn't powerful enough. My first computer was a self-built UNIX box with a bunch of Motorola CPUs soldered together in series through air-bridge technology, and it was, by the standards of that time, a supercomputer by definition. Result: I could render fractals and Julia sets faster than anyone else. Not very useful, but very powerful, great for learning (I was a little kid at the time), an enthusiast's dream, with a practical benefit of absolutely zero. So I bought an IBM 5150 PC from a bankruptcy sale to have a computer that I could actually use, and I used that until superscalar x86 CPUs came out, so that's a very long time with an 8086+8087 at 4.77 MHz.

So low-powered devices may seem unsuitable now, and there will be a transitional period just like there was when the PC arrived, but in the end the practical solution with the broader user base will prevail. That's already happening: the device user base is already equal to the PC user base, but the PC user base is in steep decline while the device user base is still growing.

You also shouldn't forget that not every device is a phone or tablet: TVs are devices now, as are watches, PCs-on-a-stick, NAS boxes, routers, and so on. Counted as computing machines, most PC users who think the PC is never going away probably already own more "devices" than PCs right now.

As a sysadmin, you don't need a PC to SSH into your server boxes; you can do it just as easily from a tablet. Tablets often have better screens than laptops, and there are some really good keyboards for tablets. I usually carry a quite powerful dev-board PC with a quad-core ARM that I can interface with pretty much everything: instant results for controlling a bunch of stuff, much faster than any PC, where I would need an add-in I/O board instead of the dev-board's small built-in I/O. I run Fedora 19 on it, and it works great; I'm not missing anything for normal usage in comparison with a PC.
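For what it's worth, scripting that kind of remote administration from a tablet or ARM board is trivial too. Here is a minimal sketch using Python with the third-party paramiko library; the hostname, username, key path and command are made-up placeholders:

```python
# Minimal sketch: run a command on a remote server over SSH from any small
# Linux device (tablet terminal app, ARM dev board, etc.).
# Requires the third-party paramiko library; the host, user, key path and
# command here are placeholders, not real infrastructure.
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(
    "server.example.com",
    username="admin",
    key_filename="/home/admin/.ssh/id_rsa",
)

stdin, stdout, stderr = client.exec_command("uptime")
print(stdout.read().decode())

client.close()
```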

Together with the change that is coming in display technology, towards "personal video" and the 3D-display/environment integration that is already being developed, there is no way the PC market is going to remain a large consumer market. PCs will still exist, and keep evolving, but they will become much more expensive.

Also, video rendering on small Linux devices is possible. PCs are held back by Windows and proprietary binary graphics blobs, but devices aren't, and there are already Android apps like Magisto that do video editing, almost in real time, on the device, with functionality that PC apps don't even have. Not everyone can use Adobe Premiere, but even a small child can use Magisto. Adobe is now giving Lightroom away with Photoshop on CC, not because sales are so good, and not because all photographers have migrated to Linux and are using Darktable, but because a lot of photographers are using Android apps, and to be honest, some of those Android apps are really good. Some Android phones have 4K video cameras, but the investment needed for a PC that can produce (or even view) 4K video is huge. Might as well edit it on the phone itself, and that is often faster than on a PC, because there is no data copying going on. Linux devices, even ones as under-powered as ARM phones and tablets, still perform great because they don't have Windows to drag along: the codecs needed for rendering video in this or that format are built into the cheap ARM SoC hardware, so efficient rendering is a given, and you don't have to buy 3000 USD worth of proprietary overkill to render stuff like you do on a PC.
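To make the "codecs built into the SoC" point concrete, here is a rough sketch of handing a transcode to the hardware H.264 encoder on a small ARM board. It assumes an ffmpeg build with the V4L2 memory-to-memory encoder enabled; the file names and bitrate are placeholders:

```python
# Rough sketch: transcode a clip using the SoC's hardware H.264 encoder
# rather than the CPU. Assumes an ffmpeg build with V4L2 M2M support,
# as found on many ARM boards; file names and bitrate are placeholders.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-i", "phone_clip.mp4",    # footage straight off the device
        "-c:v", "h264_v4l2m2m",    # hand video encoding to the SoC's codec block
        "-b:v", "8M",              # target bitrate; the CPU stays mostly idle
        "-c:a", "copy",            # pass the audio through untouched
        "edited_clip.mp4",
    ],
    check=True,
)
```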

Agreed. The Desktop PC isn't dead; we just need to see meaningful improvements that aren't held back by slow hardware or slow software.

Remember the 2003~2008 "explosion" in PC performance, with multi-core and "moar GHz"? That was a thing, and it worked only because both software and hardware were improving, and we were seeing great reasons to invest in better hardware. Software saw big improvements... but now we're kind of stuck. With no significant hardware improvements on the CPU side (and memory too), no software has come along to push things any further. An example: if you have an i7 2600K, the only reason to buy a new i7 4770K is to get a motherboard with a newer feature set. The performance is nearly the same, and with overclocking you might get nearly identical results.

Given that, it's hard to imagine software devs ever making software that uses more power until hardware manufacturers start shipping hardware with extra CPU horsepower that software could actually need and/or significantly benefit from.
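To illustrate the chicken-and-egg problem: desktop software only benefits from extra cores if it is explicitly written to use them, which devs tend to skip while the mainstream ceiling sits at four cores. A toy sketch of that opt-in parallelism, with a made-up stand-in workload:

```python
# Toy sketch: CPU-bound work only scales with core count when the code is
# explicitly parallelised; crunch() is a made-up stand-in workload.
import time
from multiprocessing import Pool, cpu_count

def crunch(n):
    # stand-in for any CPU-bound task (encoding, physics, compilation...)
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [2_000_000] * 16

    start = time.time()
    for n in jobs:                       # the default: everything on one core
        crunch(n)
    print(f"serial:   {time.time() - start:.2f}s")

    start = time.time()
    with Pool(cpu_count()) as pool:      # spread the same work across all cores
        pool.map(crunch, jobs)
    print(f"parallel: {time.time() - start:.2f}s")
```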

Woah. Big comment. Took a while to read, but overall very, very good. +1 cookie point for you. =P

But you're right. The market has historically migrated in the direction you've mentioned. One thing to consider, though, is how Microsoft will react to this. Even though they are shifting towards devices and services, you've got to remember that those things aren't their main source of income. Even if they do buy Nokia, so what? Windows Phones just aren't selling. Windows RT tablets aren't selling. The Surface took forever to sell, and the only good news in a *very* long while for Surface is the success (or so I've heard) of the Surface 2.

Portability is a thing. But that's because PC hardware and PC software aren't pushing each other along. Without any meaningful benefit from better hardware, no developer can write software that uses said better hardware. Gaming has always pushed things to their limits, and only the high-end gaming desktop PC has managed to keep up its sales recently. That's AMD's core market, so AMD won't just ignore it, because AMD is in dire straits (financially) and cannot afford to ignore its core market the way Microsoft could (and did!) with Windows 8.

But you do have a point. People are moving away from desktops. I don't think it's just because they're using their phones, though, or just because they already have desktop PCs which do what they need. I think it's because software devs aren't writing software which demands more hardware power.

A software dev thinks of sales. Making software that needs more powerful hardware can get people excited, but it also limits the market of people you can sell your software to, and that's bad for someone trying to sell something. And with CPUs not getting much faster every generation, we've seen software just plateau. Nobody is writing software that could meaningfully benefit from faster CPUs, or that needs faster CPUs, because the only way you could get that is with a 12-core Xeon. We're stuck at 4 cores, because Intel doesn't have any competition and doesn't feel a need to make anything better. And because it isn't selling anything better than before, people aren't seeing a reason to buy anything new. So Intel is shooting itself in the foot as we speak: it crushed the competition so thoroughly that it stopped competing, and now it's only a matter of time before people migrate towards other hardware solutions, whether that's ARM or something else entirely. (I've heard of Chinese companies making new x86 processors faster than Intel's, whilst using AMD's chipset! And I've also heard of a startup making 3.0 GHz ARM chips that might give Intel a run for its money!)

So with PhysX getting new stuff (http://www.bit-tech.net/news/hardware/2013/10/17/nvidia-announces-flex-unified-gpu-physx/), I guess we'll see more of the same from Nvidia, and it's obvious they're not going to play ball even if they get the choice. Keep on dividing the gamers; that's productive.

Guess that just means the old frustration of seeing the Nvidia logo in games is coming back, with a vengeance :p

http://community.amd.com/community/amd-blogs/amd-gaming/blog/2013/10/17/the-four-core-principles-of-amd-s-mantle

Sure, sure. But for that to take off, you kind of do need actual optimizations for other architectures, i.e. Nvidia's, and AMD is not going to waste time and money on that themselves, so for it to be a success it would require Nvidia to take part. I don't see that happening anymore.

Maybe some random devs can do some optimizations, but to have a real advantage Nvidia would have to do the dirty work.

Seems like the world of gaming will just stay divided.