AMD's strategy with Mantle, TrueAudio, SteamOS and more

https://teksyndicate.com/forum/pc-gaming/amds-strategy-mantle-trueaudio-steamos-and-more/155972

(This started off as a forum topic, but a friend of mine who's a bit more experienced with the Tek forum recommended I put the topic in the blog area. So here it is.)

This is pretty much me babbling on about my own thoughts and opinions regarding AMD's recent decisions, and my attempts to explain why they've made certain decisions. (Don't ask me for the references and links. There are too many, it's nearly 7AM and I've been working all night and haven't slept, and I'm lazy enough I could make Homer Simpson blush.)

First, let me start off by explaining some of the background info regarding their recent decisions:

AMD has attained the "holy trinity" of video game consoles: Sony's PlayStation 4, Microsoft's Xbox One, and Nintendo's Wii U. This gives them a HUGE advantage in the gaming market, since cross-platform games that run on both consoles and PCs now come with the built-in benefit of being optimized for AMD hardware.

Since the two main consoles (Xbox One and PS4) both use an eight-core CPU design paired with an AMD GCN-architecture GPU (the same architecture as the HD 7000 series and the R7 / R9 200 series), it's safe to speculate that we'll see OpenCL programming targeting the GPU (well, it's basically a glorified APU, but whatever...), and that games will be optimized for more cores, meaning better multi-threaded code across the board.
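(Just to make "better multi-threaded coding" a bit more concrete, here's a tiny generic C++ sketch of the idea: split a frame's worth of work across however many hardware threads the CPU has instead of hammering one core. This isn't from any console SDK or AMD toolchain, it's only an illustration.)

```cpp
// Generic illustration only; not console SDK or AMD-specific code.
// Splits a per-frame workload across all available hardware threads,
// the kind of thing an engine tuned for eight-core APUs has to do anyway.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <thread>
#include <vector>

void update_range(std::vector<float>& data, size_t begin, size_t end) {
    for (size_t i = begin; i < end; ++i)
        data[i] = std::sqrt(data[i]) * 0.5f;   // stand-in for real per-entity work
}

int main() {
    std::vector<float> entities(1000000, 4.0f);
    unsigned threads = std::max(1u, std::thread::hardware_concurrency()); // e.g. 8 on a console-class APU
    size_t chunk = entities.size() / threads;

    std::vector<std::thread> pool;
    for (unsigned t = 0; t < threads; ++t) {
        size_t begin = t * chunk;
        size_t end = (t == threads - 1) ? entities.size() : begin + chunk;
        pool.emplace_back(update_range, std::ref(entities), begin, end);
    }
    for (auto& th : pool) th.join();

    std::printf("threads used: %u, sample result: %f\n", threads, entities[0]);
}
```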

Also worth noting is that Windows 8 has really cooled enthusiasm in the PC market. It's been predicted that Windows 7 will become the next Windows XP, since Windows 8 isn't living up to the hype. Windows 8.1 promises to fix this, but AMD can't afford to wait around. Even companies like Lenovo are bundling a Start Menu program to help users navigate Windows 8. In other words, even Microsoft's OEM partners have recognized what a bad idea Metro UI/Modern UI was, and how badly it's affecting their sales.

AMD has been in financial trouble lately, hemorrhaging money like there's no tomorrow. The GPU and APU markets are what AMD does seem to "get" right now, and you can bet they're counting on them to mount a comeback against Intel and nVidia.

AMD has always been a big supporter of open source. They've had decent drivers for Linux for some time now. But Windows 8 has shown that Windows might end up being the pitfall that ruins the desktop PC industry: sales are down and keep falling, except in the high-end gaming segment. AMD would be absolutely ruined if the desktop PC market crashed, since it's been known for a while that AMD doesn't perform well in laptops/notebooks, and AMD can't field a good ultrabook-class competitor to go head-to-head with Intel right now.

And well, that's the background info. It's a lot, I know, but it sets up why AMD is choosing to do what it's doing right now:

With the announcement of SteamOS, AMD has a big advantage: they don't have to do a lot of extra work to deliver great drivers for Linux, which gives them a short-term edge. And with the "console trinity" under AMD's belt, they can deliver a new API to help bring gamers (their core consumer market) over to Linux and replace DirectX (which is Microsoft Windows exclusive, for those who don't know and might have been living under a rock in Bikini Bottom for the past decade).

This is where AMD Mantle comes in. Mantle means game devs can get better performance, with behaviour that's more consistent across different platforms (consoles and PCs), which means less time developing, which means cheaper games to develop (which means lower sale prices, or more profit... no points for guessing which one EA execs will pick).
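(A quick aside on *why* a "thin" API helps: the usual claim is that you record GPU work into command buffers up front, possibly on several threads, and then submit them cheaply, instead of paying driver overhead on every single draw call. Mantle's actual interface hasn't been published as I write this, so the little C++ sketch below is entirely hypothetical, with every name made up by me; it's only meant to illustrate the pattern.)

```cpp
// Hypothetical sketch of the "thin API" idea (all names invented here;
// this is NOT Mantle's real interface, which wasn't public at the time).
// The point: build command buffers once, submit them cheaply every frame,
// instead of paying per-draw-call driver overhead on one thread.
#include <cstdint>
#include <vector>

struct DrawCall { uint32_t mesh, material, instanceCount; };

// A recorded, reusable list of GPU work.
struct CommandBuffer {
    std::vector<DrawCall> calls;
    void draw(uint32_t mesh, uint32_t material, uint32_t instances) {
        calls.push_back({mesh, material, instances});   // cheap CPU-side record
    }
};

struct Queue {
    // Submitting a prebuilt buffer is one call, however many draws it holds.
    void submit(const CommandBuffer& cb) { (void)cb; /* hand the whole list to the GPU */ }
};

int main() {
    // Each worker thread could record its own CommandBuffer in parallel...
    CommandBuffer terrain, characters;
    for (uint32_t i = 0; i < 5000; ++i)
        characters.draw(/*mesh*/ i, /*material*/ 7, /*instances*/ 1);
    terrain.draw(0, 1, 1);

    Queue gfxQueue;
    // ...and the main thread submits them with a couple of calls per frame.
    gfxQueue.submit(terrain);
    gfxQueue.submit(characters);
}
```

(That "record in parallel, submit cheaply" pattern is also exactly why an API like this pairs so nicely with eight-core CPUs.)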

But AMD made Mantle an open thing... why? Well, first, it'll be great marketing down the road, which makes it a solid long-term strategy for boosting sales. It also gives them a short-term advantage, because now nVidia will have to figure out how to optimize their drivers to work with Mantle... but AMD made it, so they have a head start, and it's probably been in their skunkworks for a while now, meaning a lot of optimization has already been going on behind the scenes. But there's another, much more important reason:

AMD wants to attract developers to Mantle. If they made it a closed platform, it would make companies feel locked in, which is bad. It's the opposite of the CUDA philosophy, and it's one of the reasons Adobe (and other content-creation software makers) went with OpenCL. By making Mantle open, they've created something very nice: a platform for developers, especially game developers. AMD now has a whole ecosystem for game devs that's fully featured and ready for both console and PC development. That means less work, less wasted time, and (hopefully) faster results. And by attracting more developers to their technologies, more developers will optimize their code for AMD products, which gives AMD better performance in benchmarks and (more importantly) real-world usage, which translates into a long-term boost in performance and, consequently, sales (driven by marketing pamphlets featuring big graphs of AMD products beating the competition in benchmarks).

With their TrueAudio technology, they can make games more immersive. Sure, it doesn't sound like much, but it'll be that extra little touch PC gamers and hardware enthusiasts will really like, not to mention the audiophile crowd (stay away from my music! keep your ears to yourself! mute means mute!... I had to get that out of my system, sorry guys). It's also a great way to take some of the burden off AMD's weaker single-threaded CPU cores by passing said burden to the dedicated audio DSP on the GPU.
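(I can't quote TrueAudio's actual programming interface here, so take this C++ sketch as purely hypothetical, invented names and all; it only illustrates the offload idea: if a dedicated audio DSP is there, hand the effect processing to it, otherwise fall back to the CPU.)

```cpp
// Purely hypothetical sketch of the "offload audio to a DSP" idea.
// None of these names come from TrueAudio's actual API.
#include <cstdio>
#include <vector>

struct AudioBlock { std::vector<float> samples; };

// Imaginary handle to a dedicated audio DSP (like the one TrueAudio adds).
struct AudioDsp {
    bool available = false;
    void processReverb(AudioBlock& block) {
        // On real hardware this would run on the DSP, costing ~0 CPU time.
        for (float& s : block.samples) s *= 0.8f;
    }
};

void cpuProcessReverb(AudioBlock& block) {
    // CPU fallback: same result, but it eats into the game's CPU budget.
    for (float& s : block.samples) s *= 0.8f;
}

void mixFrame(AudioDsp& dsp, AudioBlock& block) {
    if (dsp.available)
        dsp.processReverb(block);   // burden moved off the (weaker) CPU cores
    else
        cpuProcessReverb(block);
}

int main() {
    AudioDsp dsp;            // pretend hardware detection happened here
    dsp.available = true;
    AudioBlock block{std::vector<float>(512, 1.0f)};
    mixFrame(dsp, block);
    std::printf("first sample after 'reverb': %f\n", block.samples[0]);
}
```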

With this, AMD now has a technology which is (hopefully!) better than DirectX. AMD seems to want Mantle to be a technology that improves performance on multi-core CPUs (like those glorified eight-core console APUs in the PS4 and Xbox One). That means AMD FX CPUs and AMD GPUs now get a slight advantage.

And by supporting SteamOS from the get-go, we'll soon see a lot of the performance bottleneck from Windows go away. This might even (at some point) make 4K gaming at 60 fps (or more!) possible on an R9 290X, or maybe even an R9 280X!

4K gaming might be a very, very interesting thing indeed. We haven't been able to go that route for some time because we didn't have standards that were 4K-capable, whether cables, VESA/DisplayPort specs or monitors. Now we have DisplayPort 1.2 from VESA and HDMI 2.0, and with the new VESA standard we'll also see automatic 4K detection and configuration, so it'll be transparent to the end user.
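(Some quick back-of-the-envelope math on why the new cable standards matter, assuming 8-bit RGB and ignoring blanking intervals, which only push the requirement higher; the effective-bandwidth figures below are the commonly quoted ones.)

```cpp
// Back-of-the-envelope: raw pixel data rate for 4K at 60 Hz, 8-bit RGB.
// Effective-bandwidth figures for HDMI are the commonly quoted ones
// (~8.16 Gbit/s for HDMI 1.4, ~14.4 Gbit/s for HDMI 2.0); blanking
// intervals are ignored here, and they only raise the requirement further.
#include <cstdio>

int main() {
    const double width = 3840, height = 2160, refresh = 60.0;
    const double bits_per_pixel = 24.0;                 // 8-bit RGB

    double gbps = width * height * refresh * bits_per_pixel / 1e9;
    std::printf("4K60 pixel data: %.1f Gbit/s\n", gbps);        // ~11.9 Gbit/s

    std::printf("HDMI 1.4 effective: ~8.2 Gbit/s  -> not enough for 4K60\n");
    std::printf("HDMI 2.0 effective: ~14.4 Gbit/s -> fits (so does DisplayPort 1.2)\n");
}
```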

4K gaming won't come to consoles anytime soon, though, because the consoles just don't have the horsepower, and they don't have HDMI 2.0 or DisplayPort 1.2 connectors either. Microsoft and Sony would have to release updated consoles. And that's before you consider the drop in framerate and the frame latency we'd see if they jumped to 4K just by adding new connectors without adding more GPU horsepower.

AMD decided to do what they did for a very precise, calculated, clinical reason. They saw Windows bottlenecking performance. They saw games that weren't going to appear on PCs because of development costs, now and in the future. And lastly, they saw desktop sales dropping because of Windows 8 and Metro UI/Modern UI. Given all that, going to Linux could help them, but if Linux had nothing to offer gamers and users in terms of games and/or software, it would have been a tremendous flop (like the new BlackBerry OS, which despite being a great mobile OS had almost no apps because it had almost no userbase).

So Valve and AMD scratched each other's backs. Valve made an OS aimed at AMD's core audience, and AMD made the drivers and the development platform needed for a mass migration to Linux, in an effort to increase desktop PC sales and game sales. And it seems very promising, at least so far.

I'm hopeful this strategy works out. AMD has put a lot of effort into Mantle, and TrueAudio seems like a good idea.

If game engines decide to support Mantle en masse (like Frostbite already has, Unity from what I've heard, and others), this might be a very interesting thing. And if we see OpenCL middleware for things like in-game physics, we might also see a lot of improvements in gaming performance.
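(To give an idea of the kind of work such middleware would push to the GPU, here's a generic OpenCL sketch: one kernel launch integrating a big batch of particles. It's not taken from any real physics middleware, error checking is trimmed, and you'd need an OpenCL runtime and a GPU to actually run it.)

```cpp
// Generic sketch of the kind of work OpenCL physics middleware pushes to
// the GPU: integrate a big batch of particles in one kernel launch.
// Not taken from any real middleware; error checking trimmed for brevity.
#define CL_TARGET_OPENCL_VERSION 120
#include <CL/cl.h>
#include <cstdio>
#include <vector>

static const char* kSrc = R"CLC(
__kernel void integrate(__global float4* pos, __global float4* vel,
                        const float dt, const float4 gravity) {
    int i = get_global_id(0);
    vel[i] += gravity * dt;        // apply gravity
    pos[i] += vel[i] * dt;         // step position
}
)CLC";

int main() {
    cl_platform_id platform; cl_device_id device;
    clGetPlatformIDs(1, &platform, nullptr);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr);
    cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, nullptr);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, nullptr);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, nullptr, nullptr);
    clBuildProgram(prog, 1, &device, nullptr, nullptr, nullptr);
    cl_kernel kern = clCreateKernel(prog, "integrate", nullptr);

    const size_t n = 1 << 16;                       // 65,536 particles
    std::vector<cl_float4> pos(n), vel(n);          // zero-initialized
    cl_mem dPos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 sizeof(cl_float4) * n, pos.data(), nullptr);
    cl_mem dVel = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 sizeof(cl_float4) * n, vel.data(), nullptr);

    float dt = 1.0f / 60.0f;                        // one 60 fps physics tick
    cl_float4 gravity = {{0.0f, -9.81f, 0.0f, 0.0f}};
    clSetKernelArg(kern, 0, sizeof(cl_mem), &dPos);
    clSetKernelArg(kern, 1, sizeof(cl_mem), &dVel);
    clSetKernelArg(kern, 2, sizeof(float), &dt);
    clSetKernelArg(kern, 3, sizeof(cl_float4), &gravity);

    clEnqueueNDRangeKernel(queue, kern, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);
    clEnqueueReadBuffer(queue, dPos, CL_TRUE, 0, sizeof(cl_float4) * n, pos.data(),
                        0, nullptr, nullptr);
    std::printf("particle 0 after one step: y = %f\n", pos[0].s[1]);
    // (cleanup of CL objects omitted)
}
```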

Where AMD goes from here is anyone's guess. AMD seems to be holding off on new GPU designs right now because it just needs a product that competes with the GTX Titan and the GTX 780 at a more reasonable price. If it can do that for the meantime and hold out until 22nm and 20nm arrive (in Q2 2014 from what I've heard, or maybe Q3 2014), then we'll see AMD come out with new APUs, new GPUs, and maybe even new CPUs.

Here's what I think Intel and/or nVidia might prepare as a counter-offensive. First, Intel might rethink its strategy of keeping us stuck at four cores forever. If AMD can apply enough pressure to force Intel to add more cores to its mainstream line of CPUs, that might cannibalize Intel's workstation buyers, which could hurt profitability (even if it's only a dent). nVidia might try to entice game developers and publishers with millions of dollars in game deals (AMD has been known for great game bundles so far) for nVidia-exclusive giveaway codes, or they might even pay developers or publishers to only let nVidia optimize their drivers before a game is released... or possibly get the coders to put in-code optimizations for nVidia-based GPUs. This has been well known for some time now, and we've seen it time and time again.

Not to mention that Sony and Microsoft might feel betrayed by PC users getting access to AMD's Mantle technology too, and it will be interesting to see whether Sony or Microsoft will allow game devs to use Mantle in their own proprietary, console-specific OSes.

What intrigues me is that right now most AMD GPU and CPU users are running Windows, which raises the question: how will Microsoft react to Mantle, SteamOS and AMD's new-found alternative to Windows? Will Microsoft play dirty or not? Only time will tell. But we do live in interesting times, at least as far as PC hardware and software news goes.

Anyways, that's all for now. Did I forget anything? (Except the references and links; don't bother asking me for those, it'd take a bajillion-and-a-half milliseconds for me to look that up and who has the time anyways?... pay no attention to the man behind the screen who just typed a metric butt-ton of text.)

Sorry for the colossal text, guys. But in the words of the eternal bard Shakespeare: "To be, or not to be? tl;dr yolo"

** UPDATE ** : AMD recently announced that Mantle is *not* going to work with XBox One. It's meant for PCs, but since AMD has not announced whether or not Mantle works with Sony's PS4, we might just get a surprise, although I wouldn't hold my breath.

http://www.techpowerup.com/192549/xbox-one-doesnt-support-amd-mantle-api.html

http://www.techpowerup.com/192552/amd-explains-why-mantle-doesnt-work-on-xbox-one.html

Great article. I think you hit every point that everyone has been thinking about for next-gen gaming. Personally, I believe that if Microsoft is smart, they'll let Mantle on their consoles, mainly because SteamOS is not going to be ready for the masses, since game dev software is made mostly for Mac and Windows. I don't think Linux will have the backing for another 4-6 years, so Microsoft will probably go with the flow. Sony, on the other hand, has the big advantage on next-gen consoles, so they'll most likely go with the flow and use Mantle, since it'll be to their advantage. Consoles for the moment are still more convenient than PCs, mainly because users don't want to spend the time to actually buy good parts and instead spend double the amount on a prebuilt PC, which isn't worth it. That's mostly the fault of computer manufacturers not really marketing to the right people or teaching users how to build PCs; it's not difficult, but for many it is, and it's quite confusing. Long story short, I feel like next-gen AMD products will be quite successful in the market and will bring AMD back into competition with Intel and nVidia.

Thanks, dude. First comment after a few days. I felt like I'd never get a comment on this article, even though I thought it was rather good.

Personally, I think Microsoft is going to play dirty and try to stop Mantle at an OS level on the Xbox One, but not on Windows. I think they'll wait and see whether the PlayStation gets a huge advantage from it, because if the PS4 runs much better thanks to Mantle and its GPU while Microsoft refuses to allow Mantle, it'll keep games from running as well as they could. And given that more PS4 consoles have sold than XBones, I think we'll see more game devs working with Mantle rather than DirectX 11 (assuming it catches on with nVidia and Sony does allow it).

This is an article I just read: http://www.polygon.com/2013/10/12/4826190/linux-only-needs-one-killer-game-to-explode-says-battlefield-director

It says Linux only needs one big AAA game to "explode" and become popular. Remember that the Frostbite engine is going to be used in a ton of games: Mirror's Edge 2, Star Wars: Battlefront, Call of Duty: Ghosts, Battlefield 4, a possible new Mass Effect game... if there's support and Mantle makes them run better and faster, then nVidia is going to have to come up with a better alternative, or adopt it... or take the beating in the benchmarks.

I think Microsoft might try to play dirty even on Windows, but not at first. If it can stop something before it becomes a big thing, it saves itself trouble in the long run. Microsoft knows that; that's why they made sure to create the DOCX format, to keep Office documents as incompatible as possible with other software.

With Mantle, if we see Qualcomm, Samsung and other ARM companies adopting it, game development will become very easy from an API standpoint. The x86 code will still have to be ported to ARM, but that's still less work than getting something working with a brand-new API and codebase altogether.

I hope AMD can come up with some great new stuff. Anyways, thanks for the feedback. =)

* Edit: Other games for Frostbite 3 include: Command and Conquer, Need for Speed: Rivals, Plants Vs. Zombies: Garden Warfare, and Dragon Age: Inquisition.

I think AMD is doing everything right in its effort to attract developers and more market share. But Nvidia has its own APIs, and it isn't going to let Mantle go unanswered. We simply don't know how much of an improvement Mantle offers. Nvidia users are not at a loss, certainly not where Battlefield 4 is concerned.

What an interesting article. Great read. Thanks! I really have nothing more to say past that, as I totally agree with what you're saying. Gaming really does look like it's heading to Linux soon enough, or at least to SteamOS, and hopefully Mantle will boost all of that. It's very true, though, that neither Intel, nVidia, Sony nor Microsoft are just going to let these things go unanswered, so it will be interesting to see how the coming "battle", as it may be, plays out. I think AMD has the advantage in the innovation they're showing, but they'll be fighting an uphill battle given the companies they're going up against. Hopefully, in the end, it will benefit us all. I'm just curious how long this you-scratch-my-back-I-scratch-yours between AMD and Valve will last in terms of Linux support, Mantle and SteamOS.

I absolutely agree. I would hate to see my favourite games divided by this jealous rivalry.

I believe AMD have shot down the rumor that Mantle is open source, stating that it only works with the GCN architecture, so it's not likely Nvidia will be able to adopt it.

Nvidia users definitely don't need to be at a loss, since Nvidia could simply embrace Mantle, especially if it turns out to be better than DirectX (which I think isn't too hard). If it's open, Nvidia can profit as much, or at least almost as much, from Mantle as AMD. If they're going to be a bastard about it, though, they'll invent their own version and keep it closed, attempting to keep the gaming world separated like they did with their bitch move with PhysX (yes, it was a bitch move). I'm not sure (haven't read the entire article tbh, though it was interesting and good material from the parts that I did read, just don't want to spend that much time reading right now :p) if Mantle is actually open source or not (or if it's going to be at some point), but if it is (or will be), there's nothing stopping Nvidia from participating and getting it optimized for their cards as well.

It'd be in Nvidia's best interest to play along: it'd save money and time, money they don't have to bill their customers for. That means happy customers; their cards work "just as well" with Mantle as AMD's, and they're not getting billed (too much) extra on top of what they're already paying.

Mantle using multiple cores whenever possible, even if the game isn't necessarily multithreaded, is only a good thing, so Nvidia had better get on board. Creating something similar would be good for Nvidia customers, but would probably end up in the same mess PhysX brought. Nobody needs three alternatives to DirectX. OpenGL seems like it can't keep up with DirectX (at least it hasn't been able to in the past), so here's hoping Mantle can, and maybe even beats DirectX so we get something better.

This is chess. Lots and lots of chess...

(or turnbased strategy gaming)

Thanks. It's great that one of my first serious articles (first blog post here on the Tek) is getting such good feedback. I was worried for a while that nobody figured it was any good, but it seems over the past few hours the article has got a lot of good feedback. Makes me feel like I should Google search the references I made throughout the article and put them in there. =P

I do hope AMD and Valve keep working together. They seem to get along nicely. And remember, Valve has the biggest game distribution network around. Microsoft could thumb its nose at AMD and Valve, but that would only make more gamers hate Microsoft and add fuel to the fire. It would also be anti-competitive, and given past lawsuits in the US and EU, it's unlikely Microsoft would take such a bold move with so much potential to backfire.

AMD is the *only* company right now that could have pulled this off. It supplies the hardware for all three major consoles, it has heavy ties to gaming, it has heavily multi-core CPUs (even though they're quite outdated, and their chipset is past-proof), and it also has pretty good GPU compute (OpenCL) and good graphics (and drivers). All of that together meant they could create a good alternative to really make the most of their GPUs. And fortunately, they did... though for now it's being held back by Windows and DirectX. Not for long.

With AMD and Valve betting that SteamOS and Mantle together can overcome the Windows bottleneck, we could see a new age of computing. Heck, with SteamOS I just hope they throw away 32-bit altogether and make 64-bit the norm. That'd be a heck of a lot better than dragging 32-bit compatibility along, and it'd also allow for better driver compatibility and more usable memory, and it'd give game devs a reason to move their games from 32-bit x86 Windows builds to fully 64-bit builds. =)

Game theory!

Is this a suitable pun? Chess related/business strategy.

Interestingly, nVidia is going to include ARM processors inside its GPUs from Maxwell onwards. They acquired a company which had code to convert x86 instructions to ARM instructions, but got into legal trouble with Intel. It will be interesting to see what AMD does here.

I hope you can go back and finish reading the article. =) I put a lot of effort and time into it. (I didn't have the links because this was just from memory... I read a lot of tech articles and news.)

I absolutely agree. PhysX is used in very few games, and APEX as well. No matter how good they are, they're still proprietary, and making games run worse on other systems isn't good for business. It makes gamers feel left out that they can't get the most out of their system because they don't have the "right" GPU, and few gamers actually keep two high-end desktops or two high-end GPUs lying around just to play a game that runs better on one card versus another.

And when this happens, gamers often choose not to buy that game, because they get the feeling it won't look as good or run as well without a certain kind of GPU. That gives the gamer the sense that on their system, that particular game will run as a "second-class game", even though it might be a AAA title with stellar graphics! In the end, the gamer may feel that because their hardware can't run the full game, what they end up playing is somehow less than it should be. That's a negative customer experience, and it makes gamers feel worse about their purchase. A bad, very bad sales strategy.

If I buy a AAA game and I hear that I'm not getting 20% of the effects because my hardware isn't from the right brand, I'll feel bad and somewhat jealous that my system isn't running the full-fledged game. And I'll associate that feeling of being "left out" with that game and that publisher. Later, when I look at another game from that franchise, I might feel bad inside (even if I don't remember why). This is more psychology and customer experience than anything, but I've seen it happen with gamer friends and people at game stores, and even at hardware stores (when talking to salesmen).

This is just a bad strategy. When you make other customers feel "left out" even though they bought the full game, you're just making them feel worse about their purchase. It's a self-defeating strategy. And that can be seen in the nVidia PhysX-enabled games list, which is VERY short.

If nVidia is smart (and they are), and if Mantle is as good for performance as promised (and I hope it is!), then nVidia will immediately start building (some) Mantle compatibility into their drivers. They'll probably make it somewhat buggy at first, and maybe try to find ways to claim Mantle itself is buggy, so as to kill it early. nVidia might also want to create something closed-source of their own. But that could only work so well in a Windows environment; on Linux, there's no way closed-source would fly, especially with Valve backing AMD's idea of openness.

Given that, there's a good possibility we'll see nVidia wait a while before they make their next move.

nVidia to AMD in the chess match: "Your move."

Well, I'm not sure, because x86 isn't something only Intel has access to. Remember, AMD uses x86 instructions too. Also worth noting: nVidia could just convert the x86 instructions into ARM instructions in their driver and do it that way. After all, if compilers can do that for free, why couldn't a driver do it in software within an OS?

x86 in hardware may be Intel's domain, but an active software conversion (even if it's less efficient) would be just as legal as a compiler. Also worth noting: the conversion itself could be done on either an AMD or an Intel CPU, and since the hardware would then be executing ARM instructions, it's the ARM license that would apply there.
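(To make the "a driver could translate instructions in software" idea concrete, here's a deliberately tiny C++ toy that maps a made-up x86-style instruction stream onto a made-up ARM-style one. A real binary translator has to deal with registers, flags, memory ordering and self-modifying code, so treat this purely as an illustration of the concept, not as how nVidia would actually do it.)

```cpp
// Toy illustration of software instruction translation (x86-ish -> ARM-ish).
// Both "ISAs" here are made up and massively simplified.
#include <cstdio>
#include <string>
#include <vector>

struct X86Op { std::string mnemonic; int dst, src; };    // e.g. "add eax, ebx"
struct ArmOp { std::string mnemonic; int rd, rn, rm; };  // e.g. "add r0, r0, r1"

// Translate one source instruction into its target-side equivalent(s).
std::vector<ArmOp> translate(const X86Op& op) {
    if (op.mnemonic == "mov") return { {"mov", op.dst, op.src, -1} };
    if (op.mnemonic == "add") return { {"add", op.dst, op.dst, op.src} };   // x86 add is dst += src
    return { {"udf", -1, -1, -1} };   // unknown op -> trap, fall back to an interpreter
}

int main() {
    std::vector<X86Op> block = { {"mov", 0, 1}, {"add", 0, 2}, {"mul", 0, 3} };
    for (const auto& op : block)
        for (const auto& out : translate(op))
            std::printf("%s r%d, r%d, r%d\n", out.mnemonic.c_str(), out.rd, out.rn, out.rm);
}
```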

It seems like at the moment it only works with AMD cards, but there's speculation that "other vendors" might be able to implement it too. I really hope that's the case, and soon. Otherwise it's AMD dividing the game(r)s this time. And it'd probably mean fewer games looking to use it.

 http://en.wikipedia.org/wiki/List_of_games_with_hardware-accelerated_PhysX_support

Long enough for me. And there's some good and big titles in there.

Anyway, it's not just PhysX. I've been running on old hardware, and whenever the Nvidia logo showed up in a game's startup sequence, I knew it wasn't going to run very well (unless the game was old enough). I don't know if Nvidia users have the same issue with games that carry the AMD Gaming Evolved logo. If they do, it's equally bad (though right now there aren't that many games that do :p). Every time I ran a game for the first time and the Nvidia logo showed up, a sigh escaped. I hope Nvidia users don't have the same reaction to the AMD logo.

To be honest, I'd rather have game developers put enough time and effort into optimizing a game to run well on all platforms, even if they spend a bit more time optimizing for a specific one. For some games, general optimization would already be a miracle (GTA 4, anyone?). From the sound of it, Mantle would solve at least some of that. So I really hope Nvidia will be able to play along, and that when they are able, they actually do it. Gamers everywhere can only be so lucky :)

Closed source on Linux works just as well. But most companies just don't want to spend much manpower, if any, on such a small market. If they wanted to, they could make good proprietary drivers and still use a license that allows redistributing them. Redistribution of proprietary drivers is one of the issues on Linux, and one that can be solved with proper licensing. It's not like they're selling the drivers, so I don't quite understand why they shouldn't be allowed to be redistributed. It's only a matter of installing them through the package manager after the OS is installed, so I don't see the big deal.

http://www.geforce.com/games-applications/technology/physx?title=&sort_bef_combine=title+ASC

I guess. Well, according to nVidia's official list (this is GPU-accelerated PhysX, not CPU-accelerated), the list of supported games is shorter.

Among the list, only Hawken, Metro 2033, Metro:LL, Mirror's Edge, Planetside 2, the Batman Arkham game series, Call of Duty: Ghosts, and Borderlands 2 intrigue me. The other games either aren't appealing, or are too old.

I think if nVidia made a version of APEX and PhysX that ran on OpenCL, even if it was horribly written (but functional) and still gave nVidia gamers a slightly better experience, it'd be a more viable technology (in my opinion).

I think closed-source for Linux kind of goes against what it's meant for. Also, given that nVidia would have to make drivers for Linux when SteamOS does come out, AMD is going to have better drivers out of the gate - and the "X-company has better drivers than Y-company" forum flame wars will once again resume. *sigh of disappointment*

Linux won't be such a small market. Not a market nVidia could afford to ignore, anyways. Not if/when SteamOS comes out. If the benchmarks are REALLY good, and Mantle is amazing, nVidia will want in on that action because it'll mean more sales. Remember all the woes with AMD's CrossFire and Micro-Stuttering and the endless debating and whining going on from both sides? That kind of thing will happen again if nVidia drops the ball with Linux drivers. It's bad press and bad marketing to let your competition get an edge on you, especially given all the trouble nVidia has been going through lately to become #1 in GPU sales.

If nVidia doesn't support SteamOS well with good drivers, it'll be ignoring a huge part of the market. And the time needed to get back into that market might cause damage that takes a long time to repair, if it's repairable at all. The bad press a GPU company can get from bad drivers is all too well known from AMD's side. Think of that, but with the tables turned against nVidia: THAT'S why nVidia needs to support Mantle and SteamOS if it doesn't want to drop the ball big-time. Given that DirectX 11 isn't coming to SteamOS anytime soon, it's safe to assume nVidia would have to come up with an alternative, or provide a VERY good OpenGL experience and hope game devs support OpenGL just so nVidia GPUs can run well on Linux (but that doesn't seem to be the direction everyone is heading).

Also worth noting, AMD is out of the high-end desktop CPU market. Their $900, eight-core 5 GHz CPU was a total sales flop, and the FX series of CPUs has pretty much been abandoned. AMD is in no position to compete with Intel in single-threaded performance, its heat output is way too high, and it doesn't overclock as well (well, except against Haswell, because I have a sock that overclocks better than a Haswell if I put it in my clothes dryer with a digital watch - I'm a very punny guy sometimes).

So AMD is hoping that by providing meaningful solutions to their core market (gamers on APUs and gamers on GPUs), they can take back some of the CPU market thanks to more cores, and hopefully win back some of the GPU market lost to nVidia. This is their last-ditch effort, and you can bet they're putting a lot of chips on it. AMD has never gone to much trouble over marketing, but look at the Hawaii event, the cards sent to reviewers, the Mantle announcement, and their willingness to sell the APUs for the Wii U, PS4 and Xbox One at barely any profit. They've got to be betting big, because this has to be part of some long-term strategy of theirs.

Mantle is supposed to help with more CPU cores, and it's built to run on the GCN architecture. Given it's an API, it's relatively safe to assume it'll be out soon so everyone can start using it. But since Maxwell won't be out for another six months or so, I don't think nVidia could rework Maxwell to use Mantle very well - a cunning strategy if I've ever heard one.

If Mantle really is an open API, that means other companies can make their drivers support it. Still, it's safe to assume Mantle will be pretty much custom-made for AMD's GCN architecture from the start and will evolve from there, giving AMD a certain advantage.

Also, if AMD is right in saying Mantle is "open", AMD might control how the API standard is implemented, but other GPU/CPU manufacturers might be able to write their own drivers to support Mantle. If that's the case, it lets AMD control the direction and versioning of Mantle, giving it a bit of an edge, without actually keeping it a proprietary API. If AMD *only* allows it to be proprietary, and makes it so it *cannot* run on anything other than GCN-architecture GPUs, then we've got a huge problem, because it'll be just another glorified PhysX, APEX or (worse yet) Glide. That's why, if AMD wants to create a platform and become a lead innovator in the field (and have its AMD logo show up whenever a game runs Mantle in its engine's back end), it needs to make it open. If it keeps it from everyone else, game devs just won't use it, because their potential audience will be smaller, meaning fewer sales, less profit, gamers feeling excluded, and so on. A bad thing to associate with your game franchise and/or publisher brand.

An OpenCL implementation would benefit AMD a lot more than Nvidia since Nvidia is horrible at OpenCL. So very unlikely. Even then Nvidia would probably just make a different PhysX that worked better on their own cards still, so the gap would still be there :p But that's just speculation as they won't change PhysX.

Closed source might go against what Linux stands for, but it isn't excluded from Linux. Ubuntu and some others ship with proprietary drivers/software even if it goes against those drivers'/software's license, which is illegal (right?). Nvidia has had the edge in Linux drivers for long enough, and it's not like they can't do it; at the moment it's just that they won't. Gaming on Linux won't take off for at least a few years, SteamOS or not, because there's a whole bunch of older games still being played that would require dual-booting or virtualization. And dual-booting is a bitch if it's "not needed". I know I won't be dual-booting for gaming. So they have time to get things in order; in the meantime, people with Nvidia cards will simply stick with Windows or take the punishment. Especially the fanboys: what, get another brand because of whatever? Hell noes. (Which, btw, is fine, they're fanboys and allowed to be. That it's silly come upgrade time doesn't really matter.)

Linux as a gaming platform will have to grow and it won't go superfast. So both sides have plenty of time even if AMD might have the edge for some time.

I doubt DirectX will ever come to Linux. The amount of porting needed is most likely ridiculous. And if it did come to Linux, it most likely wouldn't be DirectX 11, and it might not support older versions as a result of the porting work, meaning only newer games would run. Which would sort of mean that those same games could just as well be using Mantle (and then there'd be a couple more games available). Although that is, yet again, speculation.

The FX-9xxx series was never meant to be mass-sold. They were cherry-picked FX-8350s (well, FX-8xxx anyway), so there wasn't an abundance of them to begin with. They weren't meant to be bought by the average Joe either (which is why the price can be that high), but more by enthusiasts.

Putting such high importance on single-threaded performance when the most taxing thing you do is gaming is, quite frankly, ridiculous. Or, as trends suggest, if you're an editor (rofl), the same applies (get proper software if you're still stuck on single-threaded software, ffs). And I don't know about others, but the times I'm using a single-threaded program that isn't a game, pretty much any modern CPU will do. For heat output there are plenty of better coolers than stock (the FX-9xxx didn't even come with a stock cooler because you were supposed to use something GOOD), but if you didn't research that before it was too late, I can understand the frustration. And yes, it's not good when a chip with a stock cooler at stock clocks still runs too hot.

If you've got a good motherboard and decent cooling, you can clock the FX-8350 to 4.8 GHz (from 4 GHz), which doesn't sound bad for 8 cores (sure, it doesn't help single-threaded performance that much, but you've probably read what I just said about that).

If you've been looking around several "tech" sites, you probably couldn't help noticing that many AMD reviews, be it CPU or GPU, are not that positive, have weird benchmarks, are largely ignored, or have wrong setups and conclusions. I mean, dear lord, it's so bad that a CPU benchmarking site had to bump the 8350 up a couple of spots to reflect reality; suddenly it beat several i5s and i7s that it previously "couldn't". Or reviews with benchmarks that show the parts being as good in some tests, better in others and worse in yet others, which pretty much makes them equal if you don't have a preference among the benchmarked games, and yet the conclusion is "but Intel/Nvidia is a lot better" for no particular reason.

Many tech sites have a clear preference for Intel and/or Nvidia. Pretty much all prebuilt PCs have a clear preference for Intvidia combos (probably partly due to being paid, partly due to ads, and partly due to bigger margins). It's ridiculous how, for many people, AMD is simply not an option, showing either fanboyism or a lack of knowledge. Even in serious budget builds, Intvidia is, or used to be, strongly represented, which for budget builds makes absolutely no sense unless you've got a damn good reason, which most of them don't.

Suddenly power consumption is a huge deal too. AMD has cards out now that beat a Titan for less than half the price (based on launch price, which has held long enough), but hey, they suck because they use more power, never mind that they're half the price. Sure, Nvidia uses less power. They also throttle at 60°C. Not that you'd notice a hell of a difference, I'm just saying: apparently throttling a bit is acceptable (less performance is acceptable? then why do you buy a $1000 card?), but higher power usage isn't.

The reviews and benchmarks might not be so tampered with, or biased, for the new AMD GPUs, but before these, it definitely happened. Among others, we've discovered that Tom's Hardware and Anandtech can be completely ignored, as they don't seem to have the first clue about what they're doing (proven several times each). And those two happen to be the most linked and easily found ones.

The 280(X) was already beating the Titan (from the things I've seen), or at least came very close to it, and yet so many people were afraid of the 290(X) and had a lot of "but we're waiting for benchmarks" out of fright that their brand would get beaten to a pulp. It just came across as a coward's line.

But I went off on a bit of a rant there; let's move on.

AMD will definitely have an advantage in Mantle but that was to be expected if they're the ones building it. And that's ok, as long as Nvidia can start using it soon and well.

I think there might be a difference between something being "open" and being "open source". To me, just "open" sounds like anyone (game devs) will be able to use it, with no or very low licensing cost. Open source would be better, since then everyone could help improve it and Nvidia would get the chance to get their own optimizations in there. Writing their own drivers wouldn't be a problem if it was merely "open" (as I understand it), but the API itself wouldn't be optimized for their hardware.

All an API (basically) is, is an interface, so in theory it *could* work without requiring specific or adjusted drivers (depending on the code behind it). That is, if my understanding of an API is correct enough.
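(To make that concrete, here's a toy C++ sketch of "an API is just an interface": the game only ever talks to the interface, and each vendor's driver plugs its own implementation in behind it. Nothing here is real Mantle or driver code.)

```cpp
// Toy illustration: an API is just an agreed-upon interface.
// The game codes against GraphicsApi; each vendor's driver provides
// its own implementation behind it. No real Mantle/driver code here.
#include <cstdio>
#include <memory>

struct GraphicsApi {                       // the "open" part: the interface itself
    virtual void drawTriangles(int count) = 0;
    virtual ~GraphicsApi() = default;
};

struct VendorADriver : GraphicsApi {       // one vendor's implementation
    void drawTriangles(int count) override { std::printf("Vendor A drew %d triangles\n", count); }
};

struct VendorBDriver : GraphicsApi {       // another vendor could plug in here
    void drawTriangles(int count) override { std::printf("Vendor B drew %d triangles\n", count); }
};

void renderFrame(GraphicsApi& api) {       // game code: doesn't care who implements it
    api.drawTriangles(1000000);
}

int main() {
    std::unique_ptr<GraphicsApi> driver = std::make_unique<VendorADriver>();
    renderFrame(*driver);                  // swap in VendorBDriver and the game code is unchanged
}
```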

Game devs are already excluding gamers for what could be those exact reasons. I've been wanting to play Gran Turismo on PC (without emulating) for a long time now (since GT2 on the PSX). Console exclusives do exactly that: they exclude gamers based on platform/hardware. (They might just be getting a lot of money for being exclusive, in which case the reasons aren't quite the same; I don't know what the case is here, but anyway, game devs aren't always scared of excluding gamers if the compensation or reasons are good enough for them.) Now, excluding an entire group just for using a different GPU would be bad, but if porting next-gen console games to PC is a cakewalk, and if they use Mantle, publishers might just look at PC sales as a bonus on top of what could otherwise have been a console exclusive. With games being distributed as downloads, that could happen, since you eliminate the whole cost and process of burning discs, packaging, distributing to stores, and so on.

In the end, it's way too early to say anything for sure. There's a lot of speculation, ideas and opinions going around, but the reality is that it's mostly wait-and-see right now. Just thought I'd share my views and beliefs.

 (btw, I read the rest of your article as well by now)


Oh. And if 1 Titan fails, just get a second.

(There's 1 person on this forum that'll get it, the rest will probably flame about it :p)

Funny. AMD is/was planning on putting an ARM processor in their APU.

Oo

(I never did find out what the purpose of that would've been, but my guess is it was for low-power states when idle.)

Microsoft threw in the towel on technological innovation decades ago, but they've built a quasi-monopoly on legacy, under-performing software since then nonetheless. OpenGL was a standard before DirectX and has always been the better choice from a technical point of view; DirectX was never an innovative technology, just a marketing name to answer Apple's "Core" branding, but it was closed source and therefore fit for extracting money out of the market. Microsoft is not in trouble: they can buy a lot of companies in cash without financing, because they have a huge treasure chest, and they still make more money for less investment and overhead every single day.

Intel, AMD and nVidia don't have any different strategy than Microsoft. They only support open source/Linux when it's necessary in order to sell their hardware, and they would rather not support any standards that enable technological competition at all; if they could, they would lock their technology down 100% tight if it meant making one more dollar. Intel has been supporting open source much longer and much more than AMD ever has, because that's what it took to conquer the enterprise and government markets. The same goes for IBM and others. Open source is first and foremost an economic model, allowing companies to allocate more assets than they have money to buy. It works, but it also stimulates competition, and historically, in the technology sector, every marketing phase starts with open source technology that is then closed down and rebranded to lock the market. That coincides with a market boom, because companies will invest in marketing for products that are not open source, not for products that are, and marketing is what sells stuff to the consumer market, not technology. The big corporations' market strategy supersedes the level of hardware technology or software standards. They shape the market in a much more invasive manner: they shape the legal principles of a society (which also start out as open principles, like a declaration of rights, and then they build themselves a superimposed legal system on top of that open layer that makes it beneficial for companies and less accessible for ordinary citizens).

Right now, the market is saturated. Making open source a marketing strategy is just a trick to squeeze the last drops of juice out of the already-legacy PC market. Fact is, the PC as we know it is dead. It's blatantly obvious: hardware manufacturers are not making any money. Asus, Acer, HP, etc. are in serious trouble, and they are doing everything they can, including some very extensive hot-air marketing, to sell their products at any kind of profit.

It's getting too hard to please shareholders with proceeds from selling stuff on the PC market. The consumer PC market is saturated, and the technology is too expensive.

AMD's Mantle API is too little too late, it won't make a difference. Why? Because the developers aren't developing for the PC anymore, they're focusing on the post-PC market.

There is a limit to everything; you can't sing the same song indefinitely. Why do laptops still have low-resolution screens? Because there's no money in making laptops with high-resolution screens. Sony learned this 10 years ago when they made the first super-small x86 laptops with a high-resolution screen. They thought they had made the ultimate product for professionals. Guess what: you can convince 12-40 year olds that they need high pixel density, but not the age groups that decide where the money goes, because they can't see the benefit of high pixel density anymore. Just like you can persuade 12-40 year olds to spend thousands of bucks on disgusting-sounding speaker systems and headphones with a lot of bass, but you can't convince the age group that makes the real money decisions that there's any benefit in those at all.

The truth is, you can't sell people any more big screens. Everyone already has at least one, and they're not going to throw 500-5000 bucks out the window after only a few years for the same big screen with more pixels they can't see anyway. But what you can do is provide people with an alternative that is more practical. An analogy from the hi-fi industry would be the Sony Walkman: everyone in the Western world already had a hi-fi set, and Sony created the concept of "personal audio" with the Walkman. That has to come for video as well. The Google Glass project is not a useful product as it stands, but it shows the way: get everyone their own screens.

Of course that concept can't work for PCs, but it can work for devices. And no one needs 4K glasses; eye-focus detection technology was developed by Japanese camera manufacturers in the 80s. With very few pixels, eye-focus detection and a sliding screen, you can offer far more effective pixels to people at exactly the pixel density they want. That technology has existed in open source software for almost 30 years now, it's a standard feature in many Linux operating systems, it's been used by the military with great success since the 80s, it doesn't require faster, bigger and more expensive hardware, it works smoothly on a bloody Hercules graphics card from the 80s, and it will most certainly work on a Mali GPU, in stereo 3D, with all the trimmings, with open source software.

Another thing is the distributed computing model. The market is evolving towards scalable hardware: add low-cost, low-energy hardware to scale up to the performance level a user needs. The PC is just too expensive to run for day-to-day computing needs, because it uses too much power; even an Atom chip with integrated graphics still uses ten times as much power as the acceptable power level for the devices of the future. But Intel will get it right soon, they're working like crazy on it. They are not working half as hard on the next PC CPU; what they have now will suffice for business PC users, who don't require any more power than current Intel chips offer. Quite the contrary: as more and more business users switch to Linux, they need less power to work efficiently. Intel knows that AMD will be the one turning the lights off, and they don't care, because they have the interesting markets: the enterprises and x86 device technology, the Atom. Intel has a huge chance of becoming the next go-to platform on devices if they succeed in making Atom CPUs more powerful, cheaper and more power efficient. Intel has always put full open source compatibility before anything else, and they already have the operating system for it with Tizen, which is OWNED by the Linux Foundation and developed and funded by Intel and Samsung.

AMD "winning" all console platforms is a Pyrrhic victory, the only reason why they won is because they would settle for a very low profit per part. There is few benefit in that for AMD in the long run, because for PC hardware manufacturers, the future is confined to the enterprise market, because that's still a very real market, also in decline, but the decline is much slower, and there is still a long term market for the PC there. But the AMD-consoles are the last flare before the light goes out.

History always repeats itself. How did the PC become the computing standard? Because Intel made the hardware and offered an open operating system to go with it. IBM made it commercial and locked it down, and invested in commercial applications that made it more accessible to non-tech-savvy consumers, with things like BASIC (Bill Gates' first big contract for IBM, the only one he actually developed himself instead of buying from a third party) and DOS (developed by a small Seattle company, bought by Bill Gates for 25,000 USD, and hyped up by IBM).

Intel is doing exactly that all over again. They will make the CPU that complies with all the open source standards, so that it works on the available open source software (their next-gen Atom graphics drivers have already been partly submitted to the Linux Foundation for merging into the next kernel). Then Samsung, and the future other members of the Tizen development group, will market commercial software on top of it. Tizen is Apache 2.0 licensed; it's open source and owned by the Linux Foundation, but companies can develop closed software layers to put on top of it, and those that are part of the Tizen development group can use its resources to bring their hardware drivers into the Linux kernel so they can market efficiently. The possibility of selling proprietary software on top is absolutely necessary, because only a minority of people will invest the time and effort to learn open source software, and the power of open source software isn't something anyone would be willing to put at the disposal of the majority of users, so there has to be a proprietary system that can be marketed to recoup the costs of the open source development. (It has always been like that: those who buy closed source software pay the bills for the development of the much more powerful but less accessible open source software, because that's part of the economic model.)

Don't believe this, just look at where the money goes:

- IBM invests billions in linux development for big computing in enterprises, nothing in any development for consumer computing;

- Microsoft is divesting Windows, focusing on devices and services for devices;

- Google is cancelling ever more "desktop only" products in favour of "made for devices" services;

- nVidia has the money to invest in next-gen hardware development, but they are divesting in their PC products that are not part of the enterprise range, and is investing heavily in the devices platform (but they are also too late to make a difference outside of the US because development of SoCs in China is better suited for the transitional platform, as it is more compatible and cheaper, and after the dust settles, Intel will come out on top anyways);

- Intel has scrapped development of an entire PC-CPU generation in favour of investing in the development of their next-gen Atom platform (which will destroy anything else in a couple of years' time);

- DICE has publicly expressed their wish to migrate their Frostbite engine to the Linux platform, even though they already have AMD Mantle integrated. The reason they want to do this is not because Linux PC users buy more games than commercial-platform users, because they certainly don't, but because devices run on Linux;

- etc...

So I don't care much about PC products anymore; the future is somewhere else. The Sony Walkman conquered the market with crappy headphones at a time when everybody in the West had pretty good hi-fi sets, so the display glasses that will kill the PC market completely won't even need to be that good, they just have to be small enough, and that's something that can be realized in less than two years' time. It took 20 years for portable headphones even comparable to old hi-fi equipment to come to portable devices, and it will take just as long for wearable display devices to evolve to anything near 4K display technology, but that won't hold back their success.

People also use their PC for ever less stuff. Everyone used to love their PC because it had huge storage; now everyone is moving towards ARM-based network storage and USB storage. Everyone used to love their PC because it had a lot of smart applications, like groupware suites and instant messaging, and it was an information aggregator; now the phone is the primary groupware instrument, searches and messages are shifting towards voice-based rather than keyboard-based formats, and phones are much more capable of efficient communication and information aggregation than PCs. Even more "serious" applications, like image post-processing, for instance: yes, the PC is still much more powerful, but it's also part of a workflow that is disappearing, because the input instruments, DSLR and cine cameras, are not as popular anymore. People prefer a high-performance portable solution, and those are often designed to integrate into a devices workflow (or be part of a device) rather than a PC workflow. Steam's new controller is entirely touch-based, not because touch works better than mechanical controls, because it doesn't, but because a lot of the customer base has migrated to touch controls on devices, and that's where the future lies. SteamOS is based on Linux, not because it performs better than Windows on a PC, because Valve couldn't care less, but because devices run on Linux.

So yeah, AMD is a good choice right now, because it costs less and does the same or more than the more expensive solutions from nVidia or Intel, but don't expect too much from any PC product in the long run, because the future is elsewhere.