Isn't AMD a no-brainer choice because of Mantle?

Mantle is actually simpler to implement than it sounds.  UDK is in talks to adopt it, and Unity might adopt Mantle with Unity 5, which will probably be around by the end of the year.  CryEngine is a no-brainer; they care about getting the best performance out of their engine, which far outperforms UDK, so it's a given they'd adopt Mantle, especially since it'll be an easy port considering they've been focused on console gaming lately.

I'm not sure if they specified the hardware. If it was on an APU then I believe my point still stands. Top end GPUs might only get a minuscule gain. There's nothing to say otherwise.

The language used in the demonstration leads me to believe that AMD used the best possible configuration to sell Mantle. "Up to..." a 40% performance improvement. It's not always going to be 40%. They should have shown a range of hardware.

It's not that I don't want to see those kinds of increases. In the long term, we will all get better performance from each generation of hardware. The software side of things is currently holding us back.

At least Mantle is a nicer thing than G-Sync, which is just as over-hyped.  I still don't understand the point: a little screen tearing isn't a big enough deal to spend an extra $200 on a monitor that will be irrelevant within the year, because 4K is dropping massively in price.  $1,000 monitors already; by next CES they'll be around $500-600.  Then again, G-Sync's dynamic refresh rate is the only really nice part, but even that isn't a big deal since many programs don't like dynamic refresh rates.  For development G-Sync is quite useless... it's only for gaming, while Mantle may later be implemented in other things. DirectX does power more than just games.

4K is irrelevant and will stay irrelevant to gamers for a long time. Most people aren't buying $300+ panels because most people don't even have the GPU to power one. 1440p is still out of reach for most people because a mainstream panel still costs $400+, and quality 4K panels are still in the $3,000 range. There is that Dell everyone is hyped about because it's in the $1,000 range. IT RUNS AT 30HZ, PEOPLE. G-Sync makes part of this possible: it allows for low frame rates and smooth performance at the same time. But there is no G-Sync on the Dell monitor. 4K is for editors and millionaires. They were using 2x Titans to power a single 4K monitor. So don't go out and say that 4K will be reasonable next year.

Get the 280X. When you aren't using the PC the card will make majix internet money.

Until we see real numbers on Mantle this is a pricing game.  The 780 Ti and R9 290 are very close; the 780 Ti edges it out in most tests, but even then the margin is small.  If cost is your concern, go for the 290; if you want G-Sync, ShadowPlay, and SHIELD streaming, go with the 780 Ti.

I personally bought a 780 Ti because I believe Nvidia is on the better path of improving the gameplay experience rather than playing a numbers game.  Many people here knock G-Sync, but until you actually experience it you don't realize how much of a game changer it is.

AMD has been, and as far as I can tell always will be, the value winner: lots of power at a good price.  Nvidia's quality, from drivers to customer support, is better right now.  It's like the Mac vs PC comparison in my opinion.  Both do the job, so it's a personal preference of what you want out of it.

Also, if you're going to mine any cryptocurrencies, AMD wins, no contest: Nvidia's CUDA cores need 3 ops where AMD's hardware does it in a single instruction for SHA hashing, which results in roughly a 30-40% lower hash rate.  Nvidia does use less power, but not enough to make it more profitable.
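To put a concrete (if simplified) face on that "3 ops vs 1" claim: SHA-256's round functions lean heavily on 32-bit rotations, and a GPU without a native rotate instruction has to build each one out of two shifts and an OR. A minimal C++ sketch below; the helper names are just illustrative, but the math is straight from the SHA-256 spec.

```cpp
#include <cstdint>
#include <cstdio>

// 32-bit right rotation: the core primitive in SHA-256's round functions.
// Without a native rotate instruction this is ~3 ALU ops (two shifts and
// an OR); AMD's GCN hardware can do it in a single bit-align/rotate
// instruction, which is one plausible reading of the "3 ops vs 1" claim.
static inline uint32_t rotr32(uint32_t x, unsigned n) {  // n in 1..31
    return (x >> n) | (x << (32u - n));
}

// One of SHA-256's "big sigma" functions: three rotations per call,
// evaluated in every one of the 64 rounds of each hash attempt.
static inline uint32_t big_sigma0(uint32_t x) {
    return rotr32(x, 2) ^ rotr32(x, 13) ^ rotr32(x, 22);
}

int main() {
    std::printf("Sigma0(0x6a09e667) = 0x%08x\n", big_sigma0(0x6a09e667u));
    return 0;
}
```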

http://www.youtube.com/watch?v=6PKxP30WxYM

Here's a demo.

https://teksyndicate.com/users/roronoazoro/blog/2014/01/20/sync-wars-amd-nvidia-and-vesa

Nvidia is just ripping people off. VBLANK is an industry standard feature that was meant to help reduce power usage on laptop screens. G-Sync is basically the same thing, except it offloads the VBLANK calculations to the G-Sync module on the monitor. In a few quarters, high-quality monitors will have this VBLANK capability enabled as an industry standard. And from what Zoltan has said, screen tearing is a non-issue on Linux boxes, so I'm guessing they've already addressed screen tearing at the software level. Makes me question those internet reviewers that are all crazy about G-Sync; makes them seem like uninformed Nvidia shills to me.

The hardware was a 7850K + 290X. They said so on the last slide (i.e. the footnotes).

They couldn't have picked a better configuration to demonstrate it. A weak CPU and a high-end GPU to show the removal of the CPU overhead. That's a very positive thing.

To be fair, a game that truly takes advantage of PhysX's particle effects is beautiful!

show me show me show meeeeeeeeeee

Borderlands 2 does quite a good job

lol, very true.

I've had a bit of a think about it. While it's impressive that this additional performance was extracted from lower-end hardware, this is probably the largest difference we will observe. Pairing a low-end CPU with a high-end GPU will obviously produce the largest performance gain: the CPU is the most inefficient part, not allowing the GPU to flex.

With something like an i5 and a 7790 (just an example), the GPU becomes the limiting factor. While there will surely be a gain, it wouldn't be anything like the demonstration given. I don't know whether we should still expect something sizeable on a much more balanced system, e.g. a 760K and a 7850.
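For anyone who wants the bottleneck spelled out, here's a rough, hypothetical sketch (the DrawApi type and its methods are made up for illustration, not any real API) of why a weak CPU caps a strong GPU under a high-level API, and why a thinner API like Mantle shows its biggest gains in exactly that pairing:

```cpp
#include <cstdio>
#include <vector>

// Hypothetical, simplified types -- not any real graphics API.
struct Mesh { int shaderId; };
struct DrawApi {
    // Stand-ins for the per-call CPU work a high-level API performs:
    // state validation, translation, and submission for every single draw.
    void SetState(int /*shaderId*/) { /* validation + translation cost */ }
    void Draw(const Mesh& /*m*/)    { /* per-call submission cost */ }
};

int main() {
    DrawApi api;
    std::vector<Mesh> scene(10000, Mesh{0});

    // With a high-level API, the CPU pays this overhead 10,000 times per
    // frame, so a slow CPU saturates long before a 290X-class GPU does.
    // Mantle's pitch is to cut that per-call cost so the same CPU can
    // keep a big GPU fed; on a GPU-limited system the gain is smaller.
    for (const Mesh& m : scene) {
        api.SetState(m.shaderId);
        api.Draw(m);
    }
    std::printf("submitted %zu draws\n", scene.size());
    return 0;
}
```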

I doubt that question can be answered right now.

Mantle has apparently been released by now (it seems some driver problems have delayed it, though), and from what's being said the performance increase does seem to be noticeable, especially when comparing hardware of similar relative performance (a 7970 vs a 770, for example). However, this does NOT mean that Mantle will become a new standard API that every developer adopts (because hey, PhysX/G-Sync/ShadowPlay/streaming and a whole bunch of money).

In theory, it should provide a set of capabilities that, say, DirectX simply cannot reach at the moment (DirectX being a high-level API). If game developers were to adopt this API and include it in their games, then yeah, unless NVIDIA does something about it, the advantage would be substantial.

As of now, AMD has a lot of stuff lined up that should (hopefully) set a much higher bar in the competition (dynamic, free vsync; Mantle; the ability to mine Bitcoin/Litecoin/whatever), but to say their products are a no-brainer would be too hasty a statement, as much as I'd like to say it. They've been ripping each other's eyes out for years; I'm pretty sure NVIDIA will respond somehow if Mantle kicks in.

TL;DR: Sorry, but I doubt it.