Star Citizen w/ mantle!

Yeah, PhysX is basically the complete opposite. Given that it implements new gaming content, as opposed to being used to render the game content (which remains the same on all graphics engines), it is relegated to purely aesthetic purposes due to its proprietary nature.

Additionally, the acquisition of PhysX (as opposed to the in-house design of Mantle) has held the industry back, because it has locked out half the user base from a hardware-based physics system designed to implement new content into games. That content has not been implemented on a non-aesthetic basis, because that would mean the approximately half of the user base which doesn't have PhysX wouldn't be able to play it.

Additionally, as KYLE stated, PhysX is being used as an anti-competitive tool to lock out competition. I have seen reviews of some of the games that have PhysX where you see virtually no difference except a drop in frame rate on AMD systems.

Finally, it seems lately that a lot of people here have been bashing AMD in favour of NVIDIA.

I'm sorry... I was having a bad day at work and needed a ciggie... 

I'm not usually that much of a bitch, my apologies. 

Well, I'm a bit of a bitch, but usually in a good way ya' know? 

DON'T THINK YOU NEED TO APOLOGISE AT ALL.

The lack of positive comment does not make someone a fanboy. To be blunt, there's not much positive to say about nVidia, they're a sneaky slimy company that many people are growing tired of. To dislike nVidia is not to be an AMD fanboy, as people dislike nVidia based on things nVidia do themselves.

If that's in response to me saying fanboy, you haven't been in these forums for very long. Unless it is from AMD, some people in this forum don't want to hear about it. AMD deserves praise, and so does Nvidia. So I don't understand much of this hate surrounding Nvidia when AMD are fucking pathetic at times. The number of apologists following the reference-locked 290x launch was stupendous. If you think a company can do no wrong, you are a fanboy.

But that's an observation of mine. Nothing more. I just have grounds to say it whenever I happen to do so.


Can you point out where I said anything you've suggested I've said? You seem to be under the incorrect impression that being neutral means giving both equal praise. That's not really how it works; you give praise or criticism in response to actions taken, and the way it is now, and has been for a long time, is that nVidia's actions attract more criticism than praise because of the slimy anti-consumer and anti-competitive things they get up to. If you knew the etymology behind nVidia's name, you'd probably understand why they are like this a bit better.

Just because nVidia do a load of crap doesn't mean AMD do no wrong; it means nVidia do more wrong and thus get a lot more attention. And if any company deserves the title of pathetic, it's nVidia, because pathetic is what they do best. I'd say the same about nVidia regardless, because nVidia do this sort of stuff because they're nVidia. I don't see nVidia the way I do because of AMD.

I stopped supporting nVidia years ago because I was getting increasingly tired of all the underhanded, dodgy and slimy things they had involvement in, so tell me more about how that has anything to do with AMD as I'd love to know.


I did ask if it was in response. Since I couldn't tell by the tiered replies.

I agree. I understand why AMD is preferred by many. Nvidia can be bullies, in my view. It wouldn't stop me from using them, if they were much more suitable for my needs. While I don't have any need for G-sync, I still think it deserves some praise.

One thing that AMD has done wrong? Producing false information. The speeds reported for the R9 cards were incorrect. Like the article posted about AMD trolling Nvidia's event, if you recall? While we all laughed, it was pretty underhanded. It later emerged that the 780 was faster after overclocking. They could have done anything to make that bench an unfair test.

So pulling the wool over everyone's eyes is seemingly more acceptable than Nvidia trying to do the same thing? Bullshit.


I did ask if it was in response. Since I couldn't tell by the tiered replies.

I agree. I understand why AMD is preferred by many. Nvidia can be bullies, in my view. It wouldn't stop me from using them, if they were much more suitable for my needs. While I don't have any need for G-sync, I still think it deserves some praise.

I don't use them because, to me, using them is supporting their actions. So I made a choice a number of years ago not to support nVidia.

One thing that AMD has done wrong? Producing false information. The speeds reported for the R9 cards were incorrect.

I'm not entirely sure what you mean. Are you talking about the Tom's Hardware article where they're saying the core speed is 700-odd MHz? Wasn't that also shown to be because of a faulty card?

Regardless, I'm not a fan of the boosting thing that AMD and nVidia are doing with the cards; I prefer a set clock speed.

Like the article posted about AMD trolling Nvidia's event, if you recall? While we all laughed, it was pretty underhanded.

Whilst I understand the hypocrisy of it, to be honest I have a hard time caring; nVidia have been doing much worse for so long that I find it hard to feel remotely sympathetic.

Plus, the stuff I'm talking about with regards to nVidia is much worse: the way they use PhysX and pay developers off, the rubbish they spread and the lies they tell, the way they have paid off numerous developers to try and push down AMD performance.

The whole "AMD BAD DRIVERS" thing is spouted from nVidia as well through nVidia's focus group programme, seriously, the slightest bit of research will show you that nVidia are the ones who actually have had the serious driver problems over the years. I'm not saying AMD doesn't have any, all companies do, but considering the reputation some people give AMD over drivers, it's just ridiculous. No AMD driver has actually killed graphics cards like nVidia ones have, multiple times, upon release of newer graphics cards.

The Origin PC thing. Batman: AA (the anti-aliasing code that, in basic terms, had a clause: disable AA if a non-nVidia card is detected). Crysis 2 (tessellated water being constantly rendered regardless of where you were, to drag down everyone's performance, just AMD's more). Assassin's Creed (DX10.1 support being removed after it gave a performance boost to 10.1 cards; nVidia didn't have any 10.1 cards out). As above, PhysX, the way it drags everyone's performance down with effects that don't need to be done on the GPU, as well as nVidia's driver clause of "disable hardware PhysX if an AMD GPU is detected". The way they have compromised FP64 performance in their latest cards to artificially increase the perceived worth of their compute-orientated cards, or the way they are selling Titans for $1000 despite the fact that a Titan doesn't cost much more than a GTX 580 to produce. Just to name a handful of the underhanded things nVidia has done, not at the expense of their competition, but at the expense of everyone.

It later emerged that the 780 was faster after overclocking. They could have done anything to make that bench an unfair test.

This is one of those things where it depends where you go to read your news, but that aside, who in their right mind trusts reviews or benchmarks from manufacturers? They're only going to show their hardware in the best possible light, which means they're not to be taken seriously.

That aside, isn't the overclocking thing an issue of the stock/reference cooler being rubbish rather than the cards themselves?

I admit I'm not fussed that they made them with crap coolers, but only because I watercool my graphics cards, so I will go for the cheapest card regardless of cooler. However, weren't the benchmarks done with the 290X versus custom-cooled 780s?

Of course, I do think they should have made better cooling units; however, when you take price into account, it's not the biggest issue, and those who are interested should just wait until custom cooling units are available.


I didn't expect such a long response. Forgive some of my skim reading. I will try to answer the quoted points; anything after that I might miss.

It's your choice if you don't want to buy Nvidia. I only ever recommend AMD cards up to a certain price point, up to the 280x or maybe the 290. After that, I really think Nvidia has the more complete package. Certainly in the UK, where you can buy a 780 for less than a 290x.

This is the AMD trolling Nvidia's event article:

https://teksyndicate.com/news/2013/10/17/amd-trolls-shows-r9-290x-vs-gtx-780-nvidias-event

That's where they reported the R9 290x as being significantly faster than the 780.

Here are some benchmarks from a reliable source. I never use Tom's Hardware:

http://youtu.be/djvZaHHU4I8?t=8m42s

As for GPU vs CPU PhysX, here is a thread related to AMD vs Nvidia.

https://teksyndicate.com/forum/gpu/what-have-nvidia-actually-been-doing-rant-thread-about-people/157027

I actually argued in favour of AMD. Thanos makes a very strong argument against the notion that Nvidia happens to be anti-competitive and closed-source on this particular issue. It just so happens that developers are not using resources to implement CPU PhysX into their games. So it lands on the developers more than Nvidia themselves.

All the same, you're going to have a hard time proving that AMD doesn't do many of these things. We don't know what goes on behind closed doors. Origin PC was speculative. We don't know if Nvidia did pay Origin PC to drop AMD's cards.

I'm not saying that I agree with the closed standards that Nvidia uses. But utilising PhysX on their cards is not anti-competitive. They just don't bother to optimise their standard for use on CPUs.

I'm going to throw this out there.
"Q. How do you convince developers to use Mantle when other APIs are vendor neutral?

A. We haven't had to convince them. Every single one of them has come to us and asked for it without prompting!"

http://www.tomshardware.com/reviews/amd-ama-toms-hardware,3672-7.html


I don't think the utilisation on its own is anti-competitive; what is anti-competitive is how they implement it.

Also, nVidia has restricted CPU PhysX to x87, as well as purposefully omitting some instruction sets, to ensure it runs like arse. This is what I mean when I say they are using it anti-competitively: they have it set up to kill performance for everyone, just not as much if you have an nVidia GPU.

It's not a case that they're simply not optimising for CPU performance; they're doing it on purpose so that they can exaggerate the performance difference between physics running on a CPU and on a GPU. It is entirely intentional.
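
Just to give a rough idea of what the x87 versus SSE difference means in practice, here's a made-up sketch (my own illustration, not PhysX's actual code) of the same particle-integration step written both ways. The scalar loop is roughly what you're stuck with if a library is built for the legacy x87 FPU; the SSE version moves four floats per instruction:

#include <xmmintrin.h>   // SSE intrinsics

// Scalar path: one float per loop iteration. If the physics library is
// compiled for the legacy x87 FPU, this is about the throughput you get.
void integrate_scalar(float* pos, const float* vel, float dt, int n)
{
    for (int i = 0; i < n; ++i)
        pos[i] += vel[i] * dt;
}

// SSE path: four floats per iteration (assumes n is a multiple of 4 and
// the buffers are 16-byte aligned).
void integrate_sse(float* pos, const float* vel, float dt, int n)
{
    const __m128 d = _mm_set1_ps(dt);
    for (int i = 0; i < n; i += 4) {
        __m128 p = _mm_load_ps(pos + i);
        __m128 v = _mm_load_ps(vel + i);
        _mm_store_ps(pos + i, _mm_add_ps(p, _mm_mul_ps(v, d)));
    }
}

Same maths, but the vectorised loop does a quarter of the iterations, which is the kind of headroom people are talking about when they complain that CPU PhysX ignores SSE.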

PhysX being closed/proprietary isn't really the issue here; that is an issue with regards to developers taking it seriously and making use of it, something that hasn't happened and won't, because they are keeping it closed for purely marketing purposes.

You also seemed to skirt around all the other anti-competitive things I pointed out.

It's not a case that I can't prove AMD doesn't do them, as I don't actually need to. We don't have situations where games run poorly on nVidia hardware for no obvious reason, and no such things have been found to be implemented (i.e. if nVidia hardware is detected, disable certain features).

Sure, Origin PC was speculative in terms of no concrete proof, though there was an article about it saying the acting CEO got reamed for it, and they had reached out to other companies, but Origin was the only one that took them up on the offer.

I don't understand what the argument is, because that isn't anti-competitive. While PhysX could run on AMD GPUs, there's absolutely no reason why Nvidia need to allow that. It's their own innovation. That's called being competitive, not anti-competitive.

I didn't skirt around it. Rather, I provided quite a lengthy thread on the topic. You can see my thoughts there, and Thanos managed to provide sufficient evidence that CPU PhysX is not nerfed.

To me, this all looks like common business practice, not anti-competitive behaviour. There was a strong belief that physics would run better on CPUs (SSE, regardless of missing instruction sets), and that was sort of proven false. Again, in the thread provided.

There was nothing stopping AMD having their own solution, and TressFX emerged.

They are both guilty of making their own company look more favourable than the other. AMD secured the deal with DICE to make BF4 an AMD-optimised game. We are all supposed to believe that DICE begged AMD to help with the development of Mantle, ignoring the $8 million AMD used to secure what was originally an Nvidia-optimised game.

I think the only real hate people can honestly feel towards Nvidia is over their price-to-performance. But if people are happy to pay the price, that's not anti-competitive.

I used AMD up until now. My favourite game runs best on Nvidia. Nvidia had the only single-GPU 1440p solution that I was happy with. It's preference, at the end of the day.

This is a case of company image, rather than any proven guilt.

I do now understand your point: that hating one doesn't make you a fanboy of the other. But I didn't know if you were replying to one of my posts in that stack of replies.

Yep, because game makers producing big-budget triple-A titles that push graphical limits want their games to run well. I think this will become standard for triple-A games. It wasn't that long ago that games were made with OpenGL and DirectX support, and nowadays you have engines like CryEngine that allow you to simultaneously make a game for three platforms. I don't think it will be such a challenge to implement vendor-specific low-level APIs in addition.
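
To illustrate why that isn't such a big ask, here's a rough, hypothetical sketch (made-up names, not CryEngine's actual code) of the abstraction these engines already rely on: the game only ever talks to one rendering interface, and DirectX, OpenGL or Mantle is just another backend plugged in behind it.

#include <memory>
#include <string>

struct DrawCall { /* mesh, material, transform, ... */ };

// The game/engine code only ever talks to this interface.
class RenderBackend {
public:
    virtual ~RenderBackend() = default;
    virtual void submit(const DrawCall& dc) = 0;
    virtual void present() = 0;
};

// Each graphics API is just another implementation behind the interface.
class D3DBackend : public RenderBackend {
public:
    void submit(const DrawCall&) override { /* Direct3D path */ }
    void present() override {}
};

class MantleBackend : public RenderBackend {
public:
    void submit(const DrawCall&) override { /* vendor low-level path */ }
    void present() override {}
};

// Pick a backend at startup; supporting Mantle means writing one more
// class like the above, not rewriting the game.
std::unique_ptr<RenderBackend> makeBackend(const std::string& api)
{
    if (api == "mantle") return std::make_unique<MantleBackend>();
    return std::make_unique<D3DBackend>();
}

Once the game is written against that interface, a vendor-specific API is one more backend, not a rewrite.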

www.youtube.com/watch?v=2Y-vKJ5-6KM