Best GPU for under $300

Right now I'm trying to decide which GPU to buy. I'm not sure if I want to get an AMD 7870, a lower-end 7950 (going like $10 over is no problem), or a GTX 660 Ti. I would pretty much only be using it for gaming and maybe a little editing, but gaming would be my main priority.

To me, it seems like the biggest difference between the two camps is support for either PhysX or OpenCL. I would base my decision on that, and perhaps on which brand (Nvidia/AMD) you prefer.

For me, EVGA is the go-to brand for Nvidia. They have great customer service, plus a nice step-up program if you decide you want a better video card within 90 days of your purchase.

http://teksyndicate.com/reviews/2012/09/24/his-radeon-7870-iceq-ghz-edition-review

Logan already has the answer for you.

7950:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202006

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150616 (After rebate)

 

7870:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127681&Tpk=7870%20HAWK

 

660 Ti:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127696

 

IMO, you can't go wrong with any of these four cards.

 

Of course, the HIS 7870 is a very good deal for what you get, but if you're looking for the best card under $300, I like the GHz edition MSi 7870 more. I have seen some benchmarks where that 7870 is right up there with the 7950.

 

Ugh...you guys are making it so that I have to spurt my knowledge all over your faces.

PhysX is a load of crap that is supported in very few games and is more of a marketing gimmick than anything else.

As far as I know, OpenCL only has its advantages in certain software for video rendering, 3D modeling, etc. The same goes for Nvidia's CUDA technology. Those technologies have nothing to do with games.

Don't base your decision on a recent video when the original poster clearly said he can get two cards that are better than the 7870.

Always get the card with the best performance:

http://www.anandtech.com/bench/Product/550?vs=647

The 7950 and 660 Ti trade blows depending on which game they're running. Both clearly outperform the 7870, so there's no reason to even consider that card here.

Take either the 7950 or the 660 Ti. See which one performs better in the games you're going to be playing.

I would probably go with the 660 Ti, because it performs better in most games, and definitely because it performs better in Battlefield 3. The 7950 does perform better in Metro 2033, though, which has me scratching my head a little bit. I guess Metro just runs better on AMD cards?

I'll be laughing at you in two years when I'm playing all my games with PhysX...

I have an HD 6870 in my system, with my old GTX 260 acting as a standalone PhysX card (you can look up how to do this online; it's not too hard). I did it simply because I wasn't able to sell the GTX 260 when I got the HD 6870, so I thought it would be cool to have it. Nothing really takes advantage of it, though; the only game I have with PhysX is Batman, and I haven't actually gotten around to playing it.

That's what they said two years ago...and two years before that...and two years before that...and...

I favor the 7950 because, with an overclock, it is about the same speed as the 660 Ti... and I like the image quality a lot. Also, with the latest drivers, a few games are much faster: Skyrim is now running better on AMD, as is Metro... Oh, and Metro is insane on AMD cards. That 7870 I tested last week beat the GTX 680... which hurts my head.

The 660 Ti is still an awesome card. I would prefer it for Adobe CS6, Crysis, and a couple of other games.

And yes, PhysX is a joke marketing scheme and looks like ass anyway.

The Galaxy 660 Ti 3GB version is $299 on Amazon right now. I checked its reviews on Newegg; it seems like a pretty awesome GPU.

The reference EVGA 670 dropped to $350 if you want to spend a little more.

@Scraps, until next-gen consoles come out, most things like PhysX will be held back (imo... just imo). I think we'll see a big jump here once the 720/PS4 release.

Most games are about 85% GPU dependent, so why waste precious GPU resources on work that could be done with the 85% of your CPU power that's not being used in the game? If the CPU handled more of the physics and whatnot, then chips like the FX-8150 might be a bit more relevant to games, since multithreaded games would be more prominent, instead of a $65 Intel dual core getting higher FPS.

 

You know, back in the day it was the GPU assisting the CPU, not the other way around. That's why I think there should be a way to allocate your unused CPU power to software rendering to help out the GPU, since games are more and more graphics-driven.