What is the best GPU for Machine Learning for less than £2500($3100)?

Hi all,

I’m a PhD student looking for a GPU to use for ML. What are the best options?

Here are some I’ve found that best fit my needs:

RTX 2080 Ti with 11 GB, 4352 CUDA cores
Quadro P5000 with 16 GB, 2560 CUDA cores
Titan RTX with 24 GB, 4608 CUDA cores, 576 Tensor cores
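For a rough sense of how those CUDA core counts translate into raw compute, here's a quick back-of-the-envelope comparison. It uses the published reference boost clocks and the usual FP32 estimate (2 ops per core per cycle for a fused multiply-add); real-world clocks and throughput will vary.

```python
# Rough peak FP32 throughput for the three candidate cards.
# TFLOPS ≈ 2 (FMA ops) × CUDA cores × boost clock.
cards = {
    "RTX 2080 Ti": (4352, 1.545e9),  # cores, reference boost clock (Hz)
    "Quadro P5000": (2560, 1.733e9),
    "Titan RTX": (4608, 1.770e9),
}

for name, (cores, clock) in cards.items():
    tflops = 2 * cores * clock / 1e12
    print(f"{name}: ~{tflops:.1f} TFLOPS FP32")
```

This puts the P5000 well behind the two Turing cards on raw compute, before even considering its lack of Tensor cores.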

Thank you in advance!!

The RTX 2080 Ti has 544 Tensor cores and the P5000 has 384, FWIW.

IDK if you’ve read this?

Don’t buy any of those. The next-gen GPUs are right around the corner, and they should give you a significant boost in ML tasks.


@Buffy I didn’t, thank you, that’s good to know.


But if you are in dire need of one now, you can check the used market for older 2070s and 2080s. If you are seriously budget-constrained, you could wait until the Ampere launch and then pick up a used 2080 Ti when people upgrade.

You could get 2x B-stock 2080 Tis direct from EVGA for less than your budget, and they’d still have a warranty.

I would absolutely suggest you hold off until next gen. If you’re doing ML and TensorFlow, the 3080 and its rumored Tensor core count would be amazing, but Navi is coming and we don’t know whether AMD is actually going to try harder with ROCm and OpenCL or not.

Would be good to let the dust settle. If you absolutely must buy now, I’d probably go 2080 Ti with your budget.


Not sure if this is going to be useful a month later, but here goes. :smiley:

I’d be leaning towards the Titans in order to get the bigger frame buffer, but that depends on your models. If you’re working with a lot of fully connected networks, then one badass GPU will probably be more efficient since they generally don’t parallelize as well. Otherwise, dual 2080 Tis would fit in your budget, provide about the same amount of total memory, and probably better performance as well.
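The "depends on your models" part is easy to sanity-check with arithmetic: parameter memory plus optimizer state often decides whether 11 GB is enough. A minimal sketch, assuming FP32 weights and Adam (weights + gradients + two moment buffers ≈ 4 copies) and ignoring activations; the layer sizes here are made up purely for illustration:

```python
# Hypothetical wide MLP: 48 hidden layers of 4096x4096 plus a 1000-way output.
layers = [(4096, 4096)] * 48 + [(4096, 1000)]

# Parameters per layer = weights (in*out) + biases (out).
params = sum(n_in * n_out + n_out for n_in, n_out in layers)

# 4 bytes per float32, ~4 copies under Adam (weights, grads, 2 moments).
gib = params * 4 * 4 / 2**30

print(f"{params / 1e6:.0f}M params -> ~{gib:.1f} GiB before activations")
print("Fits in 11 GiB (2080 Ti)?", gib < 11)
print("Fits in 24 GiB (Titan RTX)?", gib < 24)
```

A model like this lands around 12 GiB of optimizer state alone, which already rules out a single 11 GB card but fits comfortably on the Titan RTX, which is exactly the situation where the bigger frame buffer wins.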

Whether you should wait for the next round of cards or not depends entirely on how quickly you need to be up and running. I generally don’t recommend waiting on future hardware, but ML loves compute to such a degree that it can make sense to wait, because it might actually make a difference.


This topic was automatically closed 273 days after the last reply. New replies are no longer allowed.