Gaming on a 13 year old PC

I recently dug an old PC out of my scrap heap to play with some old PCI sound cards I found at a local used computer shop (that’s a story for another thread). I decided to see what I could do to make it capable of playing modern games. And the result was surprisingly playable…

Base PC: Dell Precision T5500
CPU: Xeon E5620 (2.4 GHz)
RAM: 12GB DDR3-1333
GPU: Nvidia Quadro FX 3800 (seems about equivalent to a GTX 260)

CPU daughterboard (to upgrade to dual Xeons)
Two X5687 CPUs (X5690s have more cores but are stupid expensive; X5687s also have slightly higher single-core clocks)
MSI “Air Boost” Vega 56 (from my spare parts)
Added 12GB of DDR3-1333 (also from spare parts; 24GB total)
Corsair RMS 850M PSU (stock PSU died a loud and painful death)

Preliminary results:

Cyberpunk 2077 on ultra gets roughly 45 fps @ 1080p. Actually, the High preset is about identical. The magic happens when you go to the “Gameplay” settings, reduce crowd density to Medium, and set HDD mode to “On.” This costs some detail, but boosts fps into the 80s. That makes sense: the CPUs are fairly weak compared to a modern machine, and giving them less to process boosts framerates.

Downside: Power consumption is stupid high. Those two X5687s have a thermal output of 130 watts each. And a Vega 56 doesn’t exactly sip power either.
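Back-of-the-envelope on that power draw (a rough sketch: TDP is a nominal vendor figure, not measured draw, and the Vega 56 board-power number and platform allowance below are assumptions on my part):

```python
# Rough load-power estimate for the dual-X5687 + Vega 56 rig.
# TDP figures are nominal vendor numbers, not measured wall draw.
cpu_tdp_w = 130      # per Xeon X5687 (130 W TDP each)
gpu_tdp_w = 210      # assumed Vega 56 reference board power
platform_w = 80      # hypothetical allowance for board/RAM/drives/fans

total_w = 2 * cpu_tdp_w + gpu_tdp_w + platform_w
print(f"~{total_w} W under combined load")  # ~550 W
```

Which also explains why an 850 W replacement PSU is comfortable headroom rather than overkill here.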

Anybody else try to turn ancient hardware into a gaming machine?


I originally played Cyberpunk 2077 on an old Nehalem i7-920 with an HD 7970 GPU: ~30 fps in open spaces, and down to 15 fps in Night City - totally playable. :confused: Other titles like Doom Eternal worked like a charm (with settings lowered enough). Not bad for a bunch of free parts cobbled together from friends. All the good components went into my Linux workstation instead, which with GPU passthrough has made my old rig redundant.

While the X5687 shares the same architecture and core count as the i7-920, it’s clocked roughly 1 GHz higher. And there are two of them in this particular rig, doubling the core and thread count. That’s probably why I can get 45 fps in crowded areas compared to your 15 fps (well, that and the Vega).

This was just an experiment to see what this old hardware could do, and it turned out better than I thought. I do have an old 7950 laying about that I could experiment with as well.

That’s awesome. I do have a retro PC rig I use, but it’s a Dell with a 200 MHz Pentium and a Radeon 9250… the kids love Magic School Bus and Encarta 98 :smiley:


Did run an X5670 / X5672 for a bit, before the mainboard soiled the bed [unceremoniously]
Initially used W10 for some component assessing [GPUs]; prominently used XP tho
Majority of games didn’t involve Steam [Steam briefly worked, before doing eff all]

Package was replaced with an i7-4790… + yes, a major bump, thanks to 3.6GHz


For laughs I kept using a Core 2 Quad Q6600 paired with a GeForce GTX 460 to test games out. It was tempting to drop in a GTX 1050 Ti, but buying another GPU was unrealistic. I later dropped in a GT 710 with GDDR5 to see how much of a difference it would make; interestingly enough, the drop in performance wasn’t as bad as I expected. (I got the 710 more to shave off power usage for non-gaming testing.)


Wouldn’t the lack of instruction set extensions such as AVX2 be a problem in certain titles? I gave my mom my old Sandy Bridge PC with an i3, but I might just use it for hosting servers or playing lighter-weight games instead.


One of two things will happen without those instruction sets, depending on the game:
it’ll run, but in a poor, unoptimized-feeling state… or it’ll just laugh at you


Instruction sets like AVX are most useful for SIMD processing (applying the same instruction to lots of data). In most games, the SIMD-heavy work is mainly graphics - and that’s already handled by the GPU. As the 5800X3D benchmarks showed, many games are simply limited by memory access speed.
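If you’re wondering whether an older chip will even launch an AVX-required title, one quick check on Linux is to read the `flags` line from `/proc/cpuinfo` (x86 format assumed; the helper below is just a sketch - Westmere-era Xeons like the X5687 will show `sse4_2` but not `avx`):

```python
import os

def cpu_flags(path="/proc/cpuinfo"):
    """Return the set of CPU feature flags the kernel reports (Linux/x86)."""
    flags = set()
    if not os.path.exists(path):
        return flags
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                flags = set(line.split(":", 1)[1].split())
                break
    return flags

flags = cpu_flags()
# AVX-required titles need a patch/mod on CPUs that lack "avx" here.
for ext in ("sse4_2", "avx", "avx2"):
    print(f"{ext}: {'yes' if ext in flags else 'no'}")
```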

Take Cyberpunk 2077, for example: it shipped with AVX required, but with a mod to remove the requirement it still ran decently on my Nehalem i7.


There is a similar problem with Mass Effect Legendary Edition. The launcher (not the game itself, mind you) shipped with an AVX requirement. Fortunately, there was a mod for that, and the game was launchable with a third-party launcher, though it wasn’t, um, a sterling example of UI design.

Oooooh okay.

That reminds me of when Swiftshader was a thing and I tried my hardest to get GTA working with it, since my GPU didn’t support Shader Model 3. The first BioShock had something like that too, which would downgrade the game and make it run on lower-end hardware. Great times.


Hell, between the time I sold my ‘old’ Broadwell-E platform and bought my current Comet Lake one, I reassembled and used an old AMD Phenom II X4 965 and bought a GTX 580 for it for like $40.

No, it wasn’t going to play anything in VR or ultra-maxed at 4K, but all of the popular titles I play on a semi-regular basis (League, WarThunder, Dwarf Fortress, Bannerlord, Black Desert) ran acceptably well at 1080p. Sure, I had the details cranked down to ~medium in anything that wasn’t an easy-to-run e-sports title, and AA in any form got axed, but as long as a new title doesn’t have some hard requirement for AVX/etc., I think people really underestimate what older systems are still capable of.


Still have to crank down to medium-ish with my current setup in WT.

That’s very much true; I saw it a lot on my old PC with the games I played. And yeah, it makes sense that there might be hard limits in some cases and not others.


Best extreme I’ve seen was with Warhammer 40K: Space Marine
On XP, it was hardlocked to 30fps, no matter how built your rig was or what settings you set
Yet on anything W7+, it’d run at 60fps with ease [turned out to be a DirectX affiliation]
…+ offered up some additional visuals


Curious about your Haswell upgrade… did you keep the same GPU you used with the X5670? Did you run any before/after benchmarks?

It was a zero diff in that title when moving from the X5670 [2.9GHz] to the X5672 [3.2GHz]
All the other games I was throwing at it saw various fps gains,
whether it was the average and/or the minimum that got the bigger performance bump

One other game I can recall having a fixed fps limit was BioShock 2 [60]
… The move from 2.9 to 3.2 made that 60 basically a static sprite