Well, looks like no one chimed in with a proper technical answer regarding the specs..
Performance-wise, the 965M runs circles around the 960M. I've included the GTX 970M as well, since that was mentioned.
Look at the core configurations (shaders : texture mapping units : render output units): the 960M is 640:40:16, the 965M is 1024:64:32 and the 970M is 1280:80:48.
So the 965M has 60% more shaders/TMUs and double the ROPs of the 960M.
The 970M has double the shaders/TMUs and triple the ROPs, but is probably a bit of an overkill.
If you want to use desktop SKUs for comparison purposes:
960M is basically a GTX 750 Ti
965M is basically a GTX 960 (or a GTX 980 chopped in half)
970M doesn't have an equivalent desktop SKU right now, but think of it as a 750 Ti *2 :D
The desktop GTX 960 of course has higher clock speeds, but you can easily adjust for that. It also has a hell of a lot bigger power budget than the mobile version, mind you.
VRAM? Oh hell, that's a bit of a tough one, as it really depends on the engine and how much stuff it loads into memory.
Just displaying 2D images at 60 fps in 32-bit colour?
(1080*1900) + ((1080*1200)*2) = 4,644,000 pixels
4,644,000 * 32 = 148,608,000 bits per frame
148,608,000 / 8 = 18,576,000 bytes
= 17.715 MB per frame * 60 fps = 1062.92 MB/s / 1024 = ~1 GB/s.
Note that's roughly 1 GB of pixel data pushed per second, not 1 GB of VRAM needed; a single frame across all three panels is only ~18 MB.
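That calculation can be sketched as a few lines of Python; the panel resolutions (one 1900x1080 plus two 1200x1080) are just the ones from the figures above, so swap in your actual displays:

```python
# Back-of-the-envelope framebuffer maths for a triple-display setup:
# one 1900x1080 panel plus two 1200x1080 panels, 32-bit colour, 60 fps.

def framebuffer_mb(width_px, height_px, bits_per_pixel=32):
    """Size of a single frame in mebibytes."""
    bits = width_px * height_px * bits_per_pixel
    return bits / 8 / 1024 / 1024

frame_mb = framebuffer_mb(1900, 1080) + 2 * framebuffer_mb(1200, 1080)
bandwidth_gb_s = frame_mb * 60 / 1024   # pushing 60 full frames per second

print(f"one frame across all panels: {frame_mb:.2f} MiB")
print(f"at 60 fps: {bandwidth_gb_s:.2f} GiB/s of raw pixel traffic")
```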
But I don't have a proper answer, as I'm not 100% familiar with the workload or, in general, how much number-crunching power you'd need.
You mentioned that you had an HP DV6 before? What specs did it have?
I looked at the service manual and apparently the most powerful GPU it could have had, with 1024MB of VRAM would have been the HD 6770M.
It had a 480:24:8 core config, so only 8 ROPs, which isn't all that much. Its desktop variant would be the HD 6570.
That would have 696 GFLOPS of raw, single-precision number-crunching power, which means the 960M has almost double that (1317 GFLOPS).
This is of course just GFLOPS (how many floating-point operations the GPU can do in one second), but it's at least some sort of a measuring stick.
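The GFLOPS figures fall straight out of the spec sheets: shaders x 2 ops per clock (multiply-add) x core clock. The clocks below are the base/reference clocks for each part; boost clocks would land somewhat higher:

```python
# Single-precision GFLOPS estimate: shaders * 2 (FMA) * clock in MHz / 1000.

def gflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1000

print(gflops(480, 725))    # HD 6770M: 480 shaders @ 725 MHz  -> 696.0
print(gflops(640, 1029))   # GTX 960M: 640 shaders @ 1029 MHz -> ~1317
```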
Pixel fillrate is another metric you could look at: the HD 6770M does 5.8 GP/s compared to 16.5 GP/s for the 960M. It's basically how many pixels the GPU can vomit out per second; GP is a gigapixel, or 1000 megapixels, or one billion (10^9) pixels. Your 2x1080x1200 + 1080x1900 setup is 4.7 Mpix, and at 60 fps that's 282 Mpix/s, which is not that big of a deal..
Texture fillrate is the last one: the HD 6770M does 17.4 GT/s and the 960M does 41.2 GT/s. GT in this context means gigatexels, a texel being a texture element (texture pixel); it's roughly how many texture samples the GPU can fetch and filter per second, though I'll admit this one is fuzzy to me.
This last half of my post is way outside my comfort zone, as I don't know crap about graphics and how much performance you'd really need, or polygons, or magic pixies..........
But it's probably safe to say that the 960M could be just fine for what you need; the 965M would of course be better, especially if you want to ensure smoothness.
Could you run some performance tests on other hardware you have at hand?