Use a Workstation and gaming GPU at the same time in SLI/Crossfire?

I have been having trouble finding a definitive answer to this.

I am planning on building a PC. While I'm not a gamer, I'd like to get into it a little bit.

I am going to college to become a Digital Animator/Illustrator and was looking at the ASUS PA279Q due to the 10-bit color and accuracy.

From what little I know about workstation cards, I HAVE to have one to get 10-bit color.

Now, I want to possibly use a more powerful gaming card and just have a lower-end workstation card to get the color output.

-----

Now, is this possible with either AMD or NVIDIA? I know you can mix NVIDIA cards in applications like Vegas and Adobe Premiere, but I don't know if that's the same with gaming.

I will be getting a 760 (or higher) card due to the price-to-performance ratio, if that matters.


Thank you in advance for reading this. I really do appreciate any help.

I have no idea what you mean by 10-bit color. As far as I know, even integrated graphics have supported 24- to 32-bit color for years. https://en.wikipedia.org/wiki/Color_depth

Now, I did look up the monitor, and judging from the marketing material, the 10-bit color business is actually in the monitor hardware/firmware itself, and is NOT GPU dependent. Basically various calibrations. But that's just an educated guess on my part.

Now, I would invest in a workstation GPU just for the nature of the work, but I'm not sure if you can dynamically switch the primary GPU based on the application. I suspect that if you had a 760 as a secondary GPU, you could use it to render your games, but I'm not sure.

Also, SLI/Crossfire is a technology that allows two similar/identical GPUs to work together on one task. What you're asking is whether you can have two GPUs work independently of each other on different tasks. Again, I suspect it's possible, but I haven't had two mismatched GPUs in my system long enough to play around with it.

My understanding of color production: your video card is capable of producing deep color. This has been possible for some time, since color/content is produced on computers nowadays rather than with the oversized color rooms and older tools of the pre-computer press days.

So, the video card isn't your limitation; it is capable of producing deep color. Read the GTX 760 user guide, which specifies this:

http://www.nvidia.com/content/geforce-gtx/GTX_760_User_Guide.pdf

Page 28 lists the HDMI specifications. Dual-Link DVI is capable of deep color as well.
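To put "deep color" in perspective, here's a quick back-of-the-envelope calculation (plain Python; nothing assumed beyond the per-channel bit depths already discussed) showing how the channel bit depth multiplies out into the total number of displayable colors:

```python
# Total displayable colors for a given bit depth per channel (R, G, B).
def total_colors(bits_per_channel: int) -> int:
    return 2 ** (bits_per_channel * 3)

standard = total_colors(8)   # "24-bit" / True Color
deep = total_colors(10)      # "30-bit" / deep color

print(f"8-bit per channel:  {standard:,} colors")   # 16,777,216
print(f"10-bit per channel: {deep:,} colors")       # 1,073,741,824
```

So going from 8 to 10 bits per channel is a 64x jump in the number of representable colors, which is why banding in smooth gradients is the usual symptom of an 8-bit pipeline.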


On to a very layman's-terms website, but one that spells things out decently for "us less educated users (not producers)":

http://www.soundandvision.com/content/xvycc-and-deep-color

Note: that article is seven years old, almost to the day.

It comes down to the weakest link in the chain. If you're producing content, then deep color reproduction is important. But if you're producing content in a world that still has mostly 8-bit color panels, where Blu-ray is still using 8-bit, then you're producing colors that won't be seen by the large majority of users. Content consumption is changing, standards are changing, technology changes: 4K, better panels, etc.

Quoting from the Blu-ray Wikipedia article:

For video, all players are required to support H.262/MPEG-2 Part 2, H.264/MPEG-4 Part 10: AVC, and SMPTE VC-1.[128] BD-ROM titles with video must store video using one of the three mandatory formats; multiple formats on a single title are allowed. Blu-ray Disc supports video with a bit depth of 8-bits per color YCbCr with 4:2:0 chroma subsampling.[129][130]
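To illustrate what that quoted spec means in raw storage terms, here's a rough sketch (plain Python; the 1080p frame size and the 10-bit 4:4:4 comparison point are my own illustrative choices, not from the quote). 4:2:0 subsampling stores the luma (Y) plane at full resolution but each chroma plane (Cb, Cr) at quarter resolution:

```python
# Uncompressed bytes per video frame for a given chroma subsampling and bit depth.
def frame_bytes(width: int, height: int, bits: int, subsampling: str = "4:2:0") -> int:
    luma_samples = width * height
    if subsampling == "4:2:0":
        # Two chroma planes, each at half resolution in both dimensions.
        chroma_samples = 2 * (width // 2) * (height // 2)
    elif subsampling == "4:4:4":
        # Two chroma planes at full resolution.
        chroma_samples = 2 * width * height
    else:
        raise ValueError(f"unsupported subsampling: {subsampling}")
    return (luma_samples + chroma_samples) * bits // 8

bluray = frame_bytes(1920, 1080, 8, "4:2:0")   # Blu-ray's mandated format
deep = frame_bytes(1920, 1080, 10, "4:4:4")    # full-resolution 10-bit alternative

print(f"8-bit 4:2:0 frame:  {bluray:,} bytes")  # 3,110,400
print(f"10-bit 4:4:4 frame: {deep:,} bytes")    # 7,776,000
```

The 2.5x size difference per uncompressed frame is part of why distribution formats settled on 8-bit 4:2:0 as the baseline.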

Future-proofing yourself? Perhaps. This industry is changing quickly as content production changes. No longer is content ONLY produced by the big production houses. So being visually appealing and innovative will go a long way.


In the end: as a non-professional I might be wrong, but I don't think the color bottleneck will be your video card. It will be the panel that displays your content to the person viewing it.

NVIDIA GeForce graphics cards have offered 10-bit per color output to a full-screen DirectX surface since the GeForce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used by professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers, which require an NVIDIA Quadro GPU with a DisplayPort connector.

._.


I didn't expect to get a response and neglected to check this!

Okay, thank you all for helping me with this... I see no need to waste money on the workstation card now.