I'm looking to buy a new monitor. I'm by no means a photographer and I don't edit pictures or videos, but I do prefer to see nice images and play games at 60 Hz on an IPS panel rather than 144 Hz on TN. What I'd basically like to know is: would there be a big difference in image quality between a 4K IPS panel with 8-bit color and a 4K IPS panel with 10-bit color? I'm confused and not really educated in this, but I do like to see great quality pictures and colors.
In order for you to see any difference, the displayed image itself needs to be encoded in 10-bit (or more).
At this point virtually all images and videos are 8-bit. I don't think many (any?) games can output 10-bit either, so unless you are creating pictures yourself there's not much point going for 10-bit right now.
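For a sense of scale, the bit depth just sets how many steps each channel gets; everything else about panel quality is a separate question. Quick arithmetic sketch in Python, nothing monitor-specific:

    for bits in (8, 10):
        levels = 2 ** bits               # steps per channel: 256 vs 1024
        total = levels ** 3              # all R, G, B combinations
        print(f"{bits}-bit: {levels} levels/channel, {total:,} colors")

So 10-bit mostly buys you smoother gradients (less banding), not "better" colors as such.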
Not just that, the GPU needs to be able to output 10-bit too. And currently only the professional models (Quadro, FirePro/Radeon Pro) do this, as far as I know.
A color meter will make the colors more consistent, but not necessarily noticeably better. Since the goal is not to create content, a color meter seems like a waste of money.
Even on my ViewSonic TN gaming panels I see a noticeable difference in color and contrast, but I do agree with you that a good monitor to start with is key before considering a color meter.
No. It is pretty simple:
- all the pixels
- all the colors
- no money
Pick two. I am ignoring other specs and features here on purpose.
Currently I would point to a 27" or so 1440p Dell UltraSharp. Those are pretty cheap these days and most of them have amazing colors right out of the box.
I pretty much agree with @noenken: UltraSharps are cheap and supposed to be good (I've never had one, though), and 1440p is frankly enough at 27". 4K at that size will only cause display scaling issues.
Fun aside: I've got a 23" 1080p LG IPS LED display and calibrated it with a Spyder 3 colorimeter. I've been using this for professional photography work and it's almost completely sufficient if you zoom in and work on the photo at 1:1 scale (100% zoom, which everyone should be doing for such things).
A 1440p or 2160p (4K) IPS of any sort with calibration is already complete overkill. 10-bit is for when you're working for Xerox/3M or some professional multinational print/design company and color precision is exceptionally important to you. (Most people's eyes need calibration at this point.)
What you really want to pay attention to, more than any other feature, is your black and white levels if color is important to you; calibration helps you overcome the blue/pink/green tint that almost all monitors have.
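For what it's worth, the way calibration kills that tint is fairly mundane: the colorimeter measures how far each channel is off, and the resulting profile loads per-channel correction curves into the GPU's lookup table (the vcgt part of an ICC profile). A minimal Python sketch of the idea, with made-up gain numbers for a hypothetical panel whose blue channel runs hot:

    gains = {"R": 1.00, "G": 1.00, "B": 0.92}   # made-up: blue ~8% too strong

    def correction_curve(gain, levels=256):
        # one curve per channel; a profile loader pushes these into the GPU LUT
        return [min(i / (levels - 1) * gain, 1.0) for i in range(levels)]

    luts = {ch: correction_curve(g) for ch, g in gains.items()}
    print({ch: lut[-1] for ch, lut in luts.items()})   # blue now tops out at 0.92

Real profiles also reshape the tone curve per channel rather than applying a flat gain, but the mechanism is the same.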
Now after all that, bear in mind that cheap 2160p monitors are a thing and can often be worse than mid- to high-quality lower-resolution monitors at the same price.
Colour depth shouldn't affect black levels; that's more dependent on the type of panel. VA panels will have more contrast/deeper blacks when compared to IPS panels, for example.
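To put rough numbers on that, using typical spec-sheet contrast ratios rather than anything measured:

    white_nits = 300                          # assume the same peak brightness
    for panel, contrast in (("IPS", 1000), ("VA", 3000)):
        black = white_nits / contrast         # black level = white / contrast
        print(f"{panel} at {contrast}:1 -> {black:.2f} nits of black")

That 0.30 vs 0.10 nits gap is exactly the deeper blacks you notice on VA in a dark room.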
Sorry for my lack of knowledge, but what GPU are you using? As far as I know, gaming GPUs (NVIDIA at least; I don't know about AMD) don't have 10-bit output enabled; they reserve this feature for pro-grade cards like the Quadro. Is the particular monitor you're researching true 10-bit or 8-bit + FRC ("fake" 10-bit)? IMHO, for gaming 8-bit + FRC is sufficient. Overall it depends more on the quality and type of panel, brightness in nits, and the polarizer.
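Since 8-bit + FRC keeps coming up: the panel fakes the in-between 10-bit levels by flickering each pixel between the two nearest 8-bit values, so the time average lands where the 10-bit step would be. A toy Python sketch of the math only (real panels do this in firmware, often with spatial dithering patterns as well):

    import math

    def frc_sequence(value_10bit, frames=4):
        target = value_10bit / 1023 * 255     # map 10-bit level onto 8-bit scale
        lo, hi = math.floor(target), math.ceil(target)
        frac = target - lo
        # show 'hi' on roughly that fraction of frames, 'lo' on the rest
        return [hi if (i + 0.5) / frames < frac else lo for i in range(frames)]

    print(frc_sequence(513))   # [128, 128, 128, 127]; the eye averages this
                               # to ~127.75, an in-between shade

Done well it's hard to tell apart from true 10-bit, which is why it's usually fine for gaming.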
Okay, I find the blacks the most troublesome. The colors could be better by dialing in more gamma so the picture looks like a fresh, vibrant tattoo, but then it's impossible to play games like that, and I have to dial it back to washed-out watercolors.
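If it helps to see why gamma swings the image between vibrant and washed out: it's a power curve on the normalized pixel values, so the midtones move a lot while pure black and white stay pinned. Quick Python sketch with example gamma values:

    x = [i / 4 for i in range(5)]        # normalized input levels 0..1
    for gamma in (1.8, 2.2, 2.6):        # example display gamma values
        y = [round(v ** gamma, 3) for v in x]
        print(gamma, y)                  # higher gamma pushes midtones darker;
                                         # 0.0 and 1.0 never move

That's why cranking gamma makes the image look deeper and more saturated, at the cost of crushing shadow detail in games.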