Hi, so I’m looking to invest in a 4K monitor to replace my 1080p 27-inch 144Hz screen. I already have a second monitor, a Pixio 32-inch 3440x1440 ultrawide curved screen, but its colors are really bad. I thought about upgrading my Wacom Intuos Pro to a Cintiq, and I found a used Cintiq on eBay for $1500. The screen specs are very good for digital art: a 4K IPS panel with 99 percent Adobe RGB coverage. Problem is, $1500 is still a bit much, but at the same time it might be tempting since you’re paying for the monitor as well as the tablet functionality. I’m trying to find a monitor that is on par with the Cintiq and is geared towards professionals, since I think it might be the cheaper option. Any recommendations? I’m only seeing monitors geared towards gaming and it is kind of pissing me off. When I try to search for professional-grade monitors I get ones that are well over $2000, and at that point I might as well just buy a Cintiq, since I don’t think spending that kind of money on just a single display is worth it. Is it even possible to find a monitor like that under 700 US dollars?
Go to www.rtings.com and look through their reviews.
You can also look into used medical diagnostics grade monitors like EIZO. Those bad boys have self-calibration capabilities and do 10-bit color depth with no warm-up time from a cold start.
The only issue is that the cheaper second-hand ones are probably 1080p, and it only gets expensive from there if you want a higher-resolution one.
Those are also thicc bois. Very thicc bois.
Ya man, like I said, I’d rather not spend $1700-plus on a monitor. Also, the monitor I’m upgrading is a 1080p Asus, so I kind of want to take advantage of the 4K resolution. Not sure if I’m looking for something like a Dell UltraSharp or LG UltraFine. Gonna have to do more research.
I recently got a GIGABYTE AORUS FV43U, 43". Good screen, comes pre-calibrated from the factory, and has a pretty good color gamut for a regular non-professional-grade monitor.
Depth: 10bit (1.07 Billion Colors)
FRC: yes (8bit+2bit)
Color space: 150% (sRGB), 99% (Adobe RGB), 97% (DCI-P3)
Well you should probably narrow down which color gamut specification you’d like to meet for starters…
The following is my neophyte knowledge of colour correction; it is possible I am missing something, since for obvious financial reasons I have not personally experimented with these kinds of monitors or with equipment properly able to analyse them.
If you are planning on looking into colour correction, note that there appears to be a distinct difference between colour correction done by the display vs done by the computer.
Professional monitors seem to be the only ones which do this correction internally. As I understand, how this works is that the computer sends the monitor an untouched sRGB/BT.2020/P3/etc. signal, and the monitor internally has a colour profile it uses to convert the incoming sRGB/etc. data into a skewed signal that should correct for display panel variations/defects/degradation.
Colour correction on “normal” monitors seems to be done outside the monitor; I do not know how often this is done in software versus by GPU hardware. Either way, it means DVI/HDMI/DP is not actually carrying a true sRGB/etc. signal: the signal has been skewed to counteract the variations/defects/degradation of the monitor as a whole. It also means that any “effects” applied by the monitor/TV during calibration are baked into the correction. If, for example, you accidentally leave a vivid colour mode on during calibration, the generated profile will only be valid while vivid mode is on.
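To make that “skewed signal” idea concrete, here is a minimal Python sketch of the outside-the-monitor style of correction as a per-channel 1D LUT (the kind of thing a GPU gamma ramp does). The gamma tweak and function names are made up for illustration, not taken from any real calibration tool:

```python
# Sketch: GPU-side colour correction as a per-channel 1D LUT.
# The "calibration" here is an invented gamma tweak, not a real profile.

def build_lut(gamma: float, size: int = 256) -> list[int]:
    """Build a 1D LUT that skews the outgoing signal to counteract a
    panel whose response deviates from the target by `gamma`."""
    return [round(((i / (size - 1)) ** gamma) * (size - 1)) for i in range(size)]

def correct(pixel: tuple[int, int, int], luts) -> tuple[int, int, int]:
    """Apply one LUT per channel before the signal leaves for the display."""
    return tuple(lut[c] for c, lut in zip(pixel, luts))

# Pretend the red channel runs a bit hot, so we pre-darken it slightly:
luts = (build_lut(1.1), build_lut(1.0), build_lut(1.0))
print(correct((200, 200, 200), luts))  # red comes out lower than 200
```

Monitor-internal correction is conceptually the same idea, just with the LUT living (and being applied) inside the monitor instead of on the GPU.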
While software/GPU colour correction can be implemented/configured by various software (ex: DisplayCAL), from what I have read, monitor-internal colour correction needs to use the monitor manufacturer’s software to load the calibration profiles into the monitor. I have never been able to find out how this loading occurs; maybe DVI/HDMI/DP define some vendor-specific options in their protocol implementations?
Colour correction seems to be one of those fields, like audio, where empirical knowledge is so prohibitively costly to acquire in terms of time and testing equipment that most people just rehash marketing claims, so long as what they are doing and the products they are using work well enough for them.
I would desperately love to see someone review a TV/monitor and actually test how much each Vivid, Contrast, etc. “feature” distorts the ΔE accuracy measurement. I have tried to read as much as I can about this stuff, but I am still unsure whether a change in the “brightness” setting would result in a colorimeter or spectrophotometer seeing the device as less accurate.
I think I once saw some professional monitors being advertised with higher-bit colour depth even when only displaying sRGB (which is 8-bit per channel, right?); maybe in those cases the higher bit depth refers to the panel itself, not the signals the monitor can natively receive? A higher bit depth would allow a better representation of that skewed signal (the colour-correction result) being sent to the panel, even if the image to be displayed is only 8 bits per channel. Have I actually seen this explained or tested anywhere? Sadly, no.
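A toy numeric illustration of why extra panel bits would help there (this is just arithmetic, not a claim about how any particular monitor works): push an 8-bit grey ramp through a gamma-style correction curve, quantise the result at different panel bit depths, and count how many distinct levels survive.

```python
# Toy example: an 8-bit grey ramp (0..255) run through a gamma-style
# correction, then quantised at the panel. More panel bits means fewer
# collisions between corrected values, i.e. less banding.

def corrected_levels(panel_bits: int, gamma: float = 2.2) -> int:
    """Count distinct panel codes used after correcting an 8-bit ramp."""
    max_out = (1 << panel_bits) - 1
    used = {round(((i / 255) ** gamma) * max_out) for i in range(256)}
    return len(used)

print(corrected_levels(8))   # noticeably fewer than 256: dark steps collide
print(corrected_levels(10))  # much closer to 256: the correction survives
```

On an 8-bit panel, many of the corrected dark steps round to the same code value; on a 10-bit panel almost every step keeps its own code, which matches the intuition above.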
My apologies for the rambling read; I have been meaning to make a thread about colour correction, and so some of those thoughts have oozed out here.
I have an Asus PA329C… the Asus firmware/software is a piece of sh**.
The physical display panels are nice.
Factory calibration is just too green.
The internal calibration and USB-assisted calibration basically don’t work, and there’s no calibration for HDR. For non-HDR it crashes mid-calibration and gets the monitor stuck in wonky modes you can’t get out of; you need to cut power to the monitor to get it to recover.
In non-HDR, DisplayCAL with GPU-mapped colors can work better. What happens is your Nvidia GPU processes colors at 16-bit internally and tries to map them to the closest point on the curve to get the 10-bit output looking OK. But it takes a very long time to calibrate, even with “fast” calibration modes that take 30 minutes, and I end up with crushed highlights for some reason on the PA329C… so I end up calibrating twice a year/never, as it takes about 2 hours every time. (The recommendation for high-gamut panels, where you need to rely on them, is every week with heavy use, every month with light use.)
In comparison, I’ve been hearing nothing but good stuff about the firmware on the BenQ PD-series and SW-series monitors.
I think maybe I’d have been happier either spending double what I did on the PA329C to get a pro monitor with uploadable HDR LUTs (or maybe even an LG CX / LG C1 TV)…
… or going with a cheaper BenQ PD3200U.
Colour gamut and bit depth are almost entirely unrelated; there just tend to be standard pairings (like HDR10 usually being 10-bit, though it doesn’t have to be). Colour gamut is a function of the display, not the signal: the signal only describes how much of each colour channel there is, but what those colours actually are is determined by what the display can output. For example, a film will be graded for a certain colour gamut; if you watch it on a display with the same gamut it will look normal, but if the display uses a different gamut and no conversion is done it will look wrong. Bit depth, on the other hand, only describes how many points there are along the gamma/PQ curve, not what the curve is or the size of the colour gamut. So you could have sRGB or BT.2020 using 1-bit colour or 32-bit colour; it only changes the resolution, not the colour range.
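To put the “bit depth is just resolution along the curve” point in concrete terms, here is a trivial sketch. Nothing here is tied to any real standard’s matrices; it only shows that the integer codes change with bit depth while the intended signal does not:

```python
# Bit depth sets how many code values there are per channel; the gamut
# (what "maximum red" physically looks like) is a property of the display.

def code_to_signal(code: int, bits: int) -> float:
    """Decode an integer code value to a normalised 0.0-1.0 signal."""
    return code / ((1 << bits) - 1)

# The same "75% red" intent lands on different integer codes:
print(round(0.75 * 255))    # 8-bit code
print(round(0.75 * 1023))   # 10-bit code
# ...but both decode to (nearly) the same normalised signal, and what
# colour that signal *is* depends entirely on the display's primaries.
print(code_to_signal(191, 8), code_to_signal(767, 10))
```

Swap the display for one with wider primaries and the same code values come out as different physical colours, which is exactly the gamut-mismatch problem described above.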
I confess I have not read IEC 61966, but sRGB stipulates (according to Wikipedia) a peculiar hybrid gamma-linear transfer function, so it seems it is already more than just a colourspace/gamut definition?
Anyway, I was trying to use shorthand for the normal sRGB-style data that almost all monitors expect by default since DVI. As far as I am aware, that would be 8-bit per channel, sRGB gamut, sRGB transfer function (gamma).
If you have an OS that has no support for colour management, is this not what everything falls back onto? I assumed this was why PNG (and probably other formats as well) defaults to sRGB, and why the standard bit depth (ex: web #______) is 8 bits per channel.
All I can say is ProArt. Asus factory-calibrates them, and my 16:10 Full HD one is absolutely fantastic.
This one specifically may pique your interest:
Or this one:
Honestly, you’d be surprised what you can pick up on eBay. A few of the other monitors that are good for creative work to the spec you’re looking for could be some of the Dell UltraSharps or BenQ monitors, but Asus has a good rep behind them.