
  • ImSpartacus - Thursday, April 28, 2016 - link

    Fascinating read. Intuitively, I didn't anticipate a downside to a wide gamut monitor, so that's interesting to learn about.

    Though, honestly, as a layman, I have little use for something with that much potential. Cheap Korean panels, hooooooooo!
  • nathanddrews - Thursday, April 28, 2016 - link

    This is only the beginning. With the advent of Rec. 2020 HDR monitors, we are looking at an even more complex calibration system. Even now, HDR televisions effectively have no way to calibrate to a standard. Supposedly Microsoft is working on a W10 update to improve color support for 10-bit and HDR... but without a sincere, ground-up overhaul of color handling, I don't think it will amount to much.

    Gotta keep the hope, though. I really want to play some HDR games on OLED.
  • Michael Bay - Thursday, April 28, 2016 - link

    I dread the day my insider preview build will mention color profiling. Bugs will be atrocious!
  • crimsonson - Thursday, April 28, 2016 - link

    Dolby and SMPTE have each offered a standard for HDR.
  • nathanddrews - Friday, April 29, 2016 - link

    Both of which have no good calibration method yet. Ideally, the unit would come pre-calibrated for both dark and bright environments, HDR10 and DV. Of course, if the player and TV could talk to each other to constrain the nit levels to the calibrated display output... I would be in heaven.
  • BrokenCrayons - Thursday, April 28, 2016 - link

    Yup, monitors that attempt to display colors accurately or at higher resolutions seem like sort of a pain in the backside to deal with. Certainly, there are good reasons for them, but I'm perfectly happy using bottom-feeder 1366x768 screens for the moment. Higher resolutions aren't important and neither is color accuracy. It's great to see the technology maturing, but you could stick me with a smeary old passive matrix LCD panel from a mid-90s era laptop and it'd be fine.
  • Solandri - Monday, May 2, 2016 - link

    As someone old enough to remember the loss of color gamut moving from CRTs to LCDs, I can't wait for sRGB to die so we can move to a more realistic color gamut like Adobe RGB (which is fairly close to the NTSC standard used for CRTs).

    The color profile problem is only a problem when displaying pictures with a certain color profile on a screen with a different color profile. The problem goes away if everyone uses the same default color profile.

    Unfortunately, for 20 years now the default has been the atrocious sRGB. Decades from now, photos and movies shot in this era will be notable for their lack of color saturation, because they were made for the sRGB color space. I quit shooting JPEGs with my DSLR precisely for this reason - my camera's RAW photos cover the Adobe RGB space, so by shooting JPEG I was throwing away a lot of color information.
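    To make the gamut loss concrete: a minimal sketch (Python with NumPy, using the commonly published Bruce Lindbloom conversion matrices) showing that a fully saturated Adobe RGB green has no sRGB representation - converting it through XYZ yields negative sRGB components, which are clipped away when the camera saves an sRGB JPEG.

```python
# Minimal sketch: a saturated Adobe RGB green pushed through XYZ into sRGB.
# Matrices are the standard published D65 values (Bruce Lindbloom).
import numpy as np

ADOBE_TO_XYZ = np.array([  # linear Adobe RGB (1998) -> XYZ
    [0.5767309, 0.1855540, 0.1881852],
    [0.2973769, 0.6273491, 0.0752741],
    [0.0270343, 0.0706872, 0.9911085],
])
XYZ_TO_SRGB = np.array([   # XYZ -> linear sRGB
    [ 3.2404542, -1.5371385, -0.4985314],
    [-0.9692660,  1.8760108,  0.0415560],
    [ 0.0556434, -0.2040259,  1.0572252],
])

adobe_green = np.array([0.0, 1.0, 0.0])              # fully saturated, linear
srgb = XYZ_TO_SRGB @ (ADOBE_TO_XYZ @ adobe_green)
print(srgb)                      # ~[-0.398, 1.000, -0.043]: R and B negative
print(np.clip(srgb, 0.0, 1.0))   # what actually survives in an sRGB JPEG
```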
  • willis936 - Thursday, April 28, 2016 - link

    Excellent write up. Lots of info for getting people up to speed on color management.
  • Michael Bay - Thursday, April 28, 2016 - link

    First in was Mami, and now R;N. Animoo has certainly conquered AT over these past couple of years.
  • Jacerie - Thursday, April 28, 2016 - link

    There's a typo in the title. Quantum is spelled with a U not an O.
  • Brandon Chester - Thursday, April 28, 2016 - link

    Ryan was up very late doing some editing and must have made it when he expanded on my admittedly sparse placeholder title (Monitor Review). My apologies.
  • Infy2 - Thursday, April 28, 2016 - link

    The message of this article is that the average Windows user should stay away from wide gamut monitors.
  • Murloc - Thursday, April 28, 2016 - link

    average user thinks oversaturation looks cool
  • watersb - Thursday, April 28, 2016 - link

    Excellent. Thanks for this in-depth discussion. I know very little about color and color management.

    Yesterday, I was in an Apple Store and I compared wide-gamut images side by side on the new, 9.7-inch iPad Pro, the 12-inch one, and the 5K iMac. I used iconFactory's blog post for reference images. Wow. http://blog.iconfactory.com/2016/04/looking-at-the...

    This is becoming a real thing for popular consumer devices. Interesting times!
  • theduckofdeath - Thursday, April 28, 2016 - link

    The only thing I'm getting from this review is a strong feeling that markets with stricter marketing regulations will soon nerf the "Quantum Dot" term the same way "LED" displays were nerfed a few years ago. The marketing implies that QD is as advanced as OLED, while the displays clearly still use edge lighting, with all of its issues.
  • saratoga4 - Thursday, April 28, 2016 - link

    The marketing hype around QD is particularly ridiculous given that they're essentially a cost-reduction measure, designed to save a few dollars over multi-color LEDs or OLED while (hopefully) being good enough.
  • Murloc - Thursday, April 28, 2016 - link

    $80 is not a few.
    A new thing and a cost reduction are the same thing in this case: consumers will have something they didn't have before.
  • saratoga4 - Thursday, April 28, 2016 - link

    Going from 1 type of LED to 2 types of LED in an array doesn't save anywhere near $80. The savings are much larger compared to OLED, but OLED has other advantages beyond gamut that QDs can't match anyway.
  • name99 - Thursday, April 28, 2016 - link

    I think you're missing the larger picture.
    Of course any technology can be cost-cut to the point where it is a joke, and Philips seems to have done that here. OK, Philips being stupid, nothing new there. But that's not interesting.

    The more interesting aspect is that we are moving towards richer monitor technology. It started with retina (sorry, HiDPI!) displays, and now we're going to wider gamut. At some point wider gamut is going to move to something like 16 bits per pixel rather than 8 (or occasionally 10 or 12), along with maybe even 4 phosphors. And at some point the standard device frame rate is going to go up to 120fps.

    OK, so with this hardware background, it's now interesting to contemplate the SW background.
    In one corner we have MS. Apparently still incapable of handling color correction after all these years, and incapable of handling the UI. Add that to their HiDPI support. They seem unlikely to adapt well to this new world...

    In the second corner we have Android. Not clear to me how much better off they are. They have handled DPI a lot better, which is a good start. As far as I know there is still no color correction built into Android; but the larger issue is one of how easily their architecture would allow for inserting color correction. Can they do it in such a way that all (or at least most) apps just do the right thing? And would it rely on the phone OEMs to create drivers and lookup tables that most of them would screw up?

    In the third corner we have Apple, which seems perfectly positioned for all this (meaning that they will likely drive it). They've been happy to push HiDPI (including on OSX as fast as Intel's built-in GPUs allowed it --- which wasn't very fast, suggesting that maybe they'd be better off with another vendor for OSX SoCs, but that's a different issue), and they're now pushing color accuracy both on the camera side (TrueTone flash, high dynamic range sensors) and the screen side (the new iPad Pro screen, presumably to spread throughout the product line as manufacturing volumes and power budgets allow).
    I fully expect them to stay on this path for a while, never actually stating technical phrases like "full Adobe RGB Gamut" but constantly, subtly pointing out in their keynotes and advertising: "Our colors look good, and look correct, across ALL our devices --- photos on your iPhone look exactly the same on your iMac. Good luck getting that consistency with photos from your Android phone on your Windows screen."

    From this point of view, then, the relevance and interest of QD technology is whether it allows larger gamut to move to iPhone this year or at least soon.
  • jlabelle - Friday, April 29, 2016 - link

    - Apparently still incapable of handling color correction after all these years, and incapable of handling the UI. Add that to their HiDPI support. They seem unlikely to adapt well to this new world... -

    Such a statement is not correct, and the article describes the situation pretty clearly. Beyond the way it is set up (which, yes, is somewhat confusing), the real issue is simply that many programs are not color managed.
    This is not limited to Windows; OS X suffers from the same issue, so it has nothing to do with Windows per se but with the programs you are using.
    The underlying problem is that some default programs on Windows are not color managed. It seems to be the case with Store apps (as it is for iOS apps, which makes the iPad useless for photo editing, for instance). So some important apps like Photos and Edge do not take care of it. That is a big issue.
    But many programs do.

    That is why there are 3 different cases:
    1/ Use a screen that is very accurate within the sRGB gamut out of the box - only use sRGB images --> no more issues, but obviously you will never display any image beyond sRGB.

    2/ Use a screen with sRGB gamut (or a wide gamut screen that you switch to sRGB mode), calibrated, with an ICC profile set as default (as described) - use only sRGB images --> here you will have perfect color accuracy in every color managed application. In applications that are not color managed (Edge, Photos, Chrome...), you will get the color inaccuracy of the screen's defaults (because the ICC profile is not applied), BUT your images will not be under- or over-saturated. The impact on the user therefore remains minimal.

    3/ Use a wide gamut screen: then you have no choice but to carefully stick to color managed applications --> in every color managed application the display will be fine and you will take advantage of the wider gamut. In all the others, images will appear oversaturated. (A sketch of what a color managed application actually does follows below.)
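    For illustration, a rough sketch of the conversion a color managed application performs in cases 2 and 3 before drawing anything: read the image's embedded profile and transform into the display's ICC profile. This uses Pillow's ImageCms (LittleCMS) bindings; the file names are hypothetical. A non color managed application skips all of this and sends the stored values straight to the screen, which is exactly where the over- or under-saturation comes from.

```python
# Rough sketch of a color managed image draw, using Pillow's ImageCms
# (LittleCMS) bindings. "photo.jpg" and "display.icm" are hypothetical.
import io
from PIL import Image, ImageCms

im = Image.open("photo.jpg")
icc_bytes = im.info.get("icc_profile")        # profile embedded in the file
src = (ImageCms.ImageCmsProfile(io.BytesIO(icc_bytes))
       if icc_bytes else ImageCms.createProfile("sRGB"))  # untagged: assume sRGB
dst = ImageCms.getOpenProfile("display.icm")  # the monitor's ICC profile

# Convert source colors into the display's color space before showing them.
managed = ImageCms.profileToProfile(
    im, src, dst, renderingIntent=ImageCms.INTENT_RELATIVE_COLORIMETRIC)
managed.show()
```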

    I know what an issue it is because I used to have a wide color gamut Dell U2711 screen.
    1/ First, you only get good accuracy in color managed applications; in the others, everything is oversaturated.
    2/ Second, while shooting a FF DSLR in aRGB, I may have seen fewer than 10 pictures out of 70,000 where, in a direct A-B comparison, you could see a tiny difference between the sRGB version and the aRGB one. In the real world, it is VERY unlikely you go beyond sRGB.
    3/ Third, even if you keep aRGB versions of your pictures for yourself (to take advantage of your screen), you have to keep an sRGB copy, because when you share them, other people will hit the non color managed application issue and your pictures will look completely washed out. And many online print shops only take sRGB anyway.

    At the end of the day, it is so much hassle for virtually zero visual benefit (speaking of photos of real colors in nature) that I now have a Dell U2313UH, which is an sRGB gamut screen.

    Bottom line: a wide gamut screen is currently a chore and NOT recommended. And not only on Windows - anywhere - because even if your browser displays images correctly (Safari, or Firefox with a certain flag activated), what is the point of a wide color gamut screen if all you view are sRGB pictures?
  • jlabelle - Friday, April 29, 2016 - link

    - In the second corner we have Android. Not clear to me how much better off they are. They have handled DPI a lot better, which is a good start -

    If you are speaking of Android, you should compare it with Windows apps from the Store.
    For those, the scaling is just perfect; they handle ANY screen size / resolution / orientation perfectly.
    The only scaling issues come from Win32 programs that do not use the HiDPI APIs released back with Windows 7 (at a time when Android was barely a thing).

    - As far as I know there is still no color correction built into Android -

    Android is the worst on this front, because you have virtually zero color management.

    - In the third corner we have Apple, which seems perfectly positioned for all this (meaning that they will likely drive it). -

    Again, this is misleading.
    For instance, iOS's way of handling color management (see the iPad Pro test) makes the wide gamut screen virtually useless (for now), as there is no way for a developer to take advantage of it. What it seems to do is basically apply an ICC profile to target the sRGB color space.
    Scaling is not really a question, as resolutions are pretty much hard coded; but again, Windows apps scale perfectly.

    OS X has some "native" applications that are color managed (like Safari), but the same requirement applies: the program needs to be color managed, otherwise you have the same issue.
    For scaling, it is exactly like Windows, with HiDPI APIs having existed practically forever; developers just need to use them. Maybe more applications use them. But that's it.
    OS X does not really have (from an OS point of view) an inherent advantage over Windows in color management / HiDPI support.

    - they're now pushing color accuracy both on the camera side (TrueTone flash, high dynamic range sensors) -

    Actually, Apple is using a 1/3" camera sensor, one of the smaller sizes in the industry (otherwise found only in low end phones like the Lumia 640XL...), and therefore the dynamic range is more limited than the competition's (because dynamic range is mainly linked directly to sensor size).

    - and the screen side -
    Nothing exclusive to Apple. Speaking of Windows devices, the Surface and the Lumia 950 both have more color accurate screens than all the various iPads and iPhones (albeit all of them are VERY good in color accuracy).

    bq. "Our colors look good, and look correct, across ALL our devices --- photos on your iPhone look exactly the same on your iMac. Good luck getting that consistency with photo from your Android phone on your Windows screen."

    It is not luck. Just pick the right products. If you pick a Surface and a Lumia 950, for instance, you will have the same great experience. And a Samsung S6/S7 or another color accurate Android phone will give you the same.

    It seems the advertising is indeed working, since people believe Apple has an inherent advantage here.

    - the relevance and interest of QD technology is whether it allows larger gamut to move to iPhone this year or at least soon -

    Until developers can take advantage of it, it has no advantage for the end user. So as good as the color gamut of the iPad Pro is, it is useless from an end user's point of view.
  • Brandon Chester - Friday, April 29, 2016 - link

    I've already addressed why your understanding of the situation on the iPad is incorrect in my article specifically about it. Please do not spread serious misinformation in the comments or I will have to remove them; this is already an issue that is confusing to many people.
  • theduckofdeath - Friday, April 29, 2016 - link

    I don't get what bigger picture I'm missing here. Yes, LCD tech has evolved a lot over the years. But it's just the faux marketing these manufacturers always stoop to, to give the impression that they're selling something better than LCD. A few years ago it was LED, now it's Quantum Dots. Both insinuate that the backlight isn't the usual old flawed edge-lit design.
  • alphasquadron - Thursday, April 28, 2016 - link

    As a Windows user (not by choice, but because it supports a lot of software and games), it is tiring to see the slow pace at which Windows fixes problems. When are they going to get 4K scaling done correctly? And I remember getting my new computer and going through the same confusing ICC sub-menus to get to the actual settings.

    Also, what were Philips or QD Vision thinking when they sent a monitor with a fake sRGB mode to a reviewer at a tech site that tests monitors for color accuracy? He had just pointed out that there was no sRGB mode on the monitor, so what did they think the first thing he would test on the new unit would be? I'm still confused about whether the mode actually changed anything or whether they are just that dumb (or think reviewers are that dumb).
  • Murloc - Thursday, April 28, 2016 - link

    maybe they messed up while doing a quick fix. I hope.
  • Brandon Chester - Thursday, April 28, 2016 - link

    For the record, I spent a long time trying to prove to myself that it did do something. Unfortunately, if it truly were constraining the gamut it would be so completely obvious upon toggling it that you wouldn't even need to make measurements. I did measure anyway, and it truly didn't change the output at all.
  • Guspaz - Thursday, April 28, 2016 - link

    All this talk of colour management... It all works so easily on my MacBook (load the profile Anand made, and everything looks correct), but on my main PC, it's a mess...

    I've got a Dell U2711 running Windows 10. That's a wide-gamut display, and I do have an ICC profile for it. The display was also factory-calibrated (it shipped with a printed report on the results).

    If I want the most trouble-free setup where most stuff looks correct, which of these is the correct approach:

    1) Set monitor to default profile and set Windows to ICC profile
    2) Set monitor to sRGB profile and set Windows to ICC profile
    3) Set monitor to default profile and set Windows to sRGB profile
    4) Set monitor to sRGB profile and set Windows to sRGB profile

    I'm guessing option 1 is correct for wide-gamut use, but the crappy Windows colour management would mess everything up. So if I want to just go for sRGB, it seems to me that option 4 is probably correct? Or is option 2 what I want?

    This is all so confusing. On my Mac I just set the ICC profile and everything works immediately and perfectly.
  • Murloc - Thursday, April 28, 2016 - link

    yeah MacOS got this down unlike Windows.

    I wonder how amenable Linux is in this regard.
  • tuxRoller - Thursday, April 28, 2016 - link

    Pretty much as good as Mac, actually.
    Check out my comments on the recent 9.7" iPad review (the one that dealt with color management).
  • jlabelle - Friday, April 29, 2016 - link

    See my answer on page 2. I was in your EXACT same situation.

    1) I guess you have an ICC profile, meaning either you can calibrate the screen yourself with a probe, or you have a generic ICC profile from a Dell review (which means you are not accounting for production variation and variation over time)? This is the theoretical ideal situation for taking advantage of a wide gamut screen… except I do not advise it, for the reasons described below.
    2) Hassle free solution: same as above, but you constrain yourself to the sRGB color space. You will have good color accuracy in color managed applications. And even in non color managed applications, and even if your ICC profile is not very good, you will have no problems with oversaturated or washed out colors.
    3) Makes no sense at all! You would be telling the OS that the Dell, in its wide gamut mode, is perfectly accurate with respect to the sRGB color space and gamut. That could not be further from the truth, so you will end up with all your colors (EVEN in color managed applications) oversaturated. No, no, NO!
    4) This is the equivalent of what the article advises for the Philips: you put the screen in sRGB mode. You do not have any ICC display profile (because you do not have the necessary calibration equipment). So you are assuming the screen is correctly calibrated and telling the OS that your display is perfect according to sRGB. Actually, this is the standard situation, and you do not need to do anything to be in it.

    The preferred solution is by far number 2.

    To understand why, let's reverse the discussion and ask why you (or people in general) think you would benefit from a wide gamut screen:
    • To surf the web? No, because websites target sRGB anyway.
    • To view pictures received by email or taken by you? In most cases no, because mobile phones, compact cameras, and even most DSLRs are set up to take sRGB pictures.
    • To watch films? It is slightly more complicated, but (to keep things simple) there is no wide gamut content, and no consumer video software that would manage it anyway. So you would end up with permanently oversaturated colors. Unless that is your thing…

    So in which case would you see any benefit?
    IF you have your own DSLR/mirrorless, and IF you set it to aRGB mode, and IF you always make sRGB duplicates of every single picture that you want to share / display on the web / bring or send for printing.

    And even if all those "IF"s are fulfilled, you will still end up with oversaturated colors in most of your applications, when surfing the web, when viewing other people's pictures… All that just to be able to see, on your own pictures, maybe a tiny difference in a side-by-side comparison in 0.001% of cases (I am not making this number up; it is the proportion of pictures where I was able to spot a difference).

    Long story short: a wide gamut screen currently makes NO sense. There is a reason it is said to make sense only for professionals with very specific applications. And those people do not come here to ask whether it makes sense, because they are already aware of all this.

    Bottom line : choose option 2.
  • Guspaz - Friday, April 29, 2016 - link

    The U2711 was a high-end monitor, and so one of the advertised features was that Dell individually calibrated every monitor at the factory. They included with each monitor a custom calibration report that had the deltaE and such things, with graphs and whatnot. Dell provided a generic ICC profile file for the monitor, so I would imagine that the monitor itself was calibrated so that the ICC profile would match the physical monitor.

    If I pick option 2 (monitor set to sRGB, Windows set to ICC profile), then how does Windows know that the monitor is expecting the input to be in the sRGB colour space?
  • Brandon Chester - Friday, April 29, 2016 - link

    To the best of my knowledge Dell's factory calibration is at the internal LUT level so you can plug it into any device and have it be accurate (the best type of calibration). The ICC is probably just something generic and I doubt it contains a VCGT for the GPU.

    I would choose "option 4", which is to say, just leave the OS color management alone because Dell has been playing this game long enough to know that the Windows CMM doesn't work, has made your monitor usable in sRGB without having to mess with it, and given you the option to turn on Adobe RGB when you open Lightroom or some other program.
  • jlabelle2 - Wednesday, May 4, 2016 - link

    - If I pick option 2 (monitor set to sRGB, Windows set to ICC profile), then how does Windows know that the monitor is expecting the input to be in the sRGB colour space? -

    Option 2 is Option 4 plus a display ICC calibration. If you are using a color managed application, it reads the embedded profile and therefore displays correctly. In most cases where something is not color managed (wallpaper, Edge, modern Windows apps...), the assumption is that you are using sRGB content anyway (the web, pictures you received...).
    The ICC display profile ensures you are correcting the last remaining inaccuracies of the Dell screen relative to the sRGB color space (because even out of the box, calibrated by Dell, it is not perfect).

    If you have no calibration probe, your best bet is option 4.
  • jlabelle - Friday, April 29, 2016 - link

    - On my Mac I just set the ICC profile and everything works immediately and perfectly. -

    For the record, it does, because... it does not really take advantage of the wide gamut in your case!
  • Spunjji - Thursday, April 28, 2016 - link

    "for photographers and other professionals... the relatively low resolution poses less of a problem"

    Higher pixel density is actually a huge asset - you can get a better idea of critical image sharpness without zooming in, and getting above 1080p is a massive help for gaining more working area between all the toolbars.

    So really, having wide gamut /and/ high pixel density would be great. Hopefully they get on that! :)
  • Brandon Chester - Thursday, April 28, 2016 - link

    I definitely agree. Anyone who has done photo editing on a 4K or 5K display can attest to the improvement. I just meant that relative to someone who writes word documents all day, the lower resolution is probably less of an issue.
  • Spunjji - Thursday, April 28, 2016 - link

    If I could edit, I'd add thanks for the article - it was a fascinating read and I was certainly not aware that Apple had such a commanding lead in colour calibration support. Food for thought.
  • jlabelle - Friday, April 29, 2016 - link

    - I was certainly not aware that Apple had such a commanding lead in colour calibration support. Food for thought.-

    Let's be honest, having a less confusing way of setting your display ICC profile once (something the software that comes with your calibration probe does automatically anyway) is NOT having a commanding lead in color calibration support. That is a silly statement.
  • willis936 - Thursday, April 28, 2016 - link

    I'm seeing a lot of gripes about windows color management. Doesn't argyllcms take care of that? Anyone shelling out for wide gamut should also spend the $50 for a cheap colorimeter.
  • Brandon Chester - Thursday, April 28, 2016 - link

    1. Cheap colorimeters are so inaccurate that they're basically useless.

    2. Argyll doesn't solve any of the problems. You need your OS, its frameworks, and its applications to all understand color management and work with the CMM. ArgyllCMS is basically a tool for profiling and creating ICC profiles; it can't make software understand and utilize them.
  • willis936 - Friday, April 29, 2016 - link

    I'm skeptical of your first claim without seeing data.

    As for the second, that's why packages like dispcalGUI exist.
  • Brandon Chester - Friday, April 29, 2016 - link

    Again, that doesn't change the fact that individual programs need to support it. I think you're confusing greyscale calibration with color management here. If there were an easy way to fix color management across all Windows programs, this would not be such a long-standing issue.
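    To illustrate the distinction with a toy example (NumPy, made-up numbers): greyscale calibration is three independent 1D ramps that the GPU can apply to the entire desktop, while color management mixes the channels per pixel, so it cannot be expressed as a ramp and has to happen inside each application (or an OS-wide CMM).

```python
# Toy illustration (made-up numbers): why a GPU gamma ramp can't do
# color management. Ramps act per channel; gamut mapping mixes channels.
import numpy as np

pixel = np.array([0.8, 0.2, 0.2])       # some linear RGB value

# 1) Greyscale calibration (VCGT-style): independent per-channel curves,
#    applied by the GPU to everything on screen.
gamma = np.array([1 / 2.3, 1 / 2.2, 1 / 2.1])   # illustrative tweaks
calibrated = pixel ** gamma

# 2) Color management: a 3x3 matrix (or 3D LUT) that mixes channels,
#    e.g. mapping sRGB content onto wide-gamut primaries. Illustrative only.
gamut_map = np.array([
    [0.72, 0.24, 0.04],
    [0.07, 0.89, 0.04],
    [0.02, 0.10, 0.88],
])
managed = gamut_map @ pixel             # needs per-application support
print(calibrated, managed)
```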
  • UrQuan3 - Thursday, May 12, 2016 - link

    "Cheap colorimeters are so inaccurate that they're basically useless."

    I'm going to have to go with willis936 on your first comment. It sounds rather like someone driving a Ferrari saying that a Mustang has so little horsepower it is useless. To the average car owner, they're both godlike. In practice, a little $100-200 colorimeter makes a large improvement on almost any monitor. Expensive calibration for expensive monitors. Of course, use the best gear when doing a review.

    I wonder how you would review calibration tools? That does not sound easy.
  • Pork@III - Thursday, April 28, 2016 - link

    Too bad compared to a full-coverage CCFL backlight.
  • Azurael - Friday, April 29, 2016 - link

    It's possible to get a 27" 2K display for $300 equivalent in Europe... I've got a Hannspree HQ271HPG which even with VAT is £200. I wouldn't say it's the best thing in the world (stuck with HDMI 1.4 & DL-DVI and hiding >1cm behind a piece of glass) but it is IPS, it calibrated up nicely (to sRGB) and the backlight consistency is much better than most cheap monitors on my sample (although it does have a bit of bleed visible at the very edges on a totally black screen.)
  • Gunbuster - Friday, April 29, 2016 - link

    I know it's a cheap monitor but dear lord, did they have to make the bezel so chunky that it looks like a 22" in photos?
  • Haravikk - Friday, April 29, 2016 - link

    Why does this include a VGA port?

    I'd also much prefer down-facing ports, and some kind of cable management, monitors that don't include these always confuse me.
  • zodiacfml - Saturday, April 30, 2016 - link

    Thanks for always including a tutorial and an in-depth look at color management. I now have a good sense of the challenges the industry faces.

    You are correct that Philips should be applauded for taking the first step, as this will take time to improve, just as Samsung's OLED/AMOLED has improved over the years. For now, the Philips seems useful for increasing the saturation/vividness of content for entertainment.

    Questions:

    1) Isn't it better for Philips to target a larger color space, despite falling short for now (since conversion from a bigger space to a smaller one seems straightforward)? Adobe RGB doesn't improve on sRGB in the reds, which is where quantum dots could deliver the most benefit. I believe this primary deserves attention, since content that shows it off is everywhere in photos: flowers, sunsets, red sports cars. I have seen too many red subjects looking flat, like plastic.

    2) How do the Rec. 2020 and ProPhoto RGB color spaces relate to each other? They seem to have similar coverage, but obviously for different applications.
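    For reference, the published CIE 1931 xy chromaticities of the two primary sets, as a small snippet: Rec. 2020's primaries correspond to monochromatic light sitting on the spectral locus, while ProPhoto RGB's green and blue primaries are imaginary (outside the visible colors) - acceptable for an editing space that is never sent directly to a display.

```python
# Published xy chromaticities (CIE 1931) of the two primary sets.
REC2020 = {   # white point D65; primaries are monochromatic (on the locus)
    "R": (0.708, 0.292),
    "G": (0.170, 0.797),
    "B": (0.131, 0.046),
}
PROPHOTO = {  # white point D50; G and B primaries are imaginary colors
    "R": (0.7347, 0.2653),
    "G": (0.1596, 0.8404),
    "B": (0.0366, 0.0001),
}
```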
  • zodiacfml - Saturday, April 30, 2016 - link

    I did some reading and found the problem already: color bit depth. What bit depths are currently supported by video cards and monitors?
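    A back-of-the-envelope sketch of why bit depth matters more as the gamut grows: the same number of code values gets stretched over a wider range of colors, so each quantization step is larger and gradients band more visibly. The relative gamut sizes below are rough illustrative figures, not measurements.

```python
# Back-of-the-envelope: relative quantization step per gamut and bit depth.
# Gamut sizes are rough illustrative ratios relative to sRGB, not measured.
for name, rel_size in [("sRGB", 1.0), ("Adobe RGB", 1.4), ("Rec. 2020", 2.1)]:
    for bits in (8, 10):
        step = rel_size / (2 ** bits)
        print(f"{name:10s} @ {bits:2d}-bit -> relative step {step:.5f}")
```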
  • Oxford Guy - Monday, May 2, 2016 - link

    AdobeRGB is obsolete.
  • arjojosh - Monday, May 2, 2016 - link

    Very good write-up on this topic!!

    I have a Korean wide gamut monitor, a 30" 1600p IPS without an sRGB mode, and it's a gigantic pain in the butt. I did loads of research prior to buying it (maybe three years ago now) and could not find any mention of the problems this would cause. I like the monitor when it's displaying correctly, but overall it's been a big hindrance to normal PC use.

    While it's nice to see good monitors that are easy to obtain - if the result is what I've been dealing with all this time, then I recommend staying away from this one and others like it. With no hardware sRGB mode, you will find yourself extremely frustrated that you can't even browse the web with proper photo colors.

    Hoping Microsoft starts to address this, if it's even possible, but it's gained so little traction through the years.
