83 Comments


  • The Deutsch Thinker - Thursday, January 7, 2016 - link

    Wow, I've been waiting to see a large OLED display for so long now! Yay!
  • nathanddrews - Friday, January 8, 2016 - link

    Yeah, but look at the horrible banding in that first shot. ;-)
    http://images.anandtech.com/doci/9923/DellOLED_678...
  • dananski - Sunday, January 10, 2016 - link

    Hahaha! xD
  • ddriver - Friday, January 8, 2016 - link

    Now make a curved 21:9 one at 2x4k resolution no larger than 40 inches and I am in.
  • 8steve8 - Wednesday, February 3, 2016 - link

    21:9, ew. 16:10 please.. or even squarer.
  • ericorg - Saturday, February 20, 2016 - link

    21:9? Dear god, you don't deserve an OLED display.
  • Samus - Thursday, January 7, 2016 - link

    Dope
  • austinsguitar - Thursday, January 7, 2016 - link

    C'mon! Please stop this 4K nonsense. Maybe in 5 years, when the supporting tech is good enough for it.
    PLEASE GIVE ME MY 1440P 28-INCH OLED 120 HZ MONITOR NOW, PEOPLE!!!
  • dragonsqrrl - Thursday, January 7, 2016 - link

    Hate to have to state the abundantly obvious, but something like this isn't targeted at you. This is for content creation and photo/video work where the resolution is more than welcome for editing and authoring high res content, and the hardware is already very capable of doing so.
  • austinsguitar - Thursday, January 7, 2016 - link

    I wholeheartedly understand that. But I get sick and tired of display manufacturers giving us the big stick and holding back tech like the monitor I mentioned (let's be serious, people would buy it in a second). 4K is absurd. It's absurd in TVs and in monitors. I understand this isn't being put out for me; I'm not a professional content creator. But the whole thing is a fad. These manufacturers are making a terrible mistake by jumping straight from 1080p to 4K. That is a massive leap, and most people don't even know of any resolutions in between. They're strong-arming people into stuff they'll never appreciate. Sorry, this just really bugs me.
  • daviderickson - Thursday, January 7, 2016 - link

    OLED is expensive, and thus put into premium products. Even if they made what you wanted, they'd still probably charge $4K for it, and then you'd complain about the price. Plus, if you're asking for a 120Hz monitor, you must be using it for twitch games, and in those games you aren't going to have time to appreciate the color and contrast improvements an OLED delivers anyway.
  • Airstuff16 - Friday, February 12, 2016 - link

    I heard that OLEDs do not have refresh rates. I'm not sure if that's true, but I just wanted to put that out there.
  • nilepez - Thursday, January 7, 2016 - link

    I'm pretty sure most people who buy 4K monitors want more desktop space and/or more DPI. Gaming is not the primary target of 4K. OTOH, 4K + HDR TV is going to look great... better still if they support Rec. 2020 (I don't think anyone does yet). At this point, I have zero interest in 1440p. I'd sooner go to 5K and run games at 1440p. I like gaming, but I wouldn't buy a monitor with 100% Adobe coverage for gaming.

    They're not going with mainstream products because they want to recoup R&D costs, and the best way to do that is to release an expensive monitor. It's possible that the cost difference between making a 1440p monitor and a 4K monitor isn't enough to get the price down to a level you'd find acceptable. My gut says it'd be at least $2-3K, and I suspect that's not a price most gamers are willing to pay. OTOH, this monitor's closest competitor costs significantly more.
  • Azethoth - Thursday, January 7, 2016 - link

    Your rant is ill-informed. I am a programmer and started looking for a 4K display two years ago. The Vizio was only 4Kp30, so it went back; I need much faster feedback on the mouse, and it was unusable for programming. A few months later Asus came out with a fairly cheap 32" monitor that could pump out 4Kp60. I am happy with that. The monitor is as wide as I want to go; any more and there would be too much strain from moving my head around. The resolution is nice, and with Windows 10 all the scaling issues are gone. I got a nice productivity increase over my old Apple 30" 2560x1600 monitor from 2004.

    So please note that these in-between sizes you are suddenly pining for have been obsolete for more than a decade. It is high time for 4K gear to be out. 1440p OLED, gimme a break. Wake me up when I can upgrade my monitor to 4K OLED. 1440p does not even remotely cut it.
  • HollyDOL - Friday, January 8, 2016 - link

    Hm, currently using 27" at 1440p... Compared to 30" at 2160p, it doesn't seem to be that huge a difference in pixel density (works out to about 147 PPI for 4K at 30" versus 109 PPI for 1440p at 27", so roughly 1.35x the linear density)... I don't think 4K at 30" would be overkill of any kind. I can imagine using that on a regular basis.
  • ddriver - Friday, January 8, 2016 - link

    Cannot help but LOL at your comment. As if "fast mouse feedback" were the demanding part of programming.

    FYI, programmers work with text 99% of the time, couldn't care less about refresh rates (not that 30 is low by any measure), and barely use the mouse, since that's a lot slower than keyboard shortcuts.

    Next time pretend you are a gamer ;)
  • mr_tawan - Friday, January 8, 2016 - link

    Well, programming in the modern day requires screen space for integrated tools (IDEs, that is). Personally I'm pretty OK with my 24" at 1080p (but I find the Surface Pro a PITA to work with).

    Not every programmer works primarily on text. Yes, if you're on the console all the time, then even 800x600 would be sufficient. If you're a web programmer, a larger screen can simulate the different screen sizes your users might have. If you're a game developer, a second screen is a must (or else your debugger can be blocked by a game running in full-screen mode, which is the last thing you'd want), or even better, a second PC for remote debugging.

    I think, in most cases, multi-monitor is more useful than one large monitor. But then, this is a personal preference.
  • ddriver - Friday, January 8, 2016 - link

    Code is text; 99.999% of all programming is still textual, whether or not you are using an IDE. Interaction with the IDE is an insignificant part of programming. For most of the things a regular person would use the mouse for, programmers use shortcuts: navigating between source files, moving the cursor, selecting text, running the project, and so on.

    But you missed the point - there is nothing in programming that calls for "fast mouse feedback", and that response time doesn't really depend on the monitor's framerate at all; it depends on the monitor's input lag. The point was that the guy was spewing ridiculous nonsense, clearly clueless about what he was talking about.
  • thetuna - Friday, January 8, 2016 - link

    I do web dev. 30Hz feels like working through molasses, but it is not a detriment to productivity. What is a huge boon to productivity is resolution... my current ideal is a single 3440x1440 ~32".
    Why my work will not spend $600 to greatly boost my productivity, I have no fucking idea.
  • gochichi - Thursday, April 14, 2016 - link

    I try to spend company cash on monitors. It's culturally discouraged. We're in the minority: a $3K laptop that expires in 2-3 years... sure, no problem. An awesome $500-$900 monitor that will be amazing for 3 years, great for another 3 years, and then very good for another 4 years... I must be crazy; why don't I just use a $100 monitor? (This is both management and end users.)
  • twtech - Wednesday, September 5, 2018 - link

    That's sort of true. Working in Visual Studio, I actually use the mouse quite a bit. When just writing code, I might be able to live with 30 Hz - the blur on everything when scrolling, etc., wouldn't be great, but I could probably live with it if I had to.

    Fortunately I don't have to. My home setup has aged a bit - 3x Dell 30" Ultrasharp monitors, one of which was purchased in 2007, and the other two from 2011.

    But I don't just code. I play games and do a variety of other things with my machine as well.
  • Zefeh - Friday, January 8, 2016 - link

    You say it is stupid to upgrade, but if your only complaint is that it's a fad and a gimmick, you really haven't looked into the details.

    The biggest improvement you can make in a display is increasing pixel density. It impacts many different aspects, but the biggest is ideal viewing distance (IVD). Starting with current 1080p screens, the recommended IVD of a 28" and a 32" is 3.7 and 4.2 feet respectively. That is the distance where you get the best picture detail from the screen. With PC monitors, one sits MUCH closer than 4 feet from their 28" screen, and because of that we notice all the pixels and imperfections of the 1080p display. Here's where 4K comes in.

    The same size screen now has exactly double the linear pixel density. Viewing angles aside, your 28" monitor just went from 80 PPI to 160 PPI, and the ideal viewing distance is much shorter, since you can't see the pixels unless you're much closer. On some 4K monitors I've seen, you need to squint from about 2 inches away to see a pixel. On my current 24" 1080p monitor I can see a pixel from 2 feet away!

    Why would the manufacturers be making a mistake when they are progressing towards the perfect display? Have you actually SEEN these displays in person? Anyone who's seen the curved 4K displays in Best Buy can see the difference. Get your eyes checked or venture out of the house to a Best Buy, because while you don't like it, the world does. Why would you want an exact resolution like 1440p anyway? So oddly specific...

    TL;DR - higher pixel density 4K screens = things look a lot prettier a lot closer than with a 1080p screen
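
    For anyone who wants to check those numbers, a minimal Python sketch, assuming the usual rule of thumb that the ideal viewing distance is where one pixel subtends one arcminute to 20/20 vision (it lands within rounding of the 3.7 ft and 4.2 ft figures above):

        # ppi_check.py -- sanity-check the pixel-density claims above.
        # The 1-arcminute-per-pixel rule for "ideal" distance is an assumption.
        import math

        def ppi(w_px, h_px, diag_in):
            """Linear pixel density in pixels per inch."""
            return math.hypot(w_px, h_px) / diag_in

        def ivd_feet(ppi_val):
            """Distance (feet) at which one pixel spans 1/60 of a degree."""
            inches = 1.0 / (ppi_val * math.tan(math.radians(1.0 / 60.0)))
            return inches / 12.0

        for name, w, h, d in [("28in 1080p", 1920, 1080, 28),
                              ("32in 1080p", 1920, 1080, 32),
                              ("28in 4K",    3840, 2160, 28)]:
            p = ppi(w, h, d)
            print(f"{name}: {p:.0f} PPI, ~{ivd_feet(p):.1f} ft ideal viewing distance")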
  • wakrather - Tuesday, March 1, 2016 - link

    I agree. If you're gaming at 1080p, you're gaming like it's 8 years ago, or you have a TN-film monitor that costs 150 dollars and looks terrible for colors. 1440p would have been the best thing in televisions as well. It could have been 240Hz at 1440p in no time. 2x 1080p would be an enormous visual jump at a lot less cost and better quality.
  • gochichi - Thursday, April 14, 2016 - link

    You're being absurd. You say "if only they released this," "if only they released that," but clearly you don't buy anything. Any appealing modern laptop has a high-density display; to dock it properly to an external monitor, the external monitor should be high density as well. There are $500 27" IPS 4K displays (I'm using one right now, the P2715Q). I still have the 27" 1440p display that preceded it; it has been flawless for years. You're waiting on what exactly? A 28"... not 27"... a 1440p, not a 4K screen... and of course, the whole point: OLED. I'm with you on the OLED ($5K is more than I'll ever spend on a new display technology); I'm not with you on the waiting.

    I'm happy to say I stopped wasting my own time wishing for this or that exact spec, and instead started buying the closest thing that I like. I can honestly say that the P2715Q and P2415Q are incredible at their price points, and there's never been more variety of really appealing monitors at such incredible pricing. The 2011 27" 1440p Dell is still the most expensive monitor I've ever purchased, and yet the 4K displays are handily, I mean handily, more amazing. It used to be that you had to wait... I'll "wait" for OLED too... but I'll do so with dual 4K screens that are already amazing. For right now, my favorite productivity setup for the buck is 2x 24" 4K P2415Q. Dual 27s is just too much. For home use, entertainment or whatever, I'd go with one 24 or one 27.

    The world is 4K ready, and we're mere months away from full smooth gaming in 4K with just one $500 video card. And if you haven't bought a 4K monitor, or a $500 video card, or a high-density laptop... why should any high-end monitor cater to you? Makes no sense to me.
  • Solandri - Friday, January 8, 2016 - link

    We took a huge step back in color gamut when we switched to LCD monitors and TVs. Back in the cathode ray TV days, the color space was NTSC 1953, which is very similar to AdobeRGB. The current sRGB standard was created to accommodate the vastly inferior color reproduction of cheap LCD backlights.

    http://informationdisplay.org/portals/informationd...

    So yeah, currently these bigger color spaces are only of interest to photo/video professionals. But eventually we'll move back to a bigger color gamut like NTSC or AdobeRGB for everything. If you've ever seen images on one of these wide-gamut screens, you can't wait for us to ditch sRGB. (No, it doesn't mean colors will be oversaturated. That's just a side effect of displaying sRGB images on an AdobeRGB monitor without conversion. When you display an AdobeRGB image on an AdobeRGB monitor, the colors are natural, and the monitor can display much more vivid real-life hues than sRGB monitors can.)
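
    A small numpy sketch of that mismatch: the same fully saturated green code value lands on a visibly more saturated chromaticity when a monitor applies AdobeRGB primaries to it instead of sRGB ones. The matrices are the standard published linear-RGB-to-XYZ conversions (D65 white point):

        # gamut_demo.py -- why unconverted sRGB content looks oversaturated
        # on a wide-gamut (AdobeRGB) monitor.
        import numpy as np

        SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                                [0.2126, 0.7152, 0.0722],
                                [0.0193, 0.1192, 0.9505]])
        ADOBE_TO_XYZ = np.array([[0.5767, 0.1856, 0.1882],
                                 [0.2973, 0.6274, 0.0753],
                                 [0.0270, 0.0707, 0.9911]])

        def xy(xyz):
            """CIE xy chromaticity from XYZ."""
            return xyz[:2] / xyz.sum()

        green = np.array([0.0, 1.0, 0.0])  # fully saturated linear-RGB green
        print("green shown as sRGB :", xy(SRGB_TO_XYZ @ green))   # ~(0.30, 0.60)
        print("green shown as Adobe:", xy(ADOBE_TO_XYZ @ green))  # ~(0.21, 0.71)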
  • bigboxes - Wednesday, March 9, 2016 - link

    I bought one of the last generation Panasonic plasma televisions. It will do until OLED costs come down. I waited a long time to jump into 1080p. I think I should be fine until 4K media is readily available. Now, monitors are a different thing. I will seriously consider a 32" OLED monitor if the costs near $2k.
  • NEDM64 - Thursday, January 7, 2016 - link

    This is not for gamers
  • ImSpartacus - Thursday, January 7, 2016 - link

    I think that's the issue that he's pointing out.

    It feels like an OLED-based monitor would make a mean VRR display.
  • Byte - Friday, January 8, 2016 - link

    240hz please.
  • DanNeely - Friday, January 8, 2016 - link

    240Hz would need 2 cables even with the upcoming DP1.3 standard. With DP1.2 or HDMI2 (barely available now), you'd need 4 cables. Either that or the "minimally lossy" compression standard that is being discussed as a way to fit 8k video signals down a single cable at a price that normal consumers would be willing to pay.
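
    A back-of-the-envelope check of those cable counts, as a Python sketch; the ~5% blanking overhead and the effective post-8b/10b link rates are assumptions, and tiled displays are assumed to split into 1, 2, or 4 symmetric tiles:

        # bandwidth_check.py -- rough cable counts for 4K240 over various links.
        GBPS = 1e9
        links = {"DP1.2 (HBR2)": 17.28, "HDMI 2.0": 14.4, "DP1.3 (HBR3)": 25.92}

        def max_hz(rate_gbps, w=3840, h=2160, bpp=24, blanking=1.05):
            """Highest 4K refresh rate a single link can carry."""
            return rate_gbps * GBPS / (w * h * bpp * blanking)

        target = 240
        for name, rate in links.items():
            # Tiled displays split the panel symmetrically: try 1, 2, then 4 links.
            cables = next(n for n in (1, 2, 4, 8) if target / n <= max_hz(rate))
            print(f"{name}: max {max_hz(rate):.0f} Hz/link -> {cables} cable(s) for 4K{target}")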
  • prime2515103 - Friday, January 8, 2016 - link

    That's old school man... My old NEC PF955 could do 1920x1440 at 4:3 and that came out around 2001 I believe.
  • xenol - Friday, January 8, 2016 - link

    If you're not gaming, 4K is fine even on $300 Walmart specials.

    I mean hell, my PHONE can do hardware 4K video decoding perfectly fine.
  • Spunjji - Sunday, January 10, 2016 - link

    Why not just run your 4k monitor at the lower resolution for games? I do this now, running 1440p on a 4k 24" monitor. It looks way, way better than 1080p and in terms of sharpness it is preferable to the same native resolution on a larger screen. The high DPI negates the aliasing artifacts from upscaling very well indeed.
  • 8steve8 - Wednesday, February 3, 2016 - link

    there are plenty of 27-inch 1440p monitors.
  • ericorg - Saturday, February 20, 2016 - link

    You really fell for the 120Hz meme? Fool.
  • wakrather - Tuesday, March 1, 2016 - link

    Oh wow, yes. You are my hero. I hate these 4K wide-gamut 30" things. So useless for home use. Pretty and nice, sure, but not really practical. I would say 27" personally, but I would take a 28" 1440p just as gladly. The AMD 400 series coming out this "spring" will maybe fill a 4K screen well enough, but it will be "flagship" and just another impractical financial investment. Yes, all monitors should be 1440p for the next 2 years. This 4K crap is just eye-roll city for monitor headlines.
  • robocow - Wednesday, March 2, 2016 - link

    Hear, hear!
  • Sunrise089 - Thursday, January 7, 2016 - link

    Sadly, another non-16:10 high-end display. The desktop display market seems to lag a few years behind mobile. I was starting to think I'd never get another 16:10 display, but there's plenty of aspect ratio variety in the tablet and notebook world; I just wish some of it moved back into the 27"+ desktop space.
  • NEDM64 - Thursday, January 7, 2016 - link

    At 30", I doubt you'll miss the taller screen.
  • Sttm - Thursday, January 7, 2016 - link

    The idea would be that you could fit the full 4K image that you are working on alongside the application's toolbars on the same display. Though personally I am not a huge fan of 16:10; they might as well have made it 5K for image work.
  • nagi603 - Thursday, January 7, 2016 - link

    Or... you know, use a 5K display, a resolution designed specifically to allow for processing 4K material.
  • Spectrophobic - Thursday, January 7, 2016 - link

    Umm... 2160 vertical pixels is still a lot.
    I'll gladly take it over 2560 x 1600.

    Or, you know, turn it to portrait mode.
  • nilepez - Thursday, January 7, 2016 - link

    For photo editing, most of my tools are on the sides, not the top/bottom, so I'd still have to hide all those widgets to see the full image. Obviously more resolution is better, but this really isn't a deal breaker for me.
  • Impulses - Friday, January 8, 2016 - link

    You're editing 16:9 photos?
  • mr_tawan - Friday, January 8, 2016 - link

    You can rotate the monitor, and suddenly it'd appear tall enough for you :-).
  • RdVi - Thursday, January 7, 2016 - link

    When you think what the first 4K LCD monitor from Dell cost, this is encouraging. If in two years they can have a 27" 1440p model for <$2000, I'd be tempted - that is, if they support adaptive sync. I know these are "professional" monitors, but that's what I've always found suits my needs best. If I game as well, can't I have the best of both? Unlike their gaming models that use G-Sync, if they're already using a top-quality scaler (which they should be), adaptive sync should be close enough to free.
  • id4andrei - Friday, January 8, 2016 - link

    OLED has a naturally high refresh rate and CRT-like response times. OLED actually does not need G-Sync or FreeSync. It's a natural fit for gaming monitors.
  • RdVi - Friday, January 8, 2016 - link

    It still needs adaptive sync for when frame rates cannot meet the max refresh, which right now, for a 120Hz 4K display, is pretty much always with new games. Furthermore, while less noticeable at higher refresh rates, tearing would also be a problem without adaptive sync.
  • nevcairiel - Friday, January 8, 2016 - link

    It still needs them, because otherwise the GPU is limited to sending a fixed frame rate. Both GSync and FreeSync just enable sending variable frame rate to the display.

    Now OLED might be much more suitable for such displays, surely, but it still needs either of these approaches to actually enable sending variable frame rate content.
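
    A toy model of the point, as a Python sketch (not how any actual driver schedules frames): with vsync on a fixed 120Hz link, every frame that takes between one and two refresh intervals gets held to the 60 fps cadence, while a variable-refresh link can show it the moment it's done:

        # vrr_sketch.py -- why variable refresh matters: with vsync on a fixed
        # link, a frame that misses a refresh slot waits for the next one.
        import math, random

        REFRESH_HZ = 120
        SLOT = 1000.0 / REFRESH_HZ  # refresh interval in ms

        random.seed(1)
        render_times = [random.uniform(9.0, 14.0) for _ in range(1000)]  # ms/frame

        fixed = sum(math.ceil(t / SLOT) * SLOT for t in render_times)  # round up to slot
        vrr = sum(render_times)                                        # show when ready
        print(f"fixed 120Hz: {len(render_times) / fixed * 1000:.1f} fps effective")
        print(f"VRR        : {len(render_times) / vrr * 1000:.1f} fps effective")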
  • Gc - Thursday, January 7, 2016 - link

    "... said that yields of OLED panels had reached 80% ..."

    For what size panel? SmartWatch displays? Phone displays? Yield is a function of defects per area, so yield percentage means little without knowing display area (and maybe pixel size).
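
    To put numbers on that, a sketch using the simplest Poisson defect model, Y = exp(-D*A); the assumption that the quoted 80% applied to a 5.5-inch phone panel is purely illustrative:

        # yield_model.py -- why a bare "80% yield" figure needs a panel size.
        import math

        def area_in2(diag_in, ar=(16, 9)):
            """Panel area in square inches for a given diagonal and aspect ratio."""
            w, h = ar
            d = math.hypot(w, h)
            return (diag_in * w / d) * (diag_in * h / d)

        # Assume (illustratively) the quoted 80% yield was for a 5.5" phone panel:
        d0 = -math.log(0.80) / area_in2(5.5)  # implied defects per square inch

        for diag in (5.5, 30, 55):
            y = math.exp(-d0 * area_in2(diag))
            print(f'{diag}" panel: {y * 100:.1f}% yield at the same defect density')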
  • lilmoe - Thursday, January 7, 2016 - link

    "For what size panel?"

    I'm pretty certain he was talking about TVs, which are 48"+.
  • iwod - Thursday, January 7, 2016 - link

    Pretty much every normal size, from phone to TV. LG's WOLED, while technically, or I should say theoretically, inferior to the way Samsung does OLED, provides 90% of OLED's benefits at 30% of the cost.

    I don't think I could tell the difference from the extra color space.
  • Sancus - Thursday, January 7, 2016 - link

    Dell claims this is 120Hz, but it's not clear what connectivity options allow that refresh rate. Is this going to support DisplayPort 1.3?
  • Guspaz - Friday, January 8, 2016 - link

    They state HDMI and USB-C, which is DisplayPort. DisplayPort 1.3 can do 4K120. It'll take all four USB 3.1 lanes to do it, though, so the only other things running on that cable at the same time would be USB 2.0 and the 100W of power.
  • Sancus - Friday, January 8, 2016 - link

    Sure, that's my point. This is supposed to come out in March, but no video cards or other products support DP1.3 yet - so it's very unclear how exactly you are supposed to run this monitor at 4K@120Hz. If this is going to be a DP1.3 monitor, it would be nice to know that explicitly, and it'd also be nice to know where a DP1.3 video card is going to come from, and when.
  • DanNeely - Friday, January 8, 2016 - link

    Dual DP1.2 cabling (similar to the current situation for 5K60) would be a fallback option for 4K120. OTOH, it's possible that only DP1.3 will unlock 120Hz mode, and that this will be a case where the monitor is out before a GPU capable of driving it at max settings. How big a deal that is comes down to whether Pascal and Arctic Islands end up supporting DP1.3. Assuming they do, a few months of timing mismatch isn't a product killer.
  • Nintendo Maniac 64 - Thursday, January 7, 2016 - link

    Considering that a black pixel is an "off" pixel, wouldn't the static contrast and dynamic contrast on an OLED display be exactly the same?
  • az060693 - Thursday, January 7, 2016 - link

    I'm happy OLED is finally taking off, but this would present some interesting problems in real-world usage. OLED doesn't seem to do very well with grays and shadow detail, and burn-in is a bigger concern on desktops than TVs, with static elements like the taskbar constantly present. Pretty curious to see a review of this when it's out.
  • MTEK - Friday, January 8, 2016 - link

    Exactly. The user presence sensor seems like a hack, and the pixel shifting logic raises other questions, like, is it noticeable? I can go long hours with application toolbars just sitting there, not moving.
  • Murloc - Friday, January 8, 2016 - link

    Maybe they bank on the pixel density being high enough that it's not noticeable?
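
    For reference, pixel shifting is usually just a slow "orbit" of the whole image by a pixel or two. A toy numpy sketch (real panels render slightly oversize and crop rather than wrap, and the 3-minute period here is an assumption):

        # pixel_orbit.py -- the kind of pixel-shifting ("orbiting") logic that
        # OLED burn-in mitigation typically uses: every few minutes, translate
        # the image by a pixel so static UI doesn't always drive the same
        # subpixels. np.roll's wraparound is a simplification.
        import numpy as np

        ORBIT = [(0, 0), (1, 0), (1, 1), (0, 1)]  # one-pixel clockwise offsets

        def orbit(frame, minutes_on, period_min=3):
            dy, dx = ORBIT[(minutes_on // period_min) % len(ORBIT)]
            return np.roll(frame, shift=(dy, dx), axis=(0, 1))

        frame = np.zeros((2160, 3840), dtype=np.uint8)
        frame[2100:2140, :] = 255  # a static 40px "taskbar" near the bottom

        # Over 12 minutes the taskbar's top edge alternates between two rows,
        # halving the duty cycle of the worst-hit row:
        for m in (0, 3, 6, 9):
            top = np.flatnonzero(orbit(frame, m)[:, 0]).min()
            print(f"minute {m}: taskbar top edge at row {top}")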
  • althaz - Thursday, January 7, 2016 - link

    I'm throwing my money at the screen but nothing is happening.
  • LordanSS - Friday, January 8, 2016 - link

    I sure hope to be able to own one of these eventually.

    My first experience with OLEDs was when I bought my Vita... that screen blew my mind at the time.
  • imaheadcase - Friday, January 8, 2016 - link

    Honestly, $5K is cheaper than I thought it was going to be. That is within range for lots of people; with Dell credit you can pay it off in a year.

    The whole burn-in thing is what's scary, though. As long as it has a great warranty against it, I would actually buy one March 31.
  • Laxaa - Friday, January 8, 2016 - link

    As a content creator, I want this display.
  • Pinko - Friday, January 8, 2016 - link

    The monitor is fantastic, but where are the OSes that can support it? Not to mention the programs... the computing world is still 8 bits per channel...
  • Solidstate89 - Friday, January 8, 2016 - link

    What OS doesn't support 10bpc?
  • Pinko - Friday, January 8, 2016 - link

    OS X support for 10bpc is in its early stages, by way of Metal... Windows has 10bpc, but which apps are using it?
  • Guspaz - Friday, January 8, 2016 - link

    Content creation apps are. Photoshop, Lightroom, Premiere, etc.
  • Pinko - Sunday, January 10, 2016 - link

    Definitely not Lightroom; it does NOT work in 10bpc. PS works in 10bpc only for some actions/tools. I don't know Premiere deeply, because I don't use it. Anyway, not in OS X; this is Windows only, and only with specific Quadro cards. As I said in another post... it's not mainstream yet.
  • mr_tawan - Friday, January 8, 2016 - link

    DirectX supports up to 16 bits per channel, or 32-bit floating point per channel, for texture formats. OpenGL even supports 32 bits per channel.

    It's up to the application developers to use it then :-).
  • Pinko - Friday, January 8, 2016 - link

    That's what I meant. DirectX supports 10bpc, but no application is using it. Same for OpenGL. Developers don't use them as an application dev platform. It's been an epic fail. Of course, I'm NOT talking about the gaming world.

    And most GPU drivers don't support more than 8bpc either. Only some professional cards really do 10bpc, e.g. Quadro from NVIDIA. But then professional applications do NOT render images in 10bpc.

    NVIDIA GeForce graphics cards have offered 10-bit per color output to a full-screen DirectX surface, but due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is NOT used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers, which require an NVIDIA Quadro GPU with a DisplayPort connector. Even then, coverage of 10bpc output is still limited.

    A 30-bit workflow is far from mainstream at the moment.
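
    The practical stake in all this is banding in smooth gradients. A minimal sketch of how many distinct levels an 8-bit versus a 10-bit pipeline can place across a one-stop luminance ramp:

        # banding_demo.py -- quantize a smooth ramp to 8-bit and 10-bit and
        # count the distinct output levels across a 4K-wide gradient.
        import numpy as np

        ramp = np.linspace(0.25, 0.50, 3840)  # one photographic stop, 4K wide
        for bits in (8, 10):
            levels = (1 << bits) - 1
            q = np.round(ramp * levels)
            steps = len(np.unique(q))
            print(f"{bits}-bit: {steps} distinct levels across the ramp "
                  f"(one band every ~{3840 // steps} pixels)")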
  • Nintendo Maniac 64 - Friday, January 8, 2016 - link

    Ehem:
    https://www.youtube.com/watch?v=X-MXyvD3iRA
  • Pinko - Sunday, January 10, 2016 - link

    Ehem! That's only the driver... and the apps??
  • DesktopMan - Friday, January 8, 2016 - link

    "an unknown refresh rate (yet, it should be very high)", depends, might not even be enough bandwidth for 120Hz depending on what signalling standard it uses.
  • Shadowmaster625 - Friday, January 8, 2016 - link

    What is this, 2013? For $5K this should be an 8K panel.
  • Hulk - Friday, January 8, 2016 - link

    With the burn-in issue of OLED and the really high price, I'm wondering if LCD with a large-array backlight to increase contrast ratio might be a very competitive technology well into the future. The technology is proven and has become very cheap. OLED has been "just over the horizon" for quite a few years now.
  • Magichands8 - Friday, January 8, 2016 - link

    This looks like a very nice monitor, but this has already been done before by Flanders and Sony. Still, it's nice to see OLED becoming more common at competitive prices. That said, I'd probably go with an Eizo CG318-4K if I had to choose between the two :) The burn-in issues with OLED scare me too. I would be willing to pay this much for this caliber of monitor, but not every two or three years due to burn-in.
  • bill44 - Friday, January 8, 2016 - link

    Wake me up when it's half price, supports HDR, and covers 100% of P3. Video editors will need this. LG's 2016 OLED TVs will have 99% P3 (not tested yet), and HDR is here to stay. There will be PowerDVD and other Ultra HD BD playback software with HDR support sometime in 2016.
    And if it supports 120Hz, does that mean stereoscopic 3D?
  • Wolfpup - Friday, January 8, 2016 - link

    I like Dell's monitors, but this needs way more ports, and I'm pretty iffy on OLED, period. People praising OLED usually seem clueless that LCD isn't one type of panel but bajillions of possible qualities of panel, many of which are extraordinarily good... and they have virtually no downsides and last effectively forever.

    Even within a type of tech, quality ranges massively. TN, thought of as the worst (modern) type of LCD panel, can vary from really bad to very high quality. I switched from an MVA/PVA panel years ago to a really good TN panel, and while I'd love something better still, it's got an awesome selection of ports that I use daily, and that's hard to find. (It's a Dell too, as was my previous one.)
  • poohbear - Friday, January 8, 2016 - link

    Make one 1440p (or heck, even 1080p) with 120Hz+ and I'll spend up to $1500 on it! 4K simply isn't feasible for most gamers... and if I were to choose between OLED and 4K, I'd DEFINITELY prefer the OLED, no questions asked.
  • RaistlinZ - Friday, January 8, 2016 - link

    Do want.
  • Tobarus - Saturday, January 9, 2016 - link

    I, for one, will definitely be buying this. The only thing that comes close is Eizo's true 4K display (not UHD), and that costs even more! Yes, OLED does have some serious drawbacks, but the specs alone far and away blow any "competition" out of the exosphere. This monitor will most definitely not be sitting in a dorm room somewhere (the only "clientele" complaining).

    The only drawback I see in this monitor for myself is the timing, as in the exact same time frame I bought the Nikon D5! As for all the other "drawbacks", simple preventative measures will ensure years of use from this.

    Finally, a company is stepping up to the plate and taking a risk by offering something truly innovative and game-changing. I don't think I'll miss my NEC, even though this new monitor is double the price. A bit of a cliche, but it reminds me of when the first iPhone came out.

    Well done Dell (and LG)!
  • Sivar - Monday, January 11, 2016 - link

    Other disadvantages of OLED vs. LCD:

    - OLED typically cannot reach the brightness levels of an LCD screen, and the higher an OLED's brightness setting, the more quickly it degrades.

    - OLED's colors deteriorate at different rates. Blue typically deteriorates roughly 3x faster than green, with red somewhere in the middle. This can be somewhat mitigated with software that increases each color's intensity relative to its expected decay rate (sketched below), but the increased brightness will only kill the display faster.
    I suspect some companies will experiment with having 2 blue subpixels for each red/green, though this can have its own problems.

    Still, these problems will be resolved like any others, and OLEDs will likely, finally, overtake LCDs as the display of choice. Their fantastic contrast ratio and color accuracy are far ahead of anything LCDs can do, even in the most experimental of lab settings.
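
    A sketch of the per-channel compensation idea from the second bullet; the half-life numbers are made up for illustration, not panel data:

        # decay_comp.py -- drive each color harder as it ages so the white
        # point holds. Decay constants are illustrative assumptions only.
        import math

        HALF_LIFE_HRS = {"red": 30000, "green": 45000, "blue": 15000}  # assumed

        def gain(channel, hours):
            """Drive multiplier that cancels exponential luminance decay."""
            decay = 0.5 ** (hours / HALF_LIFE_HRS[channel])
            return 1.0 / decay  # burns the channel faster, as noted above

        for ch in ("red", "green", "blue"):
            print(f"{ch}: x{gain(ch, 10000):.2f} drive after 10,000 hours")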
