Sorry, I meant to say "Neither Monoprice nor HP". I was thinking of some companies that make panels (LG, AUO, Samsung, etc.), which is probably why I accidentally typed LG.
The quality is most certainly determined by who puts the monitor together. The monitor is not just a panel, the electronics driving it are just as important.
Needs 4K at this size. 2560x1440 at 32 inches is just 92 PPI which is quite low, even when accounting for a larger viewing distance versus a 27 inch monitor.
This looks pretty nice. I've always liked VA, but am a gigantic fan now that I've seen how TERRIBLE IPS is on a TV. I didn't realize how massively better VA's contrast is than IPS. I guess it's not something you notice on a phone or something, but on a TV (and I'm sure it would apply to monitors as well)...night and day. Try watching Lucifer on an IPS TV :-O On VA there are piles of color and contrast in all those dark scenes; on IPS it's just a smeary black mess.
This new monitor seems similar to the BenQ BL3200PT I'm currently looking at (the BenQ is slightly more expensive but has a USB3 hub, a card reader, a set of speakers, VGA/DVI/HDMI/DP inputs and a few additional features).
The pixel pitch is basically the same as that of 1920x1200 24" monitors... good enough for some of the latest Eizo CG series at over £1000. Assuming this HP product is as good as the BenQ, it is a valid product in today's world.
Yup, this is the one regret I have about purchasing my LG instead of a Vizio. Colors look great in lighter images, but in dark ones the contrast goes out the window, and holy hell did somebody say backlight bleed?
I can't wait until HDR TVs are available for around $600; the LG will be demoted to secondary TV and the HDR set will become the main. Hopefully a VA panel with local dimming and P3 colors!
I would luv a 27-30" 2560x1600 16:10 screen much better than the 16:9 format. Well, not much, but it is nice having that little bit of extra height in the screen for sure. My projector is 16:10 & I can assure you the extra pixel height is much better for gaming. I really notice 16:9 makes it seem like a squished view when I watch a TV show that is in 16:9 format. So if they could come out with a 16:10 panel with current-gen screen tech & a handy price like this HP unit, I would be all over it for sure.
Whatever happened to all of those fancy touchscreen desktop monitors that Windows 8 (and now Windows 10) was supposed to inspire, with elaborate desktop stands that allowed them to pull down flat when needed? Aka the stand-alone monitor version of the Lenovo A720 Desktop all-in-one.
Because they realized that even if it's shown in sci-fi movies, touchscreens on big screens are annoying when it's more comfortable to use a mouse/keyboard type of physical input.
Not necessarily. I still have and use my Dell ST2220T and use the touch functionality every now and then. It's perfectly comfortable as a complement to my mouse and keyboard.
Still rocking my Dell 3007WFP from 10 years ago. Best $1100 I have ever spent on computers. Bridged many PC upgrade cycles. No internal display conversion hardware and thus no input lag.
Would love to get a 4K equivalent. Feels like the market is getting there. But this HP isn't it.
I've got a similar era NEC 3090. Still looks great, and my biggest future-proofing concern is that the lack of a DisplayPort input might bite me for $70ish in the future (the cheaper adapters top out at 1920x1200). It's possible I might end up with an ~32" 4k display at some point in the future; but what I'm really lusting after is the 31.5" 5k panel LG's been working on for the last year and change (almost as tall as my 1600p panel, and at 186dpi 2:1 fallback scaling is more reasonable). It initially leaked in January with an expected release at the end of last year; it's slipped since then, with TFT Central's panel database now calling for availability in the middle of this year. OTOH, until this year's GPUs launch, gaming at 5k is more a theoretical than a practical option.
It's good that 2560x1440 is becoming more prevalent after so many years of stagnation at 1920x1080, but really 4K should be the standard at anything above 27".
my crt did 1920x1440 15 years ago. resolutions went down with lcd, then went down further from 1920x1200 to 1920x1080 etc. then someone had the bright idea that lots of low end users should use 1366x768.
the biggest caveat i see is that it's only 60hz. the cheap korean panels do 96hz easy, and once you go back to higher refresh rates you realise how frustrating 60hz was. (used to use 85 hertz on crt)
I'm more interested in 21:9 1440p IPS displays... and affordable ones... don't care about curved or not as long as it doesn't have a skyscraper price tag... I hope that when these come out, HDMI 2.0 and DP 1.3/1.4 will be the standard by then...
74 Comments
Murloc - Wednesday, May 4, 2016 - link
isn't this just too big for the average viewing distance of common mortals?
Yuriman - Wednesday, May 4, 2016 - link
Probably, but it's a very tempting screen for me nonetheless. I feel 27" is the sweet spot for my personal uses, but a slightly larger screen would not be a problem.
javishd - Wednesday, May 4, 2016 - link
I have a little bit of a hard time with 110ppi on the 27" monitors, my vision is about 20/35. This screen might need to be scooted back a little. I would rather 30" was the norm for 1440p.
kpxgq - Sunday, August 28, 2016 - link
Completely opposite here... I am using a 27" 1440p right now and wishing for a 4k 25-28". I can definitely see subpixels from 2 feet away.
DanNeely - Wednesday, May 4, 2016 - link
It's 92 DPI vs 110 for a 27" 1440p monitor, or the 100 DPI that's generally used as the baseline for comparison at 1x. That makes it about 4" wider than the more common 27" variant and 2.5" taller. A 30" 1600p monitor is about the same height and midway between the two in width. I've owned one of the latter for several years and have been quite happy with it over that timespan. I keep it at about the same distance as the 22" screens I use at work. Most non-image/map web pages only fill the middle half while showing about 50% more vertically; but for anything that can scale to full width it works wonderfully. I don't think the slightly larger width would be a problem; and the lower DPI would be a bit easier on someone with less than perfect vision or who just prefers larger text.
BurntMyBacon - Wednesday, May 4, 2016 - link
@DanNeely: " That makes it about 4" wider than the more common 27" variant and 2.5" taller. a 30" 1600p monitor is about the same height and midway between the two in width."What is this 1600p you are referring to.
720p -> 1280x720
1080p -> 1920x1080
1440p -> 2560x1440
2160p -> 3840x2160 (Commonly referred to as 4K)
These whole "p" nomenclature refers to the vertical resolution of 16:9 displays. 1600p would correspond to a resolution of 2844.44x1600. I've never heard of that and wikipedia doesn't have an entry like that in their display resolution page either, so chances are fair that it isn't available for general consumption. Perhaps you were referring to the (in my opinion superior) 16:10 resolution 2560x1600. A quick wag at the calculations say that would fit your dimension description. In which case, nice find. Its hard to find a good, reasonably priced 16x10 monitor anymore (especially if you want "sync" technologies or high refresh rates). Even the newer 10bit 27" professional monitors from NEC have gone 1440p.
ann_idiot - Wednesday, May 4, 2016 - link
Typical 30" monitor is 2560X1600. That's where the 1600P is coming from.rascalion - Wednesday, May 4, 2016 - link
2560x1600 is 16:10, not 16:9. 16:9 is the ratio used with the 1080p, 1440p , etc. resolutions.Murloc - Wednesday, May 4, 2016 - link
nope, what about 480p or 360p? Go to YouTube and click on the settings menu and they're all there to see.
Mondozai - Thursday, May 5, 2016 - link
That's because those are also 16:9 resolutions, dummy. All of YouTube is 16:9.
JoeyJoJo123 - Thursday, May 5, 2016 - link
@Mondozai
>I grew up in an age of YouTube. Therefore, all my firsthand learned knowledge comes from YouTube and I am always correct and win all internet arguments.
althaz - Thursday, May 5, 2016 - link
He's also correct. Before YouTube was 16:9 (which it is now, videos that aren't look very out of place), they didn't have 240p/360p/480p (all of which are 16:9 resolutions, btw, though the definitions for them could vary) as options - their first quality options didn't directly mention resolution at all.
Factory Factory - Friday, May 6, 2016 - link
The "p" in 1080p &c come from digital video production standards. 720p is 720 vertical lines, progressive scan. 1080i is 1080 vertical lines, interlaced scan. The full standard also defines frame or field rate (frames for progressive, fields for interlaced), e.g. 1080p30 or 720i50.As Official Standards From TV People it's just 480, 576, 720, 1080, plus i or p and a frame/field-rate. As a matter of trademark/marketing, only 1080p (a.k.a. FHD/Full HD) and 720p (HD) ever gained significant foothold in terms of mindshare. But it's trivial to extend this naming scheme to other resolutions, and virtually all computer images will be p-for-progressive. So 1440p and 1600p "work." They're just not rigidly defined.
Plus, of course, vertical-progressive shorthand leaves out differing aspect ratios with the same vertical measurements, like you could describe 3440x1440 ultrawide monitors as "1440p". The reason why 1080p is generally unambiguous to mean 1920x1080 rather than, say, 2560x1080, is because 1080p is a defined and trademarked thing.
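The naming scheme described above (vertical line count, i or p for scan type, optional frame/field rate) is regular enough to parse mechanically. A small Python sketch, assuming well-formed shorthand strings (the function name is mine):

```python
import re

# Parse shorthands like "1080p", "720i50", "1080p30" into their parts:
# vertical line count, scan type ('p' progressive / 'i' interlaced),
# and an optional frame/field rate.
def parse_mode(mode):
    m = re.fullmatch(r"(\d+)([ip])(\d+)?", mode)
    if m is None:
        raise ValueError(f"not a recognized mode string: {mode!r}")
    lines, scan, rate = m.groups()
    return int(lines), scan, int(rate) if rate else None

print(parse_mode("1080p30"))  # (1080, 'p', 30)
print(parse_mode("1440p"))    # (1440, 'p', None)
```

Note what the parse does not tell you: the aspect ratio, which is exactly the ambiguity the comment points out.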
Samus - Friday, May 6, 2016 - link
I appreciate his passion for 16:10 but these monitors are consumer class displays. 16:10 is reserved for professional and some prosumer displays. It's almost impossible to find a modern 16:10 display for under $1000, let alone something in the 30" class, which would cost you $2000. These cost a quarter that much.
Margalus - Saturday, May 7, 2016 - link
I see a lot of 4:3 videos on youtube, along with other strange aspects. It is definitely not all 16:9.
tamalero - Monday, May 9, 2016 - link
I'd say different standards, like those old VESA and VGA standards vs the new ones.
Mr Perfect - Thursday, May 5, 2016 - link
The P just means it's progressive scan, rather than I for interlaced scan. They don't have any bearing on aspect ratio.
IMO it's actually kind of silly to append the P on everything, you'd be hard pressed to find an interlaced desktop monitor.
JoeyJoJo123 - Thursday, May 5, 2016 - link
It's not just the desktop monitor, but the video output devices as well. Some devices can do 720p or 1080i, for example, and while desktop monitors will perform deinterlacing, it does add additional processing time for the monitor (meaning lag).
The interlaced and progressive initials still mean something in certain applications. It's not as important to consumers as it was 10 years ago, but still needed in an age where not all devices output or display progressive scan for every resolution they support.
TheoLu - Friday, May 6, 2016 - link
The 'p' is simply from 'progressive(-scan)' and isn't specific to 16:9 resolutions. Back in the early 2000s, when graphics cards with video capture capability were all the rage and 16:9 on computers was still practically unheard of, the capture resolutions were generally still only listed, if in a short version, by their vertical resolution.
They're not wrong in listing '1600p'.
https://en.m.wikipedia.org/wiki/480p
DanNeely - Wednesday, May 4, 2016 - link
1600p was used as shorthand for 2560x1600 displays before anyone started selling 2560x1440 ones. Thank you, try again.
close - Thursday, May 5, 2016 - link
Don't forget that so many internet commentators were born right about the time 16:10 was dying. 4:3 is the stuff of legends and 16:10 was maybe a short-lived fluke.
This being said, I miss my old HP LP3065 - power hungry, hot like an oven, but at least it gave me some vertical real estate. The bigger screens get, the more I hate 16:9. And seeing that 21:9 is becoming more and more popular gets me really depressed :).
The actual screen surface gets smaller at the same diagonal the higher the aspect ratio.
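That claim checks out with a little trigonometry. A quick Python sketch (helper name mine) comparing the panel area of a 27" diagonal at three aspect ratios:

```python
import math

# Panel area (square inches) from diagonal size and aspect ratio:
# the diagonal splits proportionally to the ratio's hypotenuse.
def panel_area(diagonal, ratio_w, ratio_h):
    d = math.hypot(ratio_w, ratio_h)
    width = diagonal * ratio_w / d
    height = diagonal * ratio_h / d
    return width * height

for rw, rh in [(4, 3), (16, 10), (16, 9)]:
    print(f"{rw}:{rh} -> {panel_area(27, rw, rh):.1f} sq in")
# 4:3 -> 349.9, 16:10 -> 327.6, 16:9 -> 311.5: wider ratios give less area
```

So at the same marketing diagonal, a 16:9 panel really is the smallest of the three.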
Samus - Friday, May 6, 2016 - link
LOL, at first read you made me think of 6 year olds blogging, because they would have been born in 2010, around the time 16:10 started exiting the consumer market and was relegated to the professional sector.
A decade ago, most widescreen monitors from Dell, HP, NEC and Samsung were 16:10, including some laptops. They were often 1440x900 or 1680x1050; only the high end or 24"+ models were 1920x1200. Great aspect ratio, but obviously more expensive to produce.
close - Friday, May 6, 2016 - link
Well judging by the amazement at the "1600p" I guess they could have been 6 at the time :).
bug77 - Friday, May 6, 2016 - link
Hear, hear!
Friendly0Fire - Wednesday, May 4, 2016 - link
What? The "p" denominator is actually a holdover from back when displays were progressive (p) or interlaced (i). Cable still tends to be 1080i, for example. It is not specific to 16:9 aspect ratios, as exemplified by 480p, which is the shorthand for 640x480, at 4:3 ratio.1600p isn't 16:9, but that doesn't make it an invalid or ambiguous term.
abhaxus - Thursday, May 5, 2016 - link
480p is usually 720x480. The whole p business is kinda dumb.
JoeyJoJo123 - Thursday, May 5, 2016 - link
No, it's not. 480p is usually 640x480, as it was most commonly in use in the late 90's and early turn of the century with game consoles such as the GameCube, PS2, Xbox 360, DVD format movies, etc.
mdriftmeyer - Friday, May 6, 2016 - link
Correct. It was when the de facto monitor standard was 1024x768 and DOOM came with games running at 640x480.
djboxbaba - Wednesday, May 4, 2016 - link
are you high?
MmmCake - Wednesday, May 4, 2016 - link
The "P" refers to the fact that it is a progressive signal as opposed to an interlaced one.JoeyJoJo123 - Thursday, May 5, 2016 - link
>These whole "p" nomenclature refers to the vertical resolution of 16:9 displaysThat's where you're wrong, millennial.
Back before you were born, there were these things called "CRT Televisions", or Cathode Ray Tube Televisions. For consumers, these typically supported 240p and 480i resolutions, or 320x240 resolution at ~60hz, progressive scan, or 640x480 resolution at ~60hz, interlaced scan, respectively.
The "p" signifies that it's progressive scan, meaning each refresh cycle the entire frame is redrawn (as opposed to half the screen, with alternating horizontal lines, during interlaced scan) and nothing more. The number before the "p" discloses the verticle frame resolution. 1600p = 2560x1600 progressive scan.
mdriftmeyer - Friday, May 6, 2016 - link
Also spot on.
euskalzabe - Thursday, May 12, 2016 - link
Well said. Also, I'm kind of amazed that we're even having this P/I and resolution conversation. Why don't people google stuff and learn from the vast internet instead of ignorantly saying things in forums?
That said, I kinda miss my 17" Syncmaster. Only 768p (that's 1024x768 for the young kids, not 1280x768) but even at just 75hz it looked glorious.
erple2 - Friday, May 6, 2016 - link
"p" used to refer to progressive, which was a time when there was also "i", or interlaced. Since we mostly all use LCD panels these days (other than a few extremely expensive (high end) or extremely cheap (ultra low end)), the "p" doesn't really mean anything anymore.StormyParis - Wednesday, May 4, 2016 - link
Indeed. I'm moved to several screens instead of a larger one. Depends on your work I guess, but for me, it's both cheaper and more productive.nathanddrews - Wednesday, May 4, 2016 - link
No way. For old people and people with bad eyesight, this is a great display. My dad uses a 32" 1080p VA TV as a monitor, 1) because it was $150 on sale, and 2) he can actually see the screen. In truth, it looks decent at normal distance. Pixelated, but nice.IMO, bigger is usually better, even if pixel density isn't at the ordained, appropriate arc-minute proximity. The human brain can endure quite a bit when it is being wowed by content filling the scope of vision.
That said, I would spring for the FreeSync version if I were in the market.
Murloc - Wednesday, May 4, 2016 - link
old people, okay, but I aim at not being able to see the pixels.
Arnulf - Thursday, May 5, 2016 - link
Don't worry, nature will take care of that for you sooner or later.
Agent Smith - Thursday, May 5, 2016 - link
Determine Viewing Distance
First, get a high quality picture from the web, PNG ideally, and preferably one with high contrast and sharp edges. Now open up GIMP (or any other image manipulation program), reduce the resolution by half on each axis (i.e. turn a 1920*1080 picture into a 960*540 picture), and save that picture as a separate copy.
Now open both copies and place each on one half of the screen. Set the zoom factor on the smaller picture to 200% so that both are exactly the same size, and try to align the exact same section of the picture on both.
Now step back a considerable distance, so you can't make out any difference between the 2 pictures whatsoever, and slowly walk towards the monitor. Stop as soon as you notice a difference between the pictures (you can also make it a "blind test" by having a friend realign the pictures when you're not looking... no fooling yourself that way). Now measure the distance from your eyes to the screen.
Divide that distance by two (since you enlarged the pixelated picture by two thus dividing its DPI by 2) and you've got your optimal viewing distance for that screen resolution and screen size.
If you did that for a 1080p screen and want to know the result for a hypothetical UHD screen of the same size, just divide the distance by 2 another time (since vertical and horizontal resolution increase twofold with UHD).
You can also extrapolate linearly to larger screen sizes.
So if you had a distance of say 5m for a 30" screen with FHD, divide that by 2 -> 2.5m. Now extrapolate it to e.g. 55" -> 2.5 * 55/30 = 4.6m. Now divide by 2 for UHD -> 2.3m.
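The halving-and-scaling arithmetic above fits in a small helper. A Python sketch (function and parameter names are my own) that reproduces the worked example:

```python
# Optimal viewing distance per the "half-resolution picture" test described above:
# halve the measured threshold distance (the test image had 2x-size pixels),
# scale linearly with diagonal, and divide again for each doubling of resolution.
def optimal_distance(threshold_m, diag_in, new_diag_in=None, res_factor=1.0):
    distance = threshold_m / 2                 # undo the 2x pixel enlargement
    if new_diag_in is not None:
        distance *= new_diag_in / diag_in      # extrapolate to a larger screen
    return distance / res_factor               # e.g. 2.0 for FHD -> UHD

print(optimal_distance(5.0, 30))                          # 2.5 (30" FHD)
print(round(optimal_distance(5.0, 30, new_diag_in=55), 1))  # 4.6 (55" FHD)
print(round(optimal_distance(5.0, 30, 55, 2.0), 1))         # 2.3 (55" UHD)
```

The numbers match the 5m -> 2.5m -> 4.6m -> 2.3m chain in the comment.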
Samus - Friday, May 6, 2016 - link
I've always liked this method but on larger "4K" displays, it has me sitting too close and causes my head to move too much. The purpose of the "divide dpi" method is to set you back just far enough so your eyes do the moving, not your neck muscles.
Basically my rule of thumb is: follow the method and add the length of your chair seat to the distance as a buffer.
ingwe - Wednesday, May 4, 2016 - link
ingwe - Wednesday, May 4, 2016 - link
I stupidly read this as UHD instead of QHD and almost started for my wallet. Anyway at QHD this is just too large for my desires.Flunk - Wednesday, May 4, 2016 - link
I agree, if it ain't 4K I ain't interested.BrokenCrayons - Wednesday, May 4, 2016 - link
Until GPU technology advances a little more, I'd rather not drive all the pixels in a 4K screen. There's nothing wrong with doing so, but 1440 is probably okay for a lot of people out there and the price certainly has a lot of appeal.Flunk - Wednesday, May 4, 2016 - link
I want it for desktop uses, "GPU Technology" has been able to drive 4K at the desktop for years. I have a 15" laptop with a 4K screen already and it has no problems drawing my Visual Studio window. There is little constructive use for a huge screen like this with such a low resolution.zodiacfml - Wednesday, May 4, 2016 - link
Funny. Same here too. FreeSync should be included.
CoreyWat - Wednesday, May 4, 2016 - link
PPI: 27in at 2560 × 1440 is 108ppi; 32in is 91ppi (not a huge difference)
BurntMyBacon - Wednesday, May 4, 2016 - link
@CoreyWat: "PPI 27in at 2560 × 1440 108ppi, 32in 91ppi (not a huge difference)"
It is almost exactly the same pixel density you get from a 1080p 24" monitor. I consider this difference pretty notable and easily differentiable. However, that doesn't make it bad. It is certainly workable.
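For anyone checking these figures, pixel density is just the diagonal pixel count divided by the diagonal size in inches. A quick Python sketch (not part of the original comment) confirms the numbers being thrown around, including the 24" 1080p comparison:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count over diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 27), 1))  # 108.8 -- the "108ppi" above
print(round(ppi(2560, 1440, 32), 1))  # 91.8  -- the "91ppi" above
print(round(ppi(1920, 1080, 24), 1))  # 91.8  -- same density as the 32" QHD
```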
MrSpadge - Wednesday, May 4, 2016 - link
Whenever I return from my 25" QHD to my 24" 1920x1200 at work, I'm amazed at how large and clumsy everything looks. 92 ppi would definitely be too low for me.
Guspaz - Wednesday, May 4, 2016 - link
Monoprice has a 27" QHD IPS monitor for $299.99... and a 28" 4K IPS monitor for $379.99... The only thing notable about this monitor, I guess, is that it's a 32" monitor.
Icehawk - Wednesday, May 4, 2016 - link
How about that it is made by a decent vendor? Monoprice's monitors don't review all that well. Personally I am popping a chub on this: I have a QHD 27" which is awesome, but I have to use scaling; a 32" would probably mean I don't need to, and I could move my 27" to my secondary display. I game, but 60Hz hasn't been a problem for me yet, so while higher refresh would be nice it isn't a big deal to me.
Guspaz - Wednesday, May 4, 2016 - link
Neither LG nor HP manufacture the panels, so ultimately the quality of the panel isn't determined by who puts the monitor together. As an example, my Dell U2711 uses the exact same panel as the contemporaneous Apple Cinema Display.
Of course, the controller that the vendor connects up to that panel can make a rather large difference: things like G-Sync or FreeSync are purely attributes of the controller rather than the panel.
DanNeely - Wednesday, May 4, 2016 - link
LG does make some LCD panels, although I don't know if they use their own exclusively. The quality of the panel used is, however, determined by the OEM when they decide which panel family to buy from and which grade of panel from that family to use (panels are binned for consistency and the number of dead/stuck pixels just like CPUs are binned for speed). Panel quality is one of the reasons why "Korean"/Monoprice 1440p monitors are so much cheaper than Dell/HP/etc. The former use the low-bin panels normally put into digital signage.
Guspaz - Wednesday, May 4, 2016 - link
Sorry, I meant to say "Neither Monoprice nor HP". I was thinking of some companies that make panels (LG, AUO, Samsung, etc) which is probably why I accidentally typed LG.
bug77 - Friday, May 6, 2016 - link
The quality is most certainly determined by who puts the monitor together. The monitor is not just a panel; the electronics driving it are just as important.
euskalzabe - Thursday, May 12, 2016 - link
4K, $409, and not IPS... http://www.amazon.com/Monoprice-28-Inch-DisplayPor...
smilingcrow - Wednesday, May 4, 2016 - link
Shame the USB hub is only 2.0; with more devices skimping on ports, having four USB 3.0 ports in a monitor would be handy.
Impulses - Wednesday, May 4, 2016 - link
It can also cause extra interference with some wireless devices tho... nothing that an extension for those antennas/dongles can't fix, but still.
r3loaded - Wednesday, May 4, 2016 - link
Needs 4K at this size. 2560x1440 at 32 inches is just 92 PPI, which is quite low, even when accounting for a larger viewing distance versus a 27 inch monitor.
benjacob - Wednesday, May 4, 2016 - link
Can this work with a laptop?
magreen - Tuesday, May 10, 2016 - link
Yes, if the laptop has DisplayPort output.
Wolfpup - Wednesday, May 4, 2016 - link
This looks pretty nice. I've always liked VA, but am a gigantic fan now that I've seen how TERRIBLE IPS is on a TV. I didn't realize how massively better VA's contrast is than IPS's. I guess it's not something you notice on a phone or something, but on a TV (and I'm sure it would apply to monitors as well)... night and day. Try watching Lucifer on an IPS TV :-O On VA there's piles of color and contrast in all those dark colors; on IPS it's just a smeary black mess.
Eiffel - Wednesday, May 4, 2016 - link
This new monitor seems similar to the BenQ BL3200PT I'm currently looking at (the BenQ is slightly more expensive but has a USB3 hub, a card reader, a set of speakers, VGA/DVI/HDMI/DP inputs and a few additional features).
The pixel pitch is basically the same as that of 1920x1200 24" monitors... good enough for some of the latest Eizo CG series at over £1000. Assuming this HP product is as good as the BenQ, it is a valid product in today's world.
euskalzabe - Thursday, May 12, 2016 - link
Yup, this is the one regret I have about purchasing my LG instead of a Vizio. Colors look great in lighter images, but in dark ones the contrast goes out the window, and holy hell did somebody say backlight bleed?
I can't wait until HDR TVs are available for around $600; the LG will be demoted to secondary TV and the HDR set will become the main. Hopefully a VA panel with local dimming and P3 colors!
rocky12345 - Wednesday, May 4, 2016 - link
I would luv a 27-30" 2560x1600 16:10 screen much more than the 16:9 format. Well, not much, but it is nice having that little bit of extra height in the screen for sure. My projector is 16:10 & I can assure you the extra pixel height is much better for gaming. I really notice 16:9 makes it seem like a squished view when I watch a TV show that is 16:9 format. So if they could come out with a 16:10 panel with current-gen screen tech & a handy price like this HP unit I would be all over it for sure.
pixelstuff - Wednesday, May 4, 2016 - link
Whatever happened to all of those fancy touchscreen desktop monitors that Windows 8 (and now Windows 10) was supposed to inspire, with elaborate desktop stands that allowed them to pull down flat when needed? Aka the stand-alone monitor version of the Lenovo A720 all-in-one desktop.
Lolimaster - Thursday, May 5, 2016 - link
Because they realized that even if it's shown in sci-fi movies, touchscreens on big displays are annoying when it's more comfortable to use a mouse/keyboard type of physical input.
euskalzabe - Thursday, May 12, 2016 - link
Not necessarily. I still have and use my Dell ST2220T and use the touch functionality every now and then. It's perfectly comfortable as a complement to my mouse and keyboard.
bznotins - Wednesday, May 4, 2016 - link
Still rocking my Dell 3007WFP from 10 years ago. Best $1100 I have ever spent on computers. Bridged many PC upgrade cycles. No internal display conversion hardware and thus no input lag.Would love to get a 4K equivalent. Feels like the market is getting there. But this HP isn't it.
DanNeely - Thursday, May 5, 2016 - link
I've got a similar-era NEC 3090. Still looks great, and my biggest future-proofing concern is that the lack of a DisplayPort input might bite me for $70ish in the future (the cheaper adapters top out at 1920x1200). It's possible I might end up with a ~32" 4K display at some point in the future; but what I'm really lusting after is the 31.5" 5K panel LG's been working on for the last year and change (almost as tall as my 1600p panel, and at 186dpi 2:1 fallback scaling is more reasonable). It initially leaked in January with an expected release at the end of last year; it's slipped since then, with TFT Central's panel database now calling for availability in the middle of this year. OTOH, until this year's GPUs launch, gaming at 5K is more a theoretical than a practical option.
The_Assimilator - Thursday, May 5, 2016 - link
It's good that 2560x1440 is becoming more prevalent after so many years of stagnation at 1920x1080, but really 4K should be the standard at anything above 27".
mercutio - Friday, May 6, 2016 - link
My CRT did 1920x1440 15 years ago. Resolutions went down with LCD, then went down further from 1920x1200 to 1920x1080, etc. Then someone had the bright idea that lots of low-end users should use 1366x768.
mercutio - Friday, May 6, 2016 - link
The biggest caveat I see is that it's only 60Hz. The cheap Korean panels do 96Hz easy, and once you go back to higher refresh rates you realise how frustrating 60Hz was. (I used to use 85Hz on CRT.)
Xajel - Sunday, May 8, 2016 - link
I'm more interested in 21:9 1440p IPS displays... affordable ones... don't care about curved or not as long as it doesn't have a skyscraper price tag... I hope HDMI 2.0 and DP 1.3/1.4 become the standard by the time these come out...
R3MF - Saturday, May 21, 2016 - link
AMD list this monitor as freesync compliant, without LFC.