21 Comments

  • p1esk - Thursday, April 11, 2019 - link

    It should only need two DP1.4 cables, not four. Or a single HDMI 2.1 cable.
  • drgigolo - Friday, April 12, 2019 - link

    No? This is 120Hz. Double the bandwidth of 8K @ 60Hz.
  • GreenReaper - Thursday, April 11, 2019 - link

    I could manage two cables! But HDMI 2.1 with Display Stream Compression should do it in one:
    https://en.wikipedia.org/wiki/HDMI#Version_2.1

    Meanwhile DisplayPort 1.4a only supports 8K at 30-bit @ 60Hz, so you'd need two cables for the whole screen - although 60Hz isn't *that* bad for most purposes, particularly if combined with FreeSync 2.

    Whether anything can output that for anything beyond fixed-function video is another matter. But I like to think Navi might, since a derivative looks to be the basis for the next console generation.
  • repoman27 - Thursday, April 11, 2019 - link

    The total bandwidth including overhead for 7680 x 4320, 10 bpc, 120 Hz works out to 127.75 Gbit/s, which just squeaks under the HDMI 2.1 limit if you can get a full 3:1 ratio out of DSC, otherwise you're looking at chroma subsampling. There aren't currently any HDMI 2.1 FRL capable sources on the market though.

    NVIDIA RTX cards should be able to drive this display with two DisplayPort cables using both HBR3 and DSC. Meanwhile AMD GPUs with HBR3 support and 6 DisplayPort outputs but no DSC could probably manage with 6 cables.
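
    (A quick back-of-the-envelope check of those figures - a minimal Python sketch; the 3:1 DSC ratio and the 16b/18b FRL coding are taken from the comment above, and blanking overhead is ignored, which is why the active-video number lands a bit below the quoted 127.75 Gbit/s total:)

        h, v, hz, bpc = 7680, 4320, 120, 10
        active_gbps = h * v * hz * bpc * 3 / 1e9  # RGB, no chroma subsampling
        print(active_gbps)                        # ~119.44 Gbit/s before blanking

        frl_gbps = 48 * 16 / 18                   # HDMI 2.1 FRL: 48 Gbit/s raw, 16b/18b coding
        print(frl_gbps)                           # ~42.67 Gbit/s effective
        print(frl_gbps * 3)                       # ~128 Gbit/s with 3:1 DSC - just enough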
  • GreenReaper - Thursday, April 11, 2019 - link

    Maybe, if the scuttlebutt about a version of DisplayPort 2.0 that runs at 20 Gbit/s is true, they could fit it in with lighter compression:
    https://e2e.ti.com/support/interface/f/138/t/77199...

    Of course even if it is, we may not see anything supporting it 'til 2020.
  • DanNeely - Friday, April 12, 2019 - link

    They officially announced that v.next would support uncompressed 8K60 at CES. No mention of whether that was at 8 or 10 bit/color (HDR), so my cynic-sense is going to predict that it will still need DSC.

    https://www.forbes.com/sites/moorinsights/2019/02/...
  • repoman27 - Sunday, April 14, 2019 - link

    VESA stated that DP next generation would more than double DP 1.4a capabilities, and would reduce protocol overhead. So if they simply double the signaling rate to 16.2 Gbit/s per lane and switch to a more efficient encoding scheme such as 64b/66b, that would provide 62.836 Gbit/s of bandwidth for a 4-lane main link. That's enough to (just) accommodate 7680 x 4320, 10 bpc, 60 Hz without chroma subsampling or DSC.

    However, it still falls short of fully meeting Rec. 2020 without compression, which calls for 12 bpc at that resolution and frame rate. If VESA were to adopt Thunderbolt 3 signaling and cables for DP next gen (20.625 Gbit/s with 64b/66b encoding) and run all four lanes in the same direction, we'd get an 80 Gbit/s link. That would be able to handle up to 8192 x 4320, 12 bpc, 60 Hz (RED 8K Full Format) losslessly, and up to 8192 x 5120, 12 bpc, 144 Hz (16:10, high refresh rate) with DSC. So 20 Gbit/s DP makes a lot of sense.
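
    (The same arithmetic as a hedged Python sketch; the lane rates, 64b/66b encoding, lane count, and resolutions are as stated in the comment above:)

        def link_gbps(lane_rate, lanes=4, enc=64/66):
            return lane_rate * lanes * enc        # usable main-link bandwidth

        def video_gbps(h, v, hz, bpc):
            return h * v * hz * bpc * 3 / 1e9     # uncompressed RGB

        print(link_gbps(16.2))                    # ~62.84 Gbit/s (doubled HBR3, 64b/66b)
        print(video_gbps(7680, 4320, 60, 10))     # ~59.72 Gbit/s - (just) fits
        print(link_gbps(20.625))                  # 80.0 Gbit/s (Thunderbolt 3 signaling)
        print(video_gbps(8192, 4320, 60, 12))     # ~76.44 Gbit/s - RED 8K Full Format fits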
  • Hxx - Thursday, April 11, 2019 - link

    Nice. My next monitor in 2025, when they'll actually be affordable.
  • c4v3man - Thursday, April 11, 2019 - link

    The technology landscape is littered with designs that seem excessive at first, but eventually become outdated and underwhelming. Things like high-resolution VGA displays on your phone, gigabytes of memory, 1TB hard drives...

    I don't see why you'd need an 8K 120Hz desktop display... 8K, sure, I can understand for photo manipulation, high-end video creation, CAD, and the security/surveillance markets. I can understand why you'd want 120Hz or higher for gaming purposes. And I can see why you'd want an 8K 120Hz headset for virtual reality use.

    But I feel an 8K60 desktop display would be far more commercially viable, and it would lessen the cabling headache. Or it'd make more sense if it were maybe 40-50", where it could replace 4x 4K monitors as a "borderless tiled multi-display" setup.
  • Lakados - Thursday, April 11, 2019 - link

    8K@60 may seem more commercially viable, but the reality is that the panels they have to use to make this work run at 120Hz or better by their very nature, so setting up a separate production run for a 60Hz-specific screen would probably greatly increase production costs for no actual benefit other than offering a lower refresh rate.
  • inighthawki - Thursday, April 11, 2019 - link

    High refresh rate displays are nice to have beyond just gaming. Even for basic office work, the desktop becomes so smooth at 120+Hz that I will personally never buy a 60Hz display again. Simply moving the mouse cursor around feels sluggish by comparison. If it were my decision, 120Hz would be the new standard and bare minimum for all new displays.
  • Ironchef3500 - Friday, April 12, 2019 - link

    +1. 120Hz is sooo much nicer, even for everyday tasks.
  • ingwe - Thursday, April 11, 2019 - link

    If you do both, you may reach a wider audience. For instance, someone might want to game at 120Hz at a lower resolution but also use 8K while writing code or something. So that person might be interested in one solution. Just a thought, mostly because that is the way I am. Though I have yet to make the jump to 4K.
  • ken.c - Thursday, April 11, 2019 - link

    Give me this in a 40" monitor and we are talking. I run a 4K 40" Philips at work (instead of a pair of 27s or 30s) and I would love to have it pixel doubled.
  • skunkworker - Friday, April 12, 2019 - link

    This. I currently run 3 24" 1920x1200 monitors in portrait mode. I would love a proper 8K screen that covers a similar physical space but with double the resolution. I've been eyeing a 43" 4K, but 8K would be great for text/web, with 4K for gaming.
  • teamet - Thursday, April 11, 2019 - link

    4K, 120 Hz, decent HDR, ~30", less than $1000 and I'm a happy camper
  • Kevin G - Thursday, April 11, 2019 - link

    Sharp could just cheat and simply put the GPU into the display itself, with the host connecting to it via Thunderbolt. The bandwidth required to send commands to the display to render an image at these resolutions is far less than the bandwidth consumed by the raw pixel stream.

    An alternative would be to use some AV-over-IP protocol; 100 Gbit Ethernet could do 7680 x 4320 @ 120 Hz, 8 bpc without compression.

    Otherwise DP 2.0 with compression (which is in draft state), or HDMI 2.1 with compression could also do this over a single cable.
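
    (Checking the 100 Gbit Ethernet claim the same way - a sketch assuming 8 bpc RGB and ignoring blanking and protocol overhead:)

        print(7680 * 4320 * 120 * 8 * 3 / 1e9)    # ~95.55 Gbit/s - under 100 GbE uncompressed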
  • thomasg - Saturday, April 13, 2019 - link

    GPUs are attached to systems over PCIe.
    Even though it can be argued that 8 PCIe lanes are often enough rather than the 16 used by PEG, PCIe 3.0 x8 still means a net transfer rate of roughly 63 gigabits per second - and in contrast to HDMI or DisplayPort, this link needs to be duplex.

    So it's quite nonsensical that the connection between the PC and an external GPU could be handled by a bus achieving less than the 32 Gbit/s of DisplayPort or the 48 Gbit/s of HDMI, both simplex transmissions.

    There's a reason why duplex Thunderbolt 3 at roughly 30 Gbit/s (PCIe 3.0 x4) carries a performance penalty.
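
    (The ~63 Gbit/s figure follows from PCIe 3.0's 8 GT/s per lane with 128b/130b encoding - a quick sketch:)

        lane_gbps = 8 * 128 / 130                 # PCIe 3.0: 8 GT/s, 128b/130b encoding
        print(lane_gbps * 8)                      # ~63.0 Gbit/s net for an x8 link
        print(lane_gbps * 4)                      # ~31.5 Gbit/s for Thunderbolt 3's x4 tunnel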
  • Kevin G - Monday, April 15, 2019 - link

    You are thinking in terms of pixels, not the raw command rate necessary to drive the GPU. It is less bandwidth-intensive to send the commands used to generate an image than the high-resolution bitmap itself. A bit oversimplified, but imagine sending an image of a circle to the screen. At 8K with 10 bits per channel, you'd need to transfer ~124 Mbyte per frame to display the full screen, whereas a draw command could need less than a kilobyte of parameters: the center, inner radius, outer radius, colors for the inner and outer, aliasing info.

    And yes, there is plenty of stuff in a normal workload that cannot be described by simple commands, but GPUs have multi-gigabyte buffers on board to cache such things. Such raster data would only need to be sent once over the Thunderbolt link while in use (unless of course the GPU's onboard memory is not enough, in which case performance would tank - and that performance drop would still occur in a 16x PCIe 3.0 slot too).

    And yes, native 16x PCIe 3.0 is faster than Thunderbolt 3.0, especially at lower resolutions where there is a CPU bottleneck: the GPU is essentially waiting on data from the CPU, and the Thunderbolt 3.0 link can indeed be a bottleneck in that low-resolution scenario. However, at 8K, we are far from the GPU waiting around for commands from the CPU. For this use case, Thunderbolt 3.0 would be 'good enough'.
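
    (Putting the circle example in numbers - framebuffer size per frame versus a hypothetical draw-command payload; the parameter layout here is purely illustrative:)

        frame_bytes = 7680 * 4320 * 30 // 8       # 8K frame, 10 bpc RGB
        print(frame_bytes / 1e6)                  # ~124.4 MB per frame

        # Hypothetical circle command: center (2 floats), inner/outer radius (2),
        # inner/outer RGBA colors (8), anti-aliasing info (1) = 13 x 4-byte values
        print(13 * 4)                             # 52 bytes - orders of magnitude smaller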
  • thomasg - Saturday, April 13, 2019 - link

    Also, it should be noted that Thunderbolt 3 cables may not exceed 0.5 meters (about 20 inches) to carry the 40 Gbit/s line rate (~30 Gbit/s net), which makes them unsuitable for a monitor.
    Active cables may be longer, but are expensive and carry a latency penalty.
  • Kevin G - Monday, April 15, 2019 - link

    You can get active copper cables up to 2 meters without issue, which works for most end users. The latency is inconsequential, and active cabling is something the entire industry is starting to adopt in principle: long-run HDMI 2.1 or DP 1.3 cables are going to need to be active at similar distances. This is just the nature of high-speed cabling.

    Optical cables for Thunderbolt 3.0 exist in lengths up to 60 m, though they are indeed pricey. Similarly, active optical cables exist for HDMI 2.0 and DP 1.3, which also carry a premium over their copper counterparts.
