47 Comments
jeremyshaw - Thursday, August 23, 2018 - link
Nice, it's good to see the USB-C port sticking around across the AIB cards. My desktop (Sandy Bridge) may finally gain a USB-C connection :D
Sttm - Thursday, August 23, 2018 - link
I was speculation that that USB-C port added $50 to the cost of the cards. If that is true it's rather aggravating!
Sttm - Thursday, August 23, 2018 - link
I was reading speculation...
An edit button would be a great 2019 Anandtech upgrade.
CaedenV - Thursday, August 23, 2018 - link
This may not be useful for normal USB-C connections and may be a special implementation for future VR headsets.
But I am in the same boat as you... still on Sandy Bridge, still happy with the performance, but USB-C and M.2 are looking like the real reasons to upgrade... if we can get some decently priced boards with 10-gigabit Ethernet, that may push me over the edge to upgrading.
mr_tawan - Friday, August 24, 2018 - link
I am on Haswell and... I don't have any plans to upgrade it anytime soon. Personally I think it's too many changes with too little gain. I think I'm going to wait for the DDR5 platform.
Anyway, if the card brings USB Type-C support (with VR and maybe DP alt mode...) then it will be interesting. However, this card alone costs more than all the major parts I need to change combined, lol.
Manch - Friday, August 24, 2018 - link
USB 3 & M.2 add-in cards. Upgrade complete!
DanNeely - Friday, August 24, 2018 - link
Haswell should have a few USB 3 ports unless you bought a very low-end mobo.
I've also got a 3.5-year-old Haswell system, bought with the expectation that it'd be the core of my build for 6-8 years, and so far I don't see any reason to expedite my plans. Unless 6+ physical cores becomes a quasi-requirement for newer games, I still feel on track for an 8-year lifespan (new GPU every 2, possibly a bigger SSD toward the end of the run, but I'm only at ~45% on my current 1TB so likely not). DDR5, PCIe 4 (or even 5), and ~50% USB-C are my nominal platform targets for when I upgrade. I'm unlikely to hold out much past 8 years even if I don't have everything I'm looking forward to though, just because I'd much prefer to do a planned upgrade during downtime than a chaotic "my PC just died" scramble.
Nagorak - Sunday, August 26, 2018 - link
You can add both those things with a PCIe card. Modularity for the win!
Sttm - Thursday, August 23, 2018 - link
For the Wallet 3 Ultra! It's the only way to go! Death to your finances!
MrSpadge - Friday, August 24, 2018 - link
*deep UT announcer voice* Spending spree!
BurntMyBacon - Friday, August 24, 2018 - link
When you are unable to purchase the next card: <UT announcer> "Wallet Breaker"
matfra - Thursday, August 23, 2018 - link
So wait, no HDMI 2.1 on the RTX 1080??? That means we need to wait 2 more years to get more than 60 fps on our UDH TV? :-(
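For context, a rough back-of-the-envelope sketch (assuming the standard 4K CTA-861 timing and 8-bit RGB; the function name is just for illustration) of why 4K above 60 Hz doesn't fit through HDMI 2.0's 18 Gbps TMDS link:

    # Approximate HDMI link rate needed for a given mode (8 bpc RGB).
    # h_total/v_total include blanking; 4400x2250 is the standard 4K60 timing.
    def tmds_rate_gbps(h_total, v_total, refresh_hz, bits_per_pixel=24):
        # TMDS uses 8b/10b encoding, so the wire rate is 1.25x the payload rate.
        return h_total * v_total * refresh_hz * bits_per_pixel * 1.25 / 1e9

    print(tmds_rate_gbps(4400, 2250, 60))   # ~17.8 Gbps: right at HDMI 2.0's 18 Gbps cap
    print(tmds_rate_gbps(4400, 2250, 120))  # ~35.6 Gbps: needs HDMI 2.1's 48 Gbps FRL link

So 4K at 120 Hz roughly doubles the data rate of 4K60, which is why it needs HDMI 2.1 (or DisplayPort) rather than the HDMI 2.0 ports on these cards.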
matfra - Thursday, August 23, 2018 - link
+1 for the edit button!
UHD*
matfra - Thursday, August 23, 2018 - link
+1 for the edit button!
RTX 2080
PeachNCream - Friday, August 24, 2018 - link
Don't feel bad about typing RTX 1080. Even Anton's table has RTX 1xxx GPUs listed for the reference and FE versions. It's an easy mistake to make. I've had to second guess a couple of times when I made comments already.
Yojimbo - Thursday, August 23, 2018 - link
I dunno what it means for UHD TVs, but I'm guessing it will be less than 2 years until NVIDIA releases another generation of cards. I can't imagine them skipping 7 nm entirely.
BurntMyBacon - Friday, August 24, 2018 - link
You believe they'll be on to 5nm with sufficient defect rates for GPU sized chips in less than 2 years? You have more faith than I. Of course, I still agree that it probably won't take 2 years to get HDMI 2.1 on their cards.
Kvaern1 - Friday, August 24, 2018 - link
No, he wrote 7nm in less than 2 years.
Yojimbo - Sunday, August 26, 2018 - link
"You believe they'll be on to 5nm with sufficient defect rates for GPU sized chips in less than 2 years? "No, I believe they will be onto 7nm, not 5nm. I think that they won't skip 7 nm and go right to 5 nm.
nevcairiel - Thursday, August 23, 2018 - link
HDMI 2.1 is still in early rollout. The official "Compliance Test Specification" only started to partially roll out in August 2018, with only a partial feature set available for testing so far. It's not exactly easy (or maybe even possible) to release an HDMI 2.1 certified product yet.
Personally I'm hoping for the 2060 or so to release with HDMI 2.1, since it's coming later anyway and is a more appropriate size for my HTPC.
Santoval - Friday, August 24, 2018 - link
Since Nvidia released the 2080 Ti along with the 2080 and 2070, the next gen (30xx series or whatever) should be released (at 7nm) in one year tops. Due to the high cost and high TDP of the very big 2080 Ti die, they might even release a refresh of it, possibly branded as a Titan (Xp?), at 7nm ~6 months after the 2080 (Ti) release.
It would have the same number of shader, tensor, and RT cores, 12 GB of GDDR6 RAM, and a 384-bit memory bus. It would be quite a bit smaller and much more power efficient, and could thus clock higher, with up to around a 1.8 GHz base clock and at least a 2 GHz boost clock. Along with a... "bargain price" of US$1,499.
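For what the speculated memory system would be worth, a quick back-of-the-envelope bandwidth check, assuming the same 14 Gbps GDDR6 used on the announced cards:

    # Memory bandwidth = bus width (bits) * per-pin rate (Gbps) / 8 bits per byte.
    def mem_bandwidth_gbs(bus_width_bits, gbps_per_pin):
        return bus_width_bits * gbps_per_pin / 8

    print(mem_bandwidth_gbs(384, 14))  # 672.0 GB/s for the hypothetical 384-bit refresh
    print(mem_bandwidth_gbs(352, 14))  # 616.0 GB/s, the RTX 2080 Ti as announced

So a full 384-bit bus at the same memory speed would be roughly a 9% bandwidth bump over the 2080 Ti.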
PeachNCream - Friday, August 24, 2018 - link
That's a lot of speculation about specifications. What's your information source for the new 7nm card, its timeline, and specs?
Alistair - Friday, August 24, 2018 - link
Probably new cards at 7nm next year. I don't think these will be 2 year cards. HDMI 2.1 in 2020 seems a bit late...
Nagorak - Sunday, August 26, 2018 - link
When is the last time we had a one year cycle?
cfenton - Friday, August 24, 2018 - link
You have a TV that has an input that will accept a 4K signal at over 60 Hz? Or do you mean this may be a problem for a future TV?
Nagorak - Sunday, August 26, 2018 - link
Can't you use a DP to HDMI adapter?
imaheadcase - Thursday, August 23, 2018 - link
So when do the reviews on the card drop, day of release or before?
Sttm - Thursday, August 23, 2018 - link
I believe I read the embargo date on reviews is Sept. 14th.
WorldWithoutMadness - Thursday, August 23, 2018 - link
Meh, just wait until next year, it might get a massive price cut if the RX 600 series can compete on performance.
darckhart - Thursday, August 23, 2018 - link
In the last photo, doesn't that "face" on the fan remind you of the Vega design? NOW who's trying to confuse whom, Nvidia? /s
Amandtec - Friday, August 24, 2018 - link
"often with simplistic blower cooling systems"I think the word you want is simple. (Simplistic implies it is so simple it no longer agrees with reality.)
PeachNCream - Friday, August 24, 2018 - link
NVIDIA's MSRPs no longer agree with reality so why should board partners feel constrained by reality when designing a cooling solution?
prateekprakash - Friday, August 24, 2018 - link
"Again, so far only EVGA has announced two GeForce RTX 2080 graphics cards featuring a 14-phase PWM"Shouldn't it be VRM?
prateekprakash - Friday, August 24, 2018 - link
The Turbo RTX2080-8G should be 2x DP 1.4.
Cooe - Friday, August 24, 2018 - link
Nvidia's Founders Edition cards still have the huge vapor chambers the blowers had, so I imagine they'll perform QUITE a bit better than most would expect for a dual-fan cooler that size, hence the OCs. (Big ol' vapor chamber + an open-air cooler = an even bigger can o' whoopass. Just ask Sapphire.)
DanNeely - Friday, August 24, 2018 - link
Have any of the OEMs said whether we'll see 'more custom' cards with DL-DVI support later on, or if it's the end of the line for old 2560x1600 displays and a lot of the first-generation Korean 1440p monitors that don't have DP or HDMI 1.4 support and need DL-DVI to go above 1920x1200? (The cheap adapters don't work at all for these; the $70-100ish ones sorta do, but are all rather buggy.)
BurntMyBacon - Friday, August 24, 2018 - link
I haven't seen any models with DL-DVI support yet, and the reference models don't have it. I haven't heard any manufacturers explicitly say that they won't support DL-DVI going forward, but I never heard any speak up when they were getting rid of the DE-15 connector (VGA) either. Maintaining VR support seems to be a higher priority for manufacturers than supporting legacy standards, so I wouldn't expect to see DL-DVI in most mainstream products. It's unfortunate, as there are still a great many perfectly serviceable monitors and projectors that could be served by a DVI-I port directly or through a VGA adapter. That said, it does appear that some manufacturers will be including a DP to DVI adapter in the kit. So I suppose there is still some hope left for high-resolution 16:10 monitors.
DanNeely - Friday, August 24, 2018 - link
I'll be shocked if any of the included dongles support better than 1920x1200x60, because that's the limit of what can be done with an idiot-proof $10-15 dongle vs the $70-100 adapters that haven't been upgraded since their buggy 1.0 releases a half dozen years ago. The kind needed to support DL-DVI are roughly smartphone/cigarette-pack-sized boxes with a USB plug for extra power, not just a pair of plugs, optionally with a short cable between them.
Ej24 - Friday, August 24, 2018 - link
DVI is a digital connection, DisplayPort is also digital, just use a DP to DVI adapter. It works flawlessly. Never had an issue.
DanNeely - Friday, August 24, 2018 - link
The cheap flawless dongles are only capable of doing single link (1200p60 max, excluding WTFy non-standard modes that very few displays ever supported); that's easy to do because it requires little more than 1:1 mapping of bits in the data stream. 1440/1600p requires dual-link DVI, which is essentially two HDMI 1.0-class signals in parallel. Those adapters all had issues at release because they had to slice the signal in half and build two new ones in parallel, and they sold badly enough that there never was a 2nd generation that solved the problems with the signal occasionally glitching.
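To put some numbers on the single-link vs. dual-link split, here's a minimal sketch assuming CVT reduced-blanking timings against single-link DVI's 165 MHz pixel-clock ceiling:

    # Pixel clock = total pixels per frame (including blanking) x refresh rate.
    # Single-link DVI tops out at a 165 MHz pixel clock.
    def pixel_clock_mhz(h_total, v_total, refresh_hz):
        return h_total * v_total * refresh_hz / 1e6

    print(pixel_clock_mhz(2080, 1235, 60))  # ~154 MHz: 1920x1200@60 fits on a single link
    print(pixel_clock_mhz(2720, 1646, 60))  # ~269 MHz: 2560x1600@60 needs dual link

That ~269 MHz requirement is why 2560x1600 (and 2560x1440) displays need dual-link DVI or a newer interface, and why the cheap passive dongles can't drive them.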
Nagorak - Sunday, August 26, 2018 - link
To be fair, if there was so little demand for the adapters, can you blame the AIBs/Nvidia for dropping support? It sounds like these monitors were fairly niche to begin with, being released before supporting modern standards. On top of that, these high-end cards are targeting gamers with deep pockets--many who would be affected have likely already bought new monitors, or will.
A simple solution is just to run with a 10 series card, which is probably good for at least another two years at those resolutions. It sucks, but legacy support has to be dropped at some point.
eastcoast_pete - Saturday, August 25, 2018 - link
Nvidia should just make it official and change their name to NfleeceYah. While CPU makers are grappling with the failing of Moore's law, Nvidia proves once again that they can increase prices by 50% or so from generation to generation, and AMD's Vega cards aren't exactly cheap either. We really need a significant third player in graphics ASAP; unfortunately, Intel will likely enter the market on the high-price side, leaving the $500-and-below market where it is now: still overpriced.
TheinsanegamerN - Monday, August 27, 2018 - link
AMD would do the same thing in Nvidia's situation. They DID in the early 2000s, when Athlons were walking all over Pentiums and the high-end Athlon was a $1K CPU.
A third party would charge JUST as much. GPU making is seriously expensive, and tied down by patents.