Even $250 seems too high when XFX 5700s have been going for $299 recently. I strongly feel/hope that these 5600XTs will be available for $199-230 within about 6 months.
Probably not, these are basically cut-down RX 5700 XT cards after all.
Overall, the entire Navi stack is only impressive compared to the initial Turing cards. Nvidia's newer variants, with faster performance and cheaper prices, have really taken the wind out of Navi's sails. So yes, they're competitive, but not great on any single metric.
To make matters worse, I don't envision AMD making much of a significant improvement for 2021 with Navi+. They'll be using a 7nm+ node and a slightly refined architecture... the gain will be something like 20%. Whereas Nvidia is poised to make a massive improvement with a new architecture as well as a comparable 7nm node. That way, tomorrow's RX 5700 XT owners are essentially going to be like today's RX 590 owners, or yesterday's GTX 970 owners. So those impressive specs in the PS5 and the Xbox Series X are going to be downgraded from high-end to midrange immediately. However, game publishers will still target the base PS4 level for a couple of years, which will have an impact on PC port titles.
Maybe not. The price will likely drop slightly or stay the same, but performance is expected to increase substantially. That means the performance-per-dollar factor will go up, i.e. better value for consumers. I don't expect many improvements in the architecture; gains will mostly come from lithography improvements, but they'll definitely be tweaking their extra cores (ray tracing, NPU, etc).
Here's why the price won't be a factor: history! AMD really dropped the ball with the RX 480 launch, which initially wasn't competitive (enough) against the GTX 1060. They had been floundering since the launch of the R9 290, so there were big expectations which they could not fulfil. And since then, the Vega cards and the Polaris refresh have really hurt them and the industry. The Radeon division was basically saved by the cryptocurrency boom a few years ago. Yet it gave Nvidia several generations of cards with practically no competition. And Nvidia was wise to capitalise on that opportunity, sidelining AMD from the race by using ray tracing to differentiate themselves. Of course this was a risk, and it didn't entirely pan out.
Despite that, the gamble Nvidia took didn't really hurt them. However, they did suffer lower revenue and profits since the release of the RTX cards, so they came back slightly toward value with the "Super" Turing variants. Honestly, I think this is where Nvidia will stay now that they've tested the market. Their only reason to reduce pricing would be competition; the only reason to increase pricing would be time and inflation.
Firstly, I don't think Intel is going to make anything substantial in the GPU market. It's a tough market and they aren't prepared for it. And they're too preoccupied doing damage control on their reputation and trying to fix their fab process.
Secondly, I don't think AMD will do "proper ray tracing" soon. At best they'll nearly catch up to the RTX 2060; at worst, Nvidia will actually mature their technology in both hardware and software, have it out in actual games, and ALL their cards will support it.
Thirdly, AMD is currently behind Nvidia in both hardware and software. Sometimes by a little, sometimes by a lot. It comes down to Nvidia having more and better engineers, basically flexing their R&D division. AMD cannot match them yet, but they should catch up in about two years' time, as their successes with Ryzen and the consoles pay dividends. So we can expect Navi+ to be mostly a dud, but its follow-up (or the one after) should be a substantial improvement, even if it's not too impressive on a 5nm lithography. They'll do it by sheer increased R&D and investment in software and architecture. Not sure what they'll call it: Alpha, Taurus, Orion, Sirius, Antares, Rigel, Canopus...? But I've no doubt it's coming; this isn't the ancient Radeon company, nor the olde AMD company, this is the new "Lisa Su" AMD.
Kangal, for the most part.. nvidia priced themselves out of reach for most people with the 20 series cards.. most of the people i know wont even look at those cards because of the prices.. they cant justify the prices compared to what they have, what they would need to spend, and paying their bills.. feeding their families is more important.. but the common thing seems to be.. i dont have any games that support ray tracing, or will be getting any that do.. so ill skip this round.. for me, my strix 1060 is fine for what i play.. so unless nvidia drops their prices, or amd releases something that is as fast for less.. meh...
I'm genuinely excited (from a tech perspective) to see what Nvidia do at 7nm.
I'm also painfully aware that it'll be entirely theoretical for me, as Nvidia haven't given a rat's ass about value for money since Maxwell. As a result, I'm still running a GTX 980M...
Even if they did, chances are overwhelming it'll be PCIe 4.0 x8 - same as the 5500 - so if/when you get a compatible platform, it'll have the same bandwidth as an x16 PCIe 3.0 slot. (Though I doubt they'd do that - the 5600 looks different enough from the 5700 that I don't think they need to be OVERLY worried about people stepping back.)
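As a back-of-envelope sanity check on that bandwidth claim (assuming an x8 link, as on the 5500 series - the comment doesn't state the lane count): PCIe 3.0 signals at 8 GT/s per lane and PCIe 4.0 at 16 GT/s, both with 128b/130b encoding, so a 4.0 x8 link matches a 3.0 x16 link exactly.

```python
# Effective one-direction PCIe bandwidth: rate (GT/s) * lanes * 128/130 encoding / 8 -> GB/s
def pcie_gbps(rate_gt_s: float, lanes: int) -> float:
    return rate_gt_s * lanes * (128 / 130) / 8

gen3_x16 = pcie_gbps(8.0, 16)   # PCIe 3.0 x16
gen4_x8 = pcie_gbps(16.0, 8)    # PCIe 4.0 x8
print(f"3.0 x16: {gen3_x16:.2f} GB/s, 4.0 x8: {gen4_x8:.2f} GB/s")  # both ~15.75 GB/s
```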
Probably depends on the VRAM chips, but for nVidia cards like the 1660S, I've seen memory overclock that much and more, not limited to 12 Gbps.
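For context on what those per-pin data rates mean across the 5600 XT's 192-bit bus (my own arithmetic; 14 Gbps is just an illustrative overclock figure, not a confirmed spec):

```python
# Memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8 -> GB/s
def mem_bandwidth_gbps(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(mem_bandwidth_gbps(12.0, 192))  # 288.0 GB/s at the 5600 XT's stock 12 Gbps
print(mem_bandwidth_gbps(14.0, 192))  # 336.0 GB/s if the GDDR6 overclocks to 14 Gbps
```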
I've got a 580 and have been thinking the same thing. Even if you're not sticking with AMD, a real generational bump (where you can upgrade, spend less money and still get better performance) really hasn't happened for that part of the market. I've got hopes for next-gen RDNA bringing more to the table, and at the very least, it seems like AMD's put enough pressure on NVIDIA that they won't be able to take a breather for the next gen like they've done so far...
While I'm glad AMD is filling out their product stack, can we talk about how many chips they end up creating masks for? It now REALLY looks like the Navi 14 die will only be used in one product - and I'm wondering, since the 5600 series is almost certainly based on 5700 rejects - why they didn't launch this sooner.
They shrunk both the Vega die (to 7nm for Radeon VII) and the Polaris one (for the 12nm RX 590), and created a Navi 14 die... It seems like the "throw spaghetti at the wall, see what sticks" strategy.
14's already used in two products - the 5300M and the 5500M (plus all their wacky variants).
Best guess on "why not sooner" - they've probably been waiting for enough dies with the right kind of defects, just like they did with the Instinct MI60 to make Radeon VII.
Incidentally, their strategy for Vega 20 (R VII / MI60/50) makes more sense if you consider it a way to test a new process with a known design and guarantee profits from the results, no matter how bad the yields. That might be why we haven't seen a big Navi - it's not as good for GPGPU as Vega, so if yields on a die that big are still sub-par then they wouldn't even have the option to sell the few fully-working dies as accelerator cards to boost margins.
Navi had a compute issue. It's finally fixed for RDNA+, but current-gen Navi has compute errors. I don't know the specifics, but you can look it up. I don't think it can be used for datacenter purposes because of this.
I think they cut down the bus width not due to yields, but to prevent flashing to a 5700. Same with the RAM. Because even if you flash it, the bandwidth should be a limiting factor.
Disappointed. Why no 8 GB version? I agree with Ryan and many here that any card with less than 8 GB of RAM is simply not future-proof. AMD has the chance to make some serious inroads while NVIDIA is still selling off their Turing chips, of which they apparently have a lot left over (still post-cryptomining, I guess). @Ryan: have you heard anything more about the supposed launch date for NVIDIA's Ampere GPU line? They must have been fabbing those at Samsung for a while now. AMD had better get their act together before then, or Team Red will again be playing catch-up.
What makes you think that 6GB won't be future-proof??? You can always drop the settings a little.
None of today's games needs more than 6GB when running at playable settings.
For example, Red Dead Redemption 2 does not need more than 6GB at 1440p max settings, but even 8GB cards cannot hold a constant 60fps there because they lack the GPU power.
I'm pretty sure a 6GB card will not have any problem with 1080p medium-high settings for the next 5 years; you will probably run out of GPU power before memory.
In that case, and bearing in mind that AMD cards typically aren't as efficient at memory compression as NVIDIA cards, why then have a version of the lesser 5500 card with 8 GB? And when I play, I quite like my details on "high"; why else would one pay for a game with good-looking scenery?
That's not the point. You are asking customers and enthusiasts to take a step backwards from the 8GB cards they bought 3 years ago.
Imagine if 10 years ago Nvidia or AMD were suggesting users buy a new 512MB card over the 1GB card they had before. Not quite the same RAM ratio, but the principle is the same.
Oh, and some of us here do not change our graphics cards like some change their underpants. I expect 3+ years out of a card, so thinking ahead is kind of important.
That would (a) be interesting and (b) make sense. NVIDIA is supposedly launching Ampere (the Turing replacement) in June, so clearing out the warehouse now makes sense for them. Regarding Navi cards and pricing: if a 2060 is within $20 of these 5600 cards, AMD will hopefully lower prices, and card makers will push 8 GB cards to stay competitive.
So, will either GPU maker (or any of the OEMs selling graphics cards) be releasing an updated, silent (fanless) version? Especially in the half-height format?
Seems like the Nvidia GT 730 was the last useful card in that format, unless you want to spend big bucks on the Radeon Pro WX series (which are great for desktop use, but don't have all the same gaming specs).
@Ryan Smith: the GPU in your picture would only have 2048 cores, because you greyed out 4 dual-CUs (-512 cores), each dual-CU having 2x 64 cores. The non-XT 5700 has only 2 dual-CUs disabled (-256 cores), for 2304 cores.
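The count is easy to verify from Navi 10's full configuration of 40 CUs (2560 shaders); this just spells out the commenter's arithmetic.

```python
CORES_PER_CU = 64
FULL_NAVI10_CUS = 40  # Navi 10 fully enabled: 40 CUs = 2560 shaders

def cores(disabled_dual_cus: int) -> int:
    # Each dual-CU (workgroup processor) holds 2 CUs, i.e. 128 shaders.
    return (FULL_NAVI10_CUS - 2 * disabled_dual_cus) * CORES_PER_CU

print(cores(4))  # 2048 -> what the greyed-out picture shows
print(cores(2))  # 2304 -> the actual RX 5700 (non-XT)
```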
$279 is not a good price for the 5600 XT when the 5700 is $299 online right now. Hopefully the regular 5600 will be closer to $199; otherwise you might as well spring for a 5700.
Sub31 - Monday, January 6, 2020 - link
It's competitive, but doesn't bring much new to the table...

BenSkywalker - Saturday, January 11, 2020 - link
The 2060 KO is $279, and it hard-launched already. Impressive silence on that one, AT; hope the checks are worth the credibility hit.

Korguz - Sunday, January 12, 2020 - link
yea ok sure there ben....

BenSkywalker - Tuesday, January 14, 2020 - link
Still no mention of it.
https://www.newegg.com/evga-geforce-rtx-2060-06g-p...
https://www.amazon.com/EVGA-GeForce-Gaming-Backpla...
This isn't some secret; it was at CES, and every legit tech site I checked covered it. The sound card they also showed off, which has gotten very little coverage elsewhere, got a multi-paragraph write-up here. EVGA only brought three things to CES - was that too many for them to look at?
This 5600 XT is going to "rule 1080p" at the same price point as a 2060?
Maybe someone can explain a third option; I only see two. Either they are completely ignorant and think the 2060 KO is going to be inferior to the 1660 variants, or they got paid to ignore it. I don't think they are ignorant.
Korguz - Sunday, January 19, 2020 - link
id bet nvidia will still price it out of most peoples reach, like most of nvidia's cards are now.. or, what people have doesnt justify the price they would need to pay for the performance they would get..
i LOVE how people insist AT is paid by AMD, intel, nvidia, qualcomm or even apple.. ever think they have other things on the table already and havent been able to get to something? or that they are working on it? Ben, IF you think YOU can do better with a site like this.. then by ALL MEANS, start one of your own.. and see how difficult it can be. BTW my reply to you.. was in response to your last sentence.

Korguz - Sunday, January 19, 2020 - link
also, what are the specs of the KO?? i cant seem to find ANYTHING on the specs.. maybe its a special card that EVGA has put out, and its no different than a 2060/2060 super?? maybe thats why AT hasnt said anything about it....

Korguz - Thursday, January 23, 2020 - link
yep just as i thought.. its just a part from evga that they gave a different name to, to separate it from all the others...

AshlayW - Sunday, January 12, 2020 - link
Wow, what a tool you are.

peevee - Tuesday, January 14, 2020 - link
Good catch about the 2060 KO. $280 for the 5600 XT does not seem right, even compared to the 1660 Super, and now this...

PeachNCream - Monday, January 6, 2020 - link
Ugh, wake me up when someone makes a GPU that can handle 1080p in a single-slot, half-height form factor and uses around 30W of power to do it. Until then, color me unimpressed by the "performance" gains that have all come at the cost of increasing power demands and bloated dual+ slot coolers.

Hul8 - Monday, January 6, 2020 - link
Since you're going to sleep, maybe you'll find such a GPU in your dreams? :-D
Seriously, by the time such power efficiency comes around, iGPUs will also be able to do it. Or close enough that the market for 1080p-class GPUs disappears and you won't find any compelling models - just see the state of the GT *030 or RX *40 today.
If nothing else, the low-end cards will always be gimped by memory type, bandwidth or capacity. Or you'll have only 2 display connections - hopefully both digital, at least...

PeachNCream - Monday, January 6, 2020 - link
Not too worried about multiple displays. I typically use a laptop and am fine with a single screen - really, that can be VGA for all I care. But yes, you do make a valid point with respect to iGPUs. They are reaching that point, but the trouble is that you can't easily purchase a sub-$100 refurb business desktop and get a modern iGPU, so some sort of discrete graphics card is necessary. The 1030 isn't all that bad, really. I've seen half-height versions around, and even passively cooled versions, though they appear to be wider than a single slot. It's just that they perform a lot better at 1366x768 for the time being.

olde94 - Tuesday, January 7, 2020 - link
Looking at the Quadro RTX 4000, nothing stops them from making a single-slot card other than noise.
Furthermore, performance scales worse and worse the higher the voltage, so it wouldn't surprise me if you could get good 1080p performance around 50W if you took an RTX 2080 Ti and underclocked the hell out of it, in a similar manner to how Max-Q works. But just... even more Q.
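The intuition that heavy underclocking buys outsized efficiency follows from dynamic power scaling roughly with frequency times voltage squared (P ≈ C·V²·f). A minimal sketch, with purely illustrative scaling factors (real cards also have static leakage and a voltage floor, so treat this as a rough model, not a measurement):

```python
# Rough dynamic-power model: P ~ C * V^2 * f, expressed relative to stock.
def relative_power(v_scale: float, f_scale: float) -> float:
    """Board power relative to stock when voltage and clock are both scaled down."""
    return (v_scale ** 2) * f_scale

# Hypothetical underclock: 60% of stock clock at 70% of stock voltage.
p = relative_power(v_scale=0.7, f_scale=0.6)
print(f"~{p:.0%} of stock power")  # ~29% of stock power
```

Applied to a ~250W 2080 Ti, that lands in the ~75W range for well over half the stock clocks, which is the same trade Max-Q makes.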
qhd - Tuesday, January 7, 2020 - link
More Q than Max-Q?? That's too Q to handle!

GreenReaper - Tuesday, January 7, 2020 - link
Qinfinitely so.

xenol - Tuesday, January 7, 2020 - link
The last single-slot high-end video cards that I'm aware of, the 8800 GT and HD 3850, had TDPs of 125W and 105W respectively.
Midrange cards have been in this range for a while now. If anything, I think AIBs don't want to design any more coolers than they have to, so we have massive cooler designs meant for 180W+ parts cooling devices that output much less heat.

Spunjji - Tuesday, January 7, 2020 - link
Those single-slot cards had terrible thermal and noise characteristics, too - while the vast majority of cases that gaming GPUs go into have a surfeit of space.
Given the choice between a slender card that hits a niche market and a bigger card that reviews well for low noise and good overclocking headroom, most companies make the smart choice.

CharonPDX - Tuesday, January 7, 2020 - link
The problem is that games keep increasing in quality to go with faster GPUs.
You can play many older games at 1080p on modern half-height/single-slot cards no problem. Even at 144 Hz.
But not new games.
And those two statements are generally always true. (Well, since the first half-height/single-slot video card capable of outputting 1080p at all came out...)
PeachNCream - Tuesday, January 7, 2020 - link
I suspect we will see a substantial leap in system requirements toward the end of next year as next-generation consoles are released. It happens pretty much every time a console generation launches, and it takes a while for the PC side of the house to catch up from a cost and performance perspective. For now it's been good times for new releases versus the affordability of the hardware available, but I think in November I'll probably just buy whatever PlayStation is available and call it quits on chasing PC games, unless they run on my existing, obsolete junk laptop. It'll probably be cheaper than playing the upgrade game.

Spunjji - Tuesday, January 7, 2020 - link
From a value-for-money perspective, it's hard to beat a brand-new console.

neblogai - Tuesday, January 7, 2020 - link
Only for the hardware - because with the prices of games and services, they more than recoup it over their lifetime. That is why consoles are successful only in the richest countries, not in poor ones, where people count their money carefully.

Gigaplex - Wednesday, January 8, 2020 - link
I don't see that happening. When big console changes came about (vastly different architectures, e.g. PS2->PS3->PS4), the porting requirements changed, which led to poorly optimised code. The current leading consoles are pretty close to PC architectures, and the next generation will be largely the same but with higher specs - still lower than high-end PCs, though.

azazel1024 - Tuesday, January 7, 2020 - link
Many older games I can play at 1080p with my i5-3570 and GTX 750. Not even a GTX 750 Ti. Newer ones, especially with the various feature knobs turned up... not a chance.
As GPUs become able to handle the demands, and especially as consoles improve, you'll find what games demand at 1080p continuing to increase.

Spunjji - Tuesday, January 7, 2020 - link
What's with the 30W limitation? The 1650 can do everything you asked, but at 75W. That's seriously impressive from any perspective besides "it arbitrarily has to be 30W".
The 1030 was impressive in that regard when it was new, but time moves on, and neither AMD nor Nvidia has any pressing drive to replace their bottom-of-the-range GPU on a regular basis.

Hul8 - Tuesday, January 7, 2020 - link
One consideration is that if you try to dissipate that 75W with a single-slot cooler, the GPU will either be unbearably loud, be clocked really low, or throttle (or several of those at once). And even worse clocks if you want a card with a passive cooler.

Hul8 - Tuesday, January 7, 2020 - link
Single slot *was* one of @PeachNCream's requirements.

SirPerro - Wednesday, January 8, 2020 - link
I find the "can handle 1080p" mentality interesting. As if games 4 years from now will be the same as current ones.
One must be ready to accept that a card which runs current games at ultra at 60fps will not run future games at ultra.
And that's perfectly fine and normal. This is not the console world. There's no "GPU power freeze" here - just small steps in the market every few months.

flyingpants265 - Saturday, January 11, 2020 - link
IMO the time for 1080p is completely over. Games should be targeting either 1440p@240Hz or 75-90Hz.
Screw 60Hz, honestly. I have a Samsung 19" 75Hz from like 2006 or something and it works great.

Alexvrb - Wednesday, January 8, 2020 - link
I mean, ASIDE from the fact that 30W half-height single-slot graphics cards are in such HIGH demand from desktop gamers (nearly .00001% of users want one!)... why stop there? While you're asking for something that doesn't exist, you really need to lean into it. Sleep until they can do 16K full raycasting in a 5W power envelope...

flyingpants265 - Saturday, January 11, 2020 - link
Well, the 1050 Ti/1650 is pretty good at 75 watts. No power connector needed!
Not sure why you care about single-slot coolers; maybe you have a personal reason for doing so.
If anything, GPU coolers should be a lot bigger. GPUs are 300W+. Dual 140mm towers on both CPU and GPU. Linus has a recent video about this, and he (somewhat bizarrely) underreacts to the tremendous improvement in heat dissipation, but it's just the obvious thing to do, both for temps/boost and for the longevity of the card.
Also, video cards should be mounted parallel to an ITX motherboard on a 90-degree PCIe riser; it's not 1999 anymore. It's somewhat rare to see systems with more than one PCIe card in the wild.
StevoLincolnite - Monday, January 6, 2020 - link
Interesting that AMD has gone with an oddball VRAM amount (6GB) and a 192-bit bus, something you don't see them do very often.

Fataliity - Tuesday, January 7, 2020 - link
I think they did that to prevent flashing, like a Vega 56 to a 64, or a 5600 XT to a 5700. Even if you can flash it, it will be limited by bandwidth and memory, so it won't be that useful.

Hul8 - Wednesday, January 8, 2020 - link
Also harvesting dies with faults in one quarter of the memory controller.

catavalon21 - Monday, January 6, 2020 - link
I am amazed to see 3 fans cooling a 150-watt card.

sing_electric - Tuesday, January 7, 2020 - link
I think we're at a point where that might actually be for aesthetics - tempered glass cases have been a big deal for so long in the gaming world that I think a lot of buyers probably want a "full-size" GPU taking up most of the length of the case, even if the size isn't thermally necessary.

Dragonstongue - Monday, January 6, 2020 - link
$270 means an easy $350 CAD... not cool, seeing as even the 5500 XT, "sub"-whatever they claimed it would be, ends up priced that much higher than it should have been, i.e. even more $$$ than the 470/570 launched at.
AMD not directly controlling pricing (though they should) leaves many "waiting in the dark".
Here's hoping it actually performs as intended (or beyond), with none of the BS bad cards "out of the box". AIBs/vendors need to work even harder to make sure this is not an issue; purchases of many hundreds of $$$, especially of "HIGH TECH", should not have such issues at all. One would think new stuff works even better than stuff from the past, i.e. fewer issues, not the same rinse and repeat.
The same goes for red vs. blue, or blue vs. green, as of late, sadly (though they of course claim it never happens). Big time with blue (Intel) and green (NVDA), who often enough try to sweep problems under the rug and refuse to pay for issues they knew full well existed but boxed up for sale anyway... much less so with AMD (though the various AIBs/vendors hold the lion's share of the blame for "cheaping out", or at least for excuses instead of thorough test methodology).
My 7870 still runs "perfectly fine" but is in serious need of replacement by something as much faster as possible; the 5600 XT would fit this bill, the 5500 not so much... shame it was not forced to a $279 max, USD or CAD, to be quite honest.
flyingpants265 - Saturday, January 11, 2020 - link
The 5700 is $299 USD right now. I wouldn't even bother going with a 5600 unless it was $199ish.
Korguz - Sunday, January 12, 2020 - link
Try buying these cards in a country other than the US, and you complain about this card being $299?? This card is easily 400-450 CDN here... as for $199, keep dreaming...
flyingpants265 - Monday, January 13, 2020 - link
Do you know about conversion rates?
maroon1 - Monday, January 6, 2020 - link
It is not much faster than the GTX 1660 Super while costing more.
voodooboy - Tuesday, January 7, 2020 - link
Learn to read. The slides clearly show it beating the GTX 1660 Ti as well as overclocked GTX 1660 Super parts by a decent enough margin. And it matches their price.
You’re welcome.
maroon1 - Tuesday, January 7, 2020 - link
The 1660 Ti is useless because the 1660 Super gives almost the same performance for a lower price; the 1660 Ti does not have the best performance per dollar among the 1660 series. Also, do not forget those benchmarks are from AMD. They are not from a neutral source.
Irata - Tuesday, January 7, 2020 - link
If you look at the benchmarks AMD published, it's vs. a GTX 1660 Super *OC*. Looking at Newegg prices, these are noticeably more expensive than the plain vanilla 1660 Super, so the RX 5600 XT will probably end up costing the same.
flyingpants265 - Monday, January 13, 2020 - link
Even $250 seems too high when XFX 5700s have been going for $299 recently. I strongly feel/hope that these 5600 XTs will be available for $199-230 within about 6 months.
Rudde - Tuesday, January 7, 2020 - link
Does it have 36 CUs as in the table, or 32 CUs as in the picture? I would think the latter.
Rudde - Tuesday, January 7, 2020 - link
One of the slides in the slide deck is broken.neblogai - Tuesday, January 7, 2020 - link
36 CUs in the 5600 XT, 32 CUs in the 5600 (might be OEM or select markets).
https://www.amd.com/en/products/specifications/gra...
shabby - Tuesday, January 7, 2020 - link
Is AMD gimping this card with an x8 PCIe interface like the RX 5500?
Kangal - Tuesday, January 7, 2020 - link
Probably not; these are basically cut-down RX 5700 XT cards after all. Overall, the entire Navi stack is only impressive compared to the initial Turing cards. Nvidia's newer variants, with faster performance and cheaper prices, have really taken the wind out of Navi's sails. So yes, they're competitive, but not great on any single front.
To make matters worse, I don't envision AMD making much of a significant improvement for 2021 or Navi+. They'll be using a 7nm+ node and a slightly refined architecture... it will be something like a 20% gain. Whereas Nvidia is poised to make a massive improvement with a new architecture as well as a comparable 7nm node. That way, tomorrow's RX 5700 XT owners are essentially going to be like today's RX 590 owners, or yesterday's GTX 970 owners. So those impressive specs from the PS5 and the Xbox Series X are going to be downgraded from high-end to midrange immediately. However, game publishers will still target the base PS4 level for a couple of years, which will have an impact on PC port titles.
Korguz - Tuesday, January 7, 2020 - link
And probably a massive price increase to go with those new Nvidia cards....
Kangal - Wednesday, January 8, 2020 - link
Maybe not. The price will likely drop slightly or remain the same, but performance is expected to increase substantially. This means the performance-per-price factor will go up, i.e. better value for consumers. I don't expect many improvements in the architecture; it will mostly come from lithography improvements, but they'll definitely be tweaking their extra cores (ray tracing, NPU, etc).
Here's why the price won't be a factor:
History! AMD really dropped the ball with the RX 480 launch, which initially wasn't competitive (enough) against the GTX 1060. They had been floundering since the launch of the R9 290, so there were big expectations which they could not fulfil. And since then, the Vega cards and the Polaris refresh have really hurt them and the industry. The Radeon division was basically saved by the cryptocurrency boom from a few years ago. Yet it gave Nvidia several generations of cards with practically no competition. And Nvidia was wise to capitalise on that opportunity, sidelining AMD from the race by using "Ray Tracing" to differentiate themselves. Of course this was a risk, and it didn't entirely pan out.
Despite that, the gamble Nvidia took didn't really hurt them. However, they have seen lower revenue and profits since the release of the RTX cards, so they came back toward value slightly with the "Super" Turing variants. Honestly, I think this is where Nvidia will stay, having now tested the market. Their only reason to reduce prices would be competition; the only reasons to increase prices would be time and inflation.
Firstly, I don't think Intel is going to make anything substantial in the GPU market. It's a tough market and they aren't prepared for it. And they're too pre-occupied doing damage control on their reputation, and trying to fix their fab process.
Secondly, I don't think AMD will do "Proper Ray Tracing" soon. At best they'll nearly catch up to the RTX 2060, and at worst Nvidia will actually mature their technology in both hardware and software, and have it out in actual games, and ALL their cards will support it.
Thirdly, AMD is now always behind Nvidia in hardware and software. Sometimes it is by a little, sometimes by a lot. It all comes down to Nvidia having more and better workers, basically flexing their R&D division. AMD cannot match them yet, but they will catch up in two years' time, as their success from Ryzen and the consoles pays dividends. So we can expect Navi+ to be mostly a dud, but their follow-up (or the one after) should be a substantial improvement, even if it's not too impressive on a 5nm lithography. They'll do it by sheer increased R&D and investment in software and architecture. Not sure what they'll call it: Alpha, Taurus, Orion, Sirius, Antares, Rigel, Canopus....? But I've no doubt it's coming; this isn't the ancient Radeon company, nor is it the olde AMD company, this is the new "Lisa Su" AMD company.
Korguz - Wednesday, January 8, 2020 - link
Kangal, for the most part, Nvidia priced themselves out of reach for most people with the 20-series cards.. most of the people I know won't even look at those cards cause of the prices.. they can't justify the prices compared to what they have.. to what they would need to spend.. and.. paying their bills and feeding their families is more important.. but the common thing seems to be.. I don't have any games that support ray tracing or will be getting any that do.. so I'll skip this round.. for me, my Strix 1060 is fine for what I play.. so unless Nvidia drops their prices or AMD releases something that is as fast for less.. meh...
Spunjji - Tuesday, January 7, 2020 - link
I'm genuinely excited (from a tech perspective) to see what Nvidia do at 7nm. I'm also painfully aware that it'll be entirely theoretical for me, as Nvidia haven't given a rat's ass about value for money since Maxwell. As a result, I'm still running a GTX 980M...
maroon1 - Tuesday, January 7, 2020 - link
My old laptop had a 970M and the new one has an RTX 2060 (not Max-Q). Day and night difference between the two.
I don't know how much of an upgrade you'd get from a 980M, but coming from a 970M or lower, Turing is a huge upgrade.
sing_electric - Tuesday, January 7, 2020 - link
Even if they did, chances are overwhelming it'll be PCIe 4.0 - same as the 5500 - so if/when you get a compatible platform, it'll have the same bandwidth as an x16 PCIe 3.0 slot. (Though I doubt they'd do that - the 5600 looks different enough from the 5700 that I don't think they need to be OVERLY worried about people stepping back.)
Spunjji - Tuesday, January 7, 2020 - link
The RX 5500 isn't "gimped" by that interface. You'd get no more performance from more lanes.
TheinsanegamerN - Tuesday, January 7, 2020 - link
Ahem.
https://www.neowin.net/news/pcie-30-could-be-cripp...
In bandwidth-heavy games, that x8 interface makes a very big difference. The 5500 line is gimped on non-AMD platforms.
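For what it's worth, the arithmetic behind this disagreement is easy to sketch: per-direction PCIe bandwidth is the per-lane transfer rate times the lane count, scaled by the 128b/130b encoding used since Gen3 (a simplified model using the standard spec rates, not measurements from any specific card):

```python
# Approximate one-direction PCIe bandwidth in GB/s:
# transfer rate (GT/s) x lanes x 128b/130b encoding efficiency / 8 bits per byte.
def pcie_gbps(transfer_rate_gts: float, lanes: int) -> float:
    return transfer_rate_gts * lanes * (128 / 130) / 8

gen3_x8 = pcie_gbps(8.0, 8)    # RX 5500 on a PCIe 3.0 board: ~7.9 GB/s
gen4_x8 = pcie_gbps(16.0, 8)   # same card on a PCIe 4.0 board: ~15.8 GB/s
gen3_x16 = pcie_gbps(8.0, 16)  # a full x16 Gen3 slot: ~15.8 GB/s

print(f"Gen3 x8:  {gen3_x8:.2f} GB/s")
print(f"Gen4 x8:  {gen4_x8:.2f} GB/s")
print(f"Gen3 x16: {gen3_x16:.2f} GB/s")
```

An x8 Gen4 link matches an x16 Gen3 link, which is why the cut-down interface only bites on Gen3 boards.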
Korguz - Thursday, January 9, 2020 - link
Um.... try non-PCIe 4 platforms...
PVG - Tuesday, January 7, 2020 - link
How well does the 12Gbps GDDR6 overclock? If it can reach ~14Gbps easily enough, this card can become a little monster...
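The arithmetic behind that hope is simple: effective memory bandwidth is the per-pin data rate times the bus width (a back-of-the-envelope sketch; the RX 5700's figures are included only for comparison):

```python
# VRAM bandwidth in GB/s = per-pin data rate (Gbps) x bus width (bits) / 8 bits per byte.
def vram_bandwidth_gbs(gbps_per_pin: float, bus_width_bits: int) -> float:
    return gbps_per_pin * bus_width_bits / 8

stock = vram_bandwidth_gbs(12.0, 192)   # RX 5600 XT as announced: 288 GB/s
hoped = vram_bandwidth_gbs(14.0, 192)   # the hoped-for 14Gbps overclock: 336 GB/s
rx5700 = vram_bandwidth_gbs(14.0, 256)  # RX 5700 for reference: 448 GB/s

print(stock, hoped, rx5700)
```

Even a successful 14Gbps overclock would still leave the 192-bit card 25% short of the 5700's bandwidth, which is presumably the point of the cut.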
neblogai - Tuesday, January 7, 2020 - link
Probably depends on the VRAM chips, but on Nvidia cards like the 1660S I've seen the memory overclock that much and more, rather than being limited to 12Gbps.
jabber - Tuesday, January 7, 2020 - link
I'll be keeping my old RX 480 for another 6 months at least. C'mon AMD, you've had at least 18 months or more to tempt me, and all you've given me are warmed-up left-overs to look at.
sing_electric - Tuesday, January 7, 2020 - link
I've got a 580 and have been thinking the same thing. Even if you're not sticking with AMD, a real generational bump (where you can upgrade, spend less money, and still get better performance) really hasn't happened for that part of the market. I've got hopes for next-gen RDNA bringing more to the table, and at the very least, it seems like AMD's put enough pressure on NVIDIA that they won't be able to take a breather next gen like they've done so far...
jabber - Tuesday, January 7, 2020 - link
Yeah, the RX 480 was a real "Wow, that's great 1080p performance for a good price!" card. We hadn't really had one for a few years. I'm waiting for the next version of that, where it's quite clear it gives a decent leap for a good price.
The 5700/5600 just seem a bit overpriced and underwhelming. That and I don't fancy going back to 6GB after having 8GB for three years.
flyingpants265 - Saturday, January 11, 2020 - link
The RX 480 was first released around June 2016. And remember, we've been holding onto our GTX 1080s for ages as well. Things haven't been moving very fast in the GPU market lately..
1600AF/16GB/5700 seems to be the best value combo right now; you can build a 1440p gaming system for $650 (even less if using an old case/PSU/SSD, etc.)
Spunjji - Tuesday, January 7, 2020 - link
The RX 5700 series aren't exactly "warmed up leftovers". They're solid offerings for the money - no more exciting than that for sure, but still...
jabber - Wednesday, January 8, 2020 - link
But it's not a great enough leap for those of us with 480/580/590 cards, which this is meant to replace. Plus a backwards step to 6GB...
Warmed up...
Ironchef3500 - Tuesday, January 7, 2020 - link
How many "ultimate" 1080p cards do we need....
The_Assimilator - Tuesday, January 7, 2020 - link
> with the only difference is 15% lower average clockspeeds

Hi ho proofreading, AWAY!
Lee.Danny - Tuesday, January 7, 2020 - link
I would like to see a comparison with the Vega 56 when it releases.
sing_electric - Tuesday, January 7, 2020 - link
While I'm glad AMD is filling out their product stack, can we talk about how many chips they end up creating masks for? It now REALLY looks like the Navi 14 die will only be used in one product - and I'm wondering, since the 5600 series is almost certainly based on 5700 rejects, why they didn't launch this sooner. They shrunk both the Vega die (to 7nm for the Radeon VII) and the Polaris one (for the 12nm RX 590), and created a Navi 14 die... It seems like a "throw spaghetti at the wall, see what sticks" strategy.
Spunjji - Tuesday, January 7, 2020 - link
Navi 14's already used in two products - the 5300M and the 5500M (plus all their wacky variants). Best guess on "why not sooner": they've probably been waiting for enough dies with the right kind of defects, just like they did with the Instinct MI60 to make the Radeon VII.
Incidentally, their strategy for Vega 20 (R VII / MI60/50) makes more sense if you consider it a way to test a new process with a known design and guarantee profits from the results, no matter how bad the yields. That might be why we haven't seen a big Navi - it's not as good for GPGPU as Vega, so if yields on a die that big are still sub-par then they wouldn't even have the option to sell the few fully-working dies as accelerator cards to boost margins.
Fataliity - Tuesday, January 7, 2020 - link
Navi had a compute issue. It's finally fixed for RDNA+, but current-gen Navi has compute errors. I don't know the specifics, but you can look it up. I don't think it can be used for datacenter purposes because of this.
Fataliity - Tuesday, January 7, 2020 - link
I think they cut down the bus width not due to yields, but to prevent flashing to a 5700. Same with the RAM. Because even if you flash it, the bandwidth should be a limiting factor.
eastcoast_pete - Tuesday, January 7, 2020 - link
Disappointed. Why no 8GB version now? I agree with Ryan and many here that any card with less than 8GB of RAM is simply not future-proof. AMD has the chance to make some serious inroads while NVIDIA is still selling off their Turing chips, of which they apparently have a lot left over (still post-cryptomining, I guess). @Ryan: have you heard anything more about the supposed launch date for NVIDIA's Ampere GPU line? They must have been fabbing those at Samsung for a while now. AMD had better get their act together before then, or Team Red will again be playing catch-up.
maroon1 - Wednesday, January 8, 2020 - link
What makes you think that 6GB won't be future-proof??? You can always drop the settings a little. None of today's games need more than 6GB when running at playable settings.
For example, Red Dead Redemption 2 does not need more than 6GB at 1440p max settings, but even 8GB cards cannot handle it at a constant 60fps because they lack the GPU power.
I'm pretty sure a 6GB card will not have any problem with 1080p medium-high settings for the next 5 years, and you will probably run out of GPU power before memory.
eastcoast_pete - Wednesday, January 8, 2020 - link
In that case, and bearing in mind that AMD cards typically aren't as efficient at memory compression as NVIDIA cards, why then offer a version of the lesser 5500 card with 8GB? And when I play, I quite like my details on "high"; why else would one pay for a game with good-looking scenery?
jabber - Thursday, January 9, 2020 - link
That's not the point. You are asking customers and enthusiasts to take a backwards step from the 8GB cards they bought 3 years ago. Imagine if 10 years ago Nvidia or AMD were suggesting users buy a new 512MB card over the 1GB card they had before. Not quite the same ratio of RAM, but the principle is the same.
I've got 8GB now so I'm only going forward.
jabber - Thursday, January 9, 2020 - link
Oh yes, and some of us here do not change our graphics cards like some change their underpants. I expect 3+ years out of a card, so thinking ahead is kind of important.
maroon1 - Wednesday, January 8, 2020 - link
I read online that Nvidia is planning to drop the price of the RTX 2060 6GB to 299 dollars.
eastcoast_pete - Wednesday, January 8, 2020 - link
That would (a) be interesting and (b) make sense. NVIDIA is supposedly launching Ampere (the Turing replacement) in June, so clearing out the warehouse now makes sense for them. Regarding Navi cards and pricing: if a 2060 is within $20 of these 5600 cards, AMD will hopefully lower prices, and card makers will push for 8GB cards to stay competitive.
phoenix_rizzen - Thursday, January 9, 2020 - link
So, will either GPU maker (or any of the OEMs selling graphics cards) be releasing an updated, silent (fanless) version? Especially in the half-height format? Seems like the Nvidia GT 730 was the last useful card in that format, unless you want to spend the big bucks to get the Radeon Pro WX series (which are great for desktop use, but don't have all the same gaming specs).
_Flare - Thursday, January 9, 2020 - link
@Ryan Smith: the GPU in your picture would only have 2048 cores, because you greyed out 4 dual-CUs (-512), each having 2x 64 cores. The 5700 non-XT has only 2 dual-CUs disabled (-256), i.e. 2304 cores.
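The arithmetic behind that correction can be sketched quickly (assuming the commonly cited Navi 10 layout: 40 CUs at 64 stream processors each, fused off in dual-CU pairs):

```python
# Navi 10: 40 CUs total, 64 stream processors per CU, disabled in dual-CU
# (2-CU) blocks, so each disabled block removes 128 stream processors.
TOTAL_CUS = 40
SP_PER_CU = 64

def active_stream_processors(dual_cus_disabled: int) -> int:
    active_cus = TOTAL_CUS - 2 * dual_cus_disabled
    return active_cus * SP_PER_CU

print(active_stream_processors(2))  # 36 CUs, the 5700/5600 XT config -> 2304
print(active_stream_processors(4))  # 32 CUs, what the greyed-out diagram shows -> 2048
```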
flyingpants265 - Saturday, January 11, 2020 - link
$279 is not a good price for the 5600 XT. The 5700 is $299 online right now. Hopefully it will be closer to $199 for the regular 5600; otherwise you might as well spring for the 5700.