77 Comments

  • gijames1225 - Tuesday, January 4, 2022 - link

    So with these supporting USB 4, do we finally have mass Thunderbolt support on AMD notebooks now?

    Either way, really exciting set of technologies, and I hope to get a ROG Zephyrus 14 or 15 later in the year.
  • Tams80 - Tuesday, January 4, 2022 - link

    I'm hoping for an updated Flow device. All AMD, including in the eGPU dock would be nice.

    But it seems Asus is only interested in an Intel tablet in that range this year.
  • nandnandnand - Tuesday, January 4, 2022 - link

    I think it's technically optional to have Thunderbolt 3/4 with USB 4, but likely: https://en.wikipedia.org/wiki/USB4
  • Dahak - Tuesday, January 4, 2022 - link

    argg... ffs... if it's USB4 it should have all of TB3, none of this semi-support.
    I miss the days when a spec meant all the features. None of this crap where, oh, it's USB4 but only supports USB 3.2 data and nothing else.

    Same with the new HDMI 2.1/a spec... oh, the TV is labeled with the 2.1 spec but only supports the 2.0 features..... argg...

    sorry for the mini rant
  • anonymfus - Tuesday, January 4, 2022 - link

    I think you are misunderstanding USB4. It's inspired by Thunderbolt and has all the features of Thunderbolt, but it is not Thunderbolt, and the USB4 specification only mandates TB3 backward compatibility for hubs, not for hosts/devices, and otherwise treats TB3 as a deprecated predecessor.
  • futurepastnow - Friday, January 7, 2022 - link

    Oh of course it's optional. Why would the USB-IF try to make this easy?
  • blanarahul - Tuesday, January 4, 2022 - link

    “ AMD has also increased from PCIe 3.0 to PCIe 4.0, supporting 8x for a discrete GPU.”

    Really? Only 8x? They do realize that laptop GPUs have to send the frame buffer back to the CPU, right?
  • hecksagon - Tuesday, January 4, 2022 - link

    A frame of 4K with 32 bits per pixel is only ~33 MB. At 120 Hz that's under 4 GB/s, well under the ~16 GB/s that 8 lanes of PCIe Gen 4 give you.
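    Spelling that math out, a quick back-of-the-envelope sketch (assuming a 3840x2160 frame at 4 bytes per pixel, and ~15.75 GB/s usable per direction for a PCIe 4.0 x8 link):

        # Rough frame-buffer bandwidth vs PCIe 4.0 x8 (illustrative numbers only)
        width, height, bytes_per_pixel, refresh_hz = 3840, 2160, 4, 120

        frame_bytes = width * height * bytes_per_pixel        # ~33.2 MB per frame
        stream_gb_s = frame_bytes * refresh_hz / 1e9           # ~4.0 GB/s for 4K120
        # 8 lanes x 16 GT/s, 128b/130b encoding, bits -> bytes
        pcie4_x8_gb_s = 8 * 16e9 * (128 / 130) / 8 / 1e9       # ~15.75 GB/s per direction

        print(f"frame: {frame_bytes / 1e6:.1f} MB, stream: {stream_gb_s:.1f} GB/s, "
              f"PCIe 4.0 x8: {pcie4_x8_gb_s:.2f} GB/s per direction")

    Even 4K at 120 Hz only uses about a quarter of the link in that direction.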
  • Billy Tallis - Tuesday, January 4, 2022 - link

    And PCIe is full-duplex, so sending data back to the CPU doesn't subtract from the available throughput for transfers from CPU to GPU.
  • Spunjji - Wednesday, January 5, 2022 - link

    That wasn't a problem with PCIe 3.0 16x, why would it become one now?
  • dalanamurr - Thursday, January 13, 2022 - link

    great article, thanks!
  • zamroni - Wednesday, January 19, 2022 - link

    8 lanes of PCIe 4.0 = the bandwidth of 16 lanes of PCIe 3.0.
  • zamroni - Wednesday, January 19, 2022 - link

    It depends on whether the display is connected to the eGPU or the iGPU. Newer laptops have a MUX to automatically switch the display connection.
  • PotatoPC_Owner - Saturday, February 12, 2022 - link

    It should technically be very much sufficient, considering GPUs don't completely saturate PCIe even now. Unless it's a top-tier desktop GPU, which of course these won't be paired with, it'll be completely fine, seeing as there wasn't any problem with a PCIe 3.0 x16 link, which this is equivalent to.
  • Jorgp2 - Tuesday, January 4, 2022 - link

    AMD doesn't give out public information on what their CPUs actually support.
  • Xajel - Wednesday, January 5, 2022 - link

    Sadly, both the G14 and G15 were upgraded but without USB4 support.

    Anyway, PCIe tunneling in USB4 is optional, so not all USB4 devices will support it. And AFAIK the USB4 Thunderbolt 3 implementation is not fully compatible with the Thunderbolt 3 spec. So even if a laptop/PC comes with a fully featured USB4 implementation, current TB3 devices (docks, external GPUs) might not work in USB4 mode.
  • PotatoPC_Owner - Saturday, February 12, 2022 - link

    That sucks. I was holding off on buying the G14 just because it didn't have a webcam. But eh, I suppose I could live without USB4.
  • Xajel - Sunday, January 9, 2022 - link

    Sadly, ASUS gaming laptops with AMD lack USB4. Yep, both the G14 & G15.
  • zamroni - Wednesday, January 19, 2022 - link

    Per the USB4 spec, a USB4 host (PC) is not required to support Thunderbolt 3.
  • uefi - Tuesday, January 4, 2022 - link

    Pro consumer move by AMD bringing us the first on chip DRM.
  • nandnandnand - Tuesday, January 4, 2022 - link

    FUD. It will do nothing to stop piracy or anything.
  • at_clucks - Tuesday, January 4, 2022 - link

    Every machine that would have shipped with this CPU comes with a TPM doing much the same work. Now it's no longer a discrete component but integrated into the CPU. Pluton can probably do a bit more than the TPM, but at its most basic it emulates a TPM, with features like SHACK.

    But in the end Pluton doesn't take away anything that wasn't already taken away by OEMs working closely with MS to make sure you can't boot Linux on your machine "for security".
  • bernstein - Tuesday, January 4, 2022 - link

    that's just FUD, you can put linux on any laptop.
  • at_clucks - Wednesday, January 5, 2022 - link

    You can but only if the distribution works with Secure Boot or after you disable it. You may find the odd package not installing because it's not signed, go through the enrollment process... But what I meant is that the mechanism is already there and if you rely on workarounds to use it then you're one step away from the moment when, in the interest of security, saving the children, and fighting the terrorists, it is locked down further. Microsoft has an outsized influence on the ecosystem and OEMs tend to rather go with their word.

    Pluton may also have workarounds and exploits but having to rely on that to do what you wanted to do is not a sustainable solution.

    MS is trying to use the Xbox model on the PC, or at least follows in Apple's T-chip-shaped footsteps. And while it may make sense on a console, it certainly doesn't on a PC. The Xbox is a single-purpose machine (gaming online without piracy or cheating) and it fits that purpose because it's locked down. The PC is a general-purpose machine, so anything that can obscure or prevent full access for you, the owner, is an erosion of that concept.

    But people now are more used to the mobile world, where you get a completely locked-down slab of hardware that's effectively obsolete after 3 years: between locked bootloaders, unavailability of critical SoC drivers, and mechanisms used by the OS/app ecosystem to make sure you run an official ROM, you will never have control over that device or be able to extract its full value once the manufacturer decides you need to change it. So this makes people who grew up with this model OK with losing control over what they can execute on any of their machines.
  • ballsystemlord - Wednesday, January 5, 2022 - link

    It might get that bad, but at first it's more likely that MS will use it to mine your PC for interesting data they can sell.
    The NSA will not be far behind. Pluton is just too good for them to ignore the opportunity.
  • at_clucks - Sunday, January 9, 2022 - link

    You got it bad. MS already controls your OS, and most installations probably run on machines with TPM (Win11 requires it, Win 10 doesn't but can use it if it's there and it has ~3 years to live) so for all intents and purposes it's almost as if they had Pluton. So why the hell would they need Pluton to get anything from your machine? And how would a TPM like chip which *holds secrets* help them "mine your PC for interesting data"?
  • ballsystemlord - Monday, January 10, 2022 - link

    In order:
    1: You don't have to use windowz just because it's the default OS.
    2: Because now they know everything that's secret and have remote access to the PC.

    Go to the semiaccurate link, but be warned, Pluton is a black box so we only know the most basic stuff about it.
  • Qasar - Tuesday, January 11, 2022 - link

    i hope MS starts to also sell MS branded tin foil hats soon.
  • nandnandnand - Tuesday, January 4, 2022 - link

    The big disappointment is that the 6-core Rembrandt chips only have 6 CUs. Particularly the 6600U which would be less likely to be paired with discrete graphics.

    The CPU performance increase is almost too good to be true. Needs more context. They said +11% in Cinebench which could be a combination of clocks and IPC.
  • AhsanX - Tuesday, January 4, 2022 - link

    They compared CZN at 15W vs RMB at 28W.
  • blanarahul - Tuesday, January 4, 2022 - link

    Agreed. Would have loved to see a 6 core 12 CU part. Then again they need customers for the 16 CU 6500M and 6300M. It’s not like laptop makers will bundle AMD GPUs with Intel CPUs.
  • nandnandnand - Tuesday, January 4, 2022 - link

    I never expected to see 12 CUs or 6 CUs paired with the 6-core. 10 CUs is a reasonable cut-down, 8 CUs = meh but OK. 6 CUs doesn't make sense from a yield standpoint.
  • JasonMZW20 - Tuesday, January 4, 2022 - link

    Looks like they just cut 1 entire shader array for 6600U. In RDNA, the shader arrays are paired to form one shader engine. This might've been the only clean option that didn't require extra silicon/work.
  • lightningz71 - Tuesday, January 4, 2022 - link

    It appears that, according to the Lenovo Z series laptop release, there will be a Ryzen 7 Pro 6000 CPU with 8CUs (4 WGPs). That may be the home for Rembrandt chips that have one or possibly two failed WGPs, but have both functional shader arrays.
  • dotjaz - Saturday, January 8, 2022 - link

    5WGP might not be possible considering AMD have been cutting them in pairs.
  • nandnandnand - Sunday, January 9, 2022 - link

    A WGP is 2 CUs, right? So they went from 6 to 3. That's not cutting by pairs.
  • CBeddoe - Tuesday, January 4, 2022 - link

    If people are looking for 12 CUs they are more likely to be looking for 8 cores. No need for bifurcation for a limited market.
    The 6-core U CPUs are going to end up in lower-end devices with smaller batteries.
  • neblogai - Wednesday, January 5, 2022 - link

    Navi 24 dGPUs will be another tier, even compared to the 12CU iGPU. What AMD did by going straight from 12 CUs to 6 CUs is upselling: encouraging users that want a good GPU to get at least a Ryzen 7. Previously there was only a minor ~10% actual gaming performance difference between, say, the R5 5500U and R7 5700U, and little incentive to pay for the higher-tier APU.
  • Spunjji - Wednesday, January 5, 2022 - link

    Agreed. Should have been at least 8, 10 would be even better.

    Best guess is that they're anticipating yield issues with the GPU part of the die.
  • dotjaz - Saturday, January 8, 2022 - link

    They did leave the door open by skipping 6700.
  • lemurbutton - Tuesday, January 4, 2022 - link

    Boring. It's about 2 generations behind Apple and one behind Alder Lake.
  • lemurbutton - Tuesday, January 4, 2022 - link

    Weak ST. No big.Little. Fewer cores than both Apple and Intel. Far weaker GPU than Apple. No ML acceleration. Significantly worse efficiency than Apple M series. Much worse ST/MT than Alder Lake laptop. Weaker performance than an iPhone 13.

    Will make good budget laptops though.
  • Irish_adam - Tuesday, January 4, 2022 - link

    Do you have a source for that? Why does it need big.LITTLE? Intel released big.LITTLE because it could not match AMD's efficiency. Where are you getting your figures from? How is it much worse than an Alder Lake laptop? There are zero benchmarks out yet, and we won't even get the Intel benchmarks until their CES event. Also, proof that it's significantly less efficient than Apple? Weaker performance than an iPhone 13? Seriously, you'd best be a troll, because no one should be that dense and ill-informed.
  • lemurbutton - Tuesday, January 4, 2022 - link

    Check leaked Alder Lake geekbench5 benchmarks.
  • Qasar - Tuesday, January 4, 2022 - link

    yea, cause we all know geekbench is trustworthy and not intel biased.
  • AhsanX - Tuesday, January 4, 2022 - link

    Since when did Geekbench become Intel biased?
  • lemurbutton - Wednesday, January 5, 2022 - link

    Since Intel beat AMD in Geekbench. Then AMD fanboys say Geekbench is biased.
  • Qasar - Wednesday, January 5, 2022 - link

    so says the intel shill

    go read some forums, there are quite a few that question geekbench and how reliable it is. there was an article about it that i was able to find a few months ago that showed it. heck, google and see how many others question how valid geekbench is.

    "Since Intel beat AMD in Geekbench"? just like how, when intel started to lose in cinebench, even INTEL downplayed its importance? and downplayed benchmarks as a whole? come on lemurbutton, shill harder.
  • dotjaz - Saturday, January 8, 2022 - link

    Are you stupid? Intel CPUs hadn't been able to beat Zen 2/Zen 3 until Alder Lake, so how is it not biased if Geekbench favours Intel?
  • Spunjji - Wednesday, January 5, 2022 - link

    Geekbench isn't "Intel biased" per se, but it often flatters Intel devices relative to their general application performance - in the case of Tiger Lake and, most likely, Alder Lake, the short span of the testing plays to Intel's strengths (high instantaneous clock speeds) and doesn't reflect their weaknesses (high power draw making it impossible to sustain those clock speeds).

    With Alder Lake, it remains to be seen whether the E cores help Intel with performance under long loads. Geekbench results won't tell you that.
  • abufrejoval - Thursday, January 6, 2022 - link

    As you say, the burstiness of Geekbench tends to play to Intel's turbo power: I was a little shocked to see my Tiger Lake NUC reach rather similar results as my Ryzen 9 5950X on the single threaded benchmarks.

    Yet it also reflects the vast majority of workloads, like web page renders, that just eat gigahertz much more readily than cores, while interactive work gives them plenty of time to cool off between bursts.

    Ultimately my 8-core 5800U based notebook had a hard time outperforming my 4-core Tiger Lake NUC even on very parallel steady workloads, because those 8 Ryzen cores just had to slow below 2 GHz to stay under the 15 Watt limit I had also imposed on the Tiger Lake NUC beyond 10 seconds to keep the fan off.

    My guess is that the 6-core Ryzen APUs (and CCDs) are mostly having cores disabled for power binning reasons rather than outright defects and that at identical voltages per core an 8-core Ryzen-U would find it very difficult to outperform a 6-core Ryzen-U, because they are too close to the "CMOS knee".
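    To put a toy model behind that "CMOS knee" intuition (all the constants below are made up purely for illustration, not measured Cezanne/Rembrandt numbers):

        # Toy model: dynamic power ~ C * V^2 * f, with voltage rising with frequency.
        def core_power(freq_ghz, v_min=0.7, v_per_ghz=0.15, c=2.0):
            v = v_min + v_per_ghz * freq_ghz
            return c * v**2 * freq_ghz            # Watts per core (made-up curve)

        def package_throughput(n_cores, budget_w=15.0):
            # Highest per-core clock that fits the package budget; return cores * GHz.
            f = 5.0
            while f > 0.1 and n_cores * core_power(f) > budget_w:
                f -= 0.01
            return n_cores * f, f

        for cores in (6, 8):
            perf, f = package_throughput(cores)
            print(f"{cores} cores: ~{f:.2f} GHz each, relative throughput {perf:.1f}")

    With those assumed curves the 8-core only ends up single-digit percent ahead of the 6-core inside the same 15 W envelope, which is exactly the effect described above.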
  • Irish_adam - Tuesday, January 4, 2022 - link

    You mean the benchmarks that put Alder Lake 12% up on ST and 26% up on MT against the 5000 series? So basically an identical performance uplift to the 6000 series, but probably with worse battery life, power usage and thermals. So yeah, I think it's you that needs to do some googling.
  • AhsanX - Tuesday, January 4, 2022 - link

    Yeah definitely "26%" faster.

    https://videocardz.com/newz/intel-core-i7-12700h-o...

    Also, those numbers in AMD's slides are CZN at 15W vs RMB at 28W.
  • Spunjji - Wednesday, January 5, 2022 - link

    Cool, leaked figures from the worst benchmark with no indication of device or TDP. Very reliable. You seem like a smart guy and definitely not a shill.
  • Sourav Das - Wednesday, January 5, 2022 - link

    The Ryzen 7 5800U was already comparable to the M1. What are you talking about?
  • Spunjji - Wednesday, January 5, 2022 - link

    Some people get paid to do this stuff. Others are just weirdly vexed by specific companies.
  • Spunjji - Wednesday, January 5, 2022 - link

    "Weak ST" - hardly!
    "No big.Little" - doesn't need it.
    "Fewer cores than both Apple and Intel" - More large cores than Intel, though...
    "Far weaker GPU than Apple" - cool, try gaming on Apple - or getting it in a <$1000 device.
    "No ML acceleration" - nobody cares.
    "Significantly worse efficiency than Apple M series" - not actually true.
    "Much worse ST/MT than Alder Lake laptop" - remains to be seen, likely "much" is a lie here.
    "Weaker performance than an iPhone 13" - absolute rot.

    Just more anti-AMD FUD
  • Nate_on_HW - Tuesday, January 4, 2022 - link

    Didn't they confirm AV1 though?
    Only the rumours said no.
  • nandnandnand - Tuesday, January 4, 2022 - link

    AV1 is confirmed. They said it aloud during the presentation. Likely decode only.
  • Xajel - Tuesday, January 4, 2022 - link

    I wish the laptops would also be available with NVIDIA GPUs, talking mainly about the G14 and the 16" laptops.

    I use it for content creation, where an RTX GPU gives me a huge boost in performance in some apps (Blender, V-Ray). Blender announced they will support Radeon eventually, and V-Ray doesn't have anything, but I honestly can't rely on AMD's support for these compared to NV.
  • Spunjji - Wednesday, January 5, 2022 - link

    There will be plenty with Nvidia GPUs, just as there were with the previous generation.
  • Tams80 - Friday, January 7, 2022 - link

    Because there's a real shortage of Nvidia laptops, right?

    I'll give you that there's nothing quite like the G14 on the Nvidia side this year, but the difference compared to other laptops other than aesthetics is, what? A single replaceable DIMM slot?
  • m53 - Tuesday, January 4, 2022 - link

    Underwhelming CPU performance. For the 35W+ SKUs, the die area would have been better utilized by adding more cores instead of the additional CUs, since most of those systems will come with discrete GPUs anyway. Lack of AV1 in 2022 is a weird choice. But maybe the discrete GPUs will have it?
  • nandnandnand - Tuesday, January 4, 2022 - link

    It has AV1 decode. AnandTech flubbed that line in the article. Maybe they meant to say that it doesn't have AV1 ENCODE?
  • lightningz71 - Tuesday, January 4, 2022 - link

    It could potentially be HYBRID encode, where it's some combination of software and some portions in hardware. I doubt it though.
  • brucethemoose - Tuesday, January 4, 2022 - link

    - It has AV1.

    - You basically need an IGP to use the laptop on battery.

    TBH, it's about time we got a decent IGP, even in the 35W designs with a dGPU, just so you don't always have to fire it up.
  • Spunjji - Wednesday, January 5, 2022 - link

    Adding a whole extra die design means adding a whole lot of cost and requires more design resources. I don't think they're there yet. Zen 4 Raphael will have a mobile implementation that should be more pleasing to people who, for whatever reason, need more than 8 CPU cores in a notebook.
  • Rezurecta - Tuesday, January 4, 2022 - link

    What's with these companies? AMD benches a 28W chip vs a 15W chip. "Hey guys, it's 2x faster than the previous generation chip!" ..... Yeah, thanks.
  • Spunjji - Wednesday, January 5, 2022 - link

    I think they noticed that Intel got away with it for Ice / Tiger Lake and decided they wanted a piece of that. I'm not excusing them, to be clear - it's transparent bullshit whichever way you slice it. But they probably got tired of being kicked around at 15W while Intel play at 28W.
  • sing_electric - Saturday, January 15, 2022 - link

    All the more reason to have graphs that show both total performance (since sometimes that matters, even on a laptop, where in practice thermal constraints from OEM cooling choices, and things like whether it's actually sitting on a lap with your jeans blocking the vents, come into play) and perf/W.

    AMD likely looks better on perf/W, but they've been dinged in the past when previous designs didn't support LPDDR, and then when they did, OEMs were too cheap to implement it, meaning that even if the CPU was more efficient, the device wasn't, and battery life was worse.
  • techjunkie123 - Friday, January 7, 2022 - link

    I agree that they should have had the TDPs on the comparison slide. But actually to improve performance by 2x while increasing power consumption by ~1.8x is not a trivial achievement, especially without a significant node improvement. My guess is that 6000 series will use more power while boosting higher for CPU / GPU, but the lower idle states will enable more power to be saved overall. Let's see what happens - excited for the reviews.
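    As a rough sanity check on those ratios (taking the 2x performance claim and the 15 W -> 28 W TDPs at face value; real package power could differ):

        # Implied perf/W change if the 2x figure came from a 15 W CZN vs a 28 W RMB part
        perf_ratio = 2.0
        power_ratio = 28 / 15                         # ~1.87x nominal TDP
        perf_per_watt_ratio = perf_ratio / power_ratio
        print(f"perf/W ratio: {perf_per_watt_ratio:.2f}x")   # ~1.07x at these two points

    So even measured at mismatched TDPs the efficiency still comes out ahead, and it should look better again at iso-power, since performance usually scales sublinearly with power.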
  • skavi - Thursday, January 6, 2022 - link

    I hope AnandTech can do a head to head comparison of the Thinkpad Z13 and XPS 13 Plus. Ryzen 6800U vs Core 1280P is the most interesting matchup in some time. And now we finally have an AMD notebook which seems to compete with XPS in terms of fit and finish.
  • techjunkie123 - Friday, January 7, 2022 - link

    Yes, and I hope they include performance (CPU / GPU) as well as battery life.

    XPS 13 Plus doesn't include a headphone jack, which is a bad decision imo.
  • adamjon23 - Saturday, January 8, 2022 - link

    Hi thank you for sharing this amazing info !!
  • TekCheck - Sunday, January 16, 2022 - link

    What does DisplayPort 2.0 "ready" mean? I am always worried when I see the word "ready", remembering the "HD ready" label from years ago.

    1. Does the RDNA2 iGPU natively support UHBR10 data output at 40 Gbps?
    2. The dGPU listing from VideoCardz shows support for DP 1.4 only. Is there a level shifter chip on OEM boards that does the job for DP 2.0?
  • Brane2 - Tuesday, January 18, 2022 - link

    With built-in spyware: Microsoft's "Pluton".
    Without anything to give the public insight into what exactly is inside and how it works.
    I'll pass, thanks.
  • FlimFlamSam - Wednesday, February 23, 2022 - link

    Wow, USB4? I missed that spec announcement, but I can finally search specifically for hubs and devices that implement it! Excellent!

    I run two ultrawides on a MacBook Pro, and I had to mothball the two flat panels I had before, because I could not find sufficiently wide USB 3.2 / USB 3.1 Gen 2 / USB3@10Gbps hubs to get back to a clean, neatly-cabled desk surface. Amazon/Ali/eBay search engines do not sufficiently annotate products to be able to locate devices without going with a powered dock with a million ports I wasn't gonna use.

    P.S. Seriously, how hard is it to make a 4x or 8x USB-C 3.2 10 Gbps hub?
