
  • TheinsanegamerN - Tuesday, March 29, 2016 - link

    Perhaps tiny performance increments and high prices are killing Intel's sales? Why buy Skylake when a five-year-old Sandy Bridge chip can play pretty much anything on the market?
  • tarqsharq - Tuesday, March 29, 2016 - link

    My friend's 3-year-old i7-3820's motherboard blew up last weekend. I tried to find a replacement, but had no luck getting an older DDR3 LGA2011 mobo on the cheap.

    Looked at the benchmarks for the newer chips to show him the performance difference if he upgraded CPU/mobo/RAM... and not much, not much at all.
  • beginner99 - Wednesday, March 30, 2016 - link

    Don't disagree. The reason to upgrade nowadays is either that the CPU is older than 5 years or that the platform is outdated. With a new mobo you can get USB 3.1, M.2 SSDs and so forth, depending on how old your current system is.
  • StrangerGuy - Wednesday, March 30, 2016 - link

    The only thing that really matters is USB 3.0. The rest (even SATA3 over SATA2) are luxury features that hardly make a dent in real-world performance outside of niche circumstances.
  • Achaios - Wednesday, March 30, 2016 - link

    I disagree. Three years after I bought my Haswell system, I am still trying to find a use for my USB 3.0 port. So far, I have found none, and I am still using the USB 2.0 ports that came with my Cooler Master COSMOS case.
  • Strunf - Wednesday, March 30, 2016 - link

    Maybe because you haven't bought any USB hard drives... the difference between transferring files over USB 2 and USB 3 is HUGE!
  • TheinsanegamerN - Wednesday, March 30, 2016 - link

    USB 3 flash drives show the difference better, IMO. But even then, that's a single use case, and most people rarely have to back up large amounts of data more than once.
  • azazel1024 - Wednesday, March 30, 2016 - link

    I am going to go out on a limb and say a sample size of one (you) isn't indicative of most people. It could be rare, but I doubt it. I don't use my USB3 ports all the darned time, but often enough to be really glad I have them. I use the USB3 port on my tablet keyboard at least once a month for my GbE adapter. I use the USB3 port on my desktop to back up all of my files once a month (they are also on my server, but I like having 3 copies of everything, with one of them a cold backup, just in case). Since that is an easy 25-40GB of data... yeah, transferring at 120+MB/sec is really, really nice instead of being stuck at maybe 35MB/sec. I also use my USB3 SD card reader at least a couple of times a month to pull a few GB of files off the SD card from my Olympus OM-D EM-5II camera.

    So yeah, I use the much higher speed of USB3 ports quite a bit.
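
    As a rough sketch of what that gap means for my monthly backup (assuming a 30GB run, somewhere in my 25-40GB range, and the real-world ~35MB/sec and ~120MB/sec figures above):

    # Back-of-the-envelope transfer times for one backup run.
    backup_gb = 30     # assumed: a point in the 25-40GB range above
    usb2_mb_s = 35     # rough real-world USB 2.0 throughput
    usb3_mb_s = 120    # rough real-world USB 3.0 throughput

    for name, rate in [("USB 2.0", usb2_mb_s), ("USB 3.0", usb3_mb_s)]:
        minutes = backup_gb * 1024 / rate / 60
        print(f"{name}: {minutes:.0f} min for {backup_gb}GB")
    # USB 2.0: 15 min for 30GB
    # USB 3.0: 4 min for 30GB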

    On SATAII vs III, I dunno man, I certainly noticed the difference between my old 60GB Vertex II vs 120GB Vertex III drive. It isn't worlds different like going from an HDD to an SSD, but application load times, scratch disk performance, etc. were noticeably faster, at least 25-33% in many things (and when it comes to workloads, a second or two saved, a hundred times over, adds up). I'd imagine an even newer SATAIII SSD vs the best SATAII SSD, or running that new SATAIII SSD on a SATAII port, would show even more noticeable performance differences. That doesn't even touch on an M.2 SSD that is probably 3-5x faster than a SATAII SSD.

    There is also PCI-e 3.0 vs PCI-e 2.0, plus a lot more free PCI-e lanes on the newer boards (4x PCI-e 2.0 vs 16x PCI-e 3.0).

    If you are using an embedded WiFi chipset on the board, an old Sandy Bridge board was probably running 2x2 802.11n or maybe 3x3 802.11n, whereas a new board probably has 3x3 802.11ac (maybe 2x2).

    DDR4 vs DDR3.

    Probably a few other differences I am not thinking of. Oh, and the newer processors tend to be around 25-40% faster depending on the workload, and use less power to do it. Also the dGPU, if that is your thing, is several times faster.

    I am back on Ivy, but even just the difference between Ivy and Sandy for board features was really nice. Yeah, the CPU improvements between Sandy and Skylake are relatively mild generation over generation, but they do add up. Now, in most cases the raw CPU performance means there is nothing Skylake can run that Sandy won't do a fine job of. If it is a workstation for you, though, the extra processing power is nice. I don't rely on my desktop to perform transcodes in a particularly timely manner; an aggressive transcode taking 3hrs run overnight instead of 2hrs on a newer and faster processor is no skin off my nose. I do rely on it for image manipulation, and having something 15-30% faster than my Ivy would be really nice. It would probably shave 5-10 minutes off a 90-minute working session for me.

    There are plenty of people who don't need or want any of that, and plenty of people who do. I'd like it, but I also don't feel like I am hurting that badly between Ivy and Skylake to upgrade my desktop. Depending on what Kaby or Cannon bring to the table, even if it is still just incremental, it might be time for me to upgrade. Something like Skylake-E also might be worth the jump (6 cores would be really, really nice for some of my work).

    I am thinking I'll upgrade my G1610 Ivy server to Skylake soonish. I want to upgrade it to Windows 10 (actually I don't, but I semi-feel like I need to before the cut-off of the free upgrade), and I feel like dropping $200 on hardware isn't a bad deal then. If I can find a not-too-expensive dual Intel NIC motherboard, a Pentium/Celeron Skylake processor (Celeron Skylake doesn't seem to actually be on the market yet) and either 8 or 16GB of DDR4, it would represent probably a 50-70% increase in CPU performance, all the fancy new chipset features, and also I'd imagine a good 10-30% reduction in power consumption (which is more of a sop, but I wouldn't mind running 2-6W less power when the thing is on 24x7).
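
    To put that 2-6W in perspective, a quick sketch (the ~$0.12/kWh electricity rate is my assumption, not a number from anywhere above):

    # Rough annual cost of 2-6W of extra draw on a 24x7 box.
    rate_per_kwh = 0.12      # assumed electricity price, $/kWh
    hours_per_year = 24 * 365

    for watts in (2, 6):
        kwh = watts * hours_per_year / 1000
        print(f"{watts}W for a year: {kwh:.0f} kWh = ${kwh * rate_per_kwh:.2f}")
    # 2W for a year: 18 kWh = $2.10
    # 6W for a year: 53 kWh = $6.31

    So yeah, a sop in dollar terms, but it all counts on a machine that never sleeps.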
  • StrangerGuy - Wednesday, March 30, 2016 - link

    "On SATAII vs III, I dunno man, I certainly noticed the difference between my old 60GB Vertex II vs 120GB Vertex III drive."

    Two different SSDs, and you conclude it's the interface rather than the drives themselves producing the subjective speedup, which of course has nothing to do with a concept called confirmation bias. You would make a great scientist; sarcasm very much intended.

    And smart people don't buy things just because of raw speeds.
  • marc1000 - Thursday, March 31, 2016 - link

    Agreed. Owner of a P67 board with USB3 speaking: my desktop has two ports and that is enough.

    My Sandy Bridge laptop, on the other hand, has no USB3 port at all. That's the only reason I'd consider buying a new one, but as I use it less often, I prefer to save the money.
  • Kutark - Monday, April 4, 2016 - link

    Uh, I'm not sure what you're smoking, but M.2 PCIe absolutely makes a difference outside of niche use.

    What's particularly hilarious is that USB 3 is actually relatively unimportant, as very few devices are capable of saturating a USB 2 port anyway.

    For example, Class 10 SD cards and the like top out at right about the speed USB 2 maxes out at. Really the only scenario I can think of where USB 3 is useful is if you have an external SSD, or a very fast flash drive like the SanDisk Extreme.
  • yuhong - Wednesday, March 30, 2016 - link

    I think you can thank all the Xeon E5-2670 v1s on the market for that.
  • Kutark - Monday, April 4, 2016 - link

    RAM isn't super expensive; if you have to, just grab a newer 2011 board and 8 or 16GB of DDR3 and call it a day. The 3820 is still a fantastic proc.
  • Samus - Tuesday, March 29, 2016 - link

    I was thinking the same thing. Intel is having a hard time getting people to bite. Skylake offers mild performance improvements over Sandy/Ivy and almost no improvement (in the desktop space) over Haswell.

    Overclocking is an issue, too. A lot of people are perfectly content with their 4+ GHz OCs on previous-gen systems, and Skylake is a wash because it does NOT have noticeably better overclocking headroom.
  • AnnonymousCoward - Tuesday, March 29, 2016 - link

    > Skylake offers mild performance improvements over Sandy

    I don't get why people say this. Skylake is 40-80% faster than the 2600K according to this:

    http://www.anandtech.com/show/9483/intel-skylake-r...
  • tarqsharq - Tuesday, March 29, 2016 - link

    Because most people on this site are looking at this page:
    http://www.anandtech.com/show/9483/intel-skylake-r...
  • nevcairiel - Wednesday, March 30, 2016 - link

    The problem with gaming, of course, is not that the CPUs aren't faster; it's that games don't necessarily need more CPU performance. Maybe this changes with a new generation of GPUs, maybe it doesn't.
  • OrphanageExplosion - Wednesday, March 30, 2016 - link

    These benchmarks are GPU bound and pretty much useless for CPU testing. Benchmarking a CPU for gaming is pretty tricky - you need to find CPU-bound areas, then you need to find a repeatable test.

    The funny thing, of course, is that Anand's non-gaming benchmarks show a huge leap. The question this site needs to ask itself is why one computational load (the non-gaming benchmarks) shows a huge increase while another (gaming) does not.

    The answer is pretty straightforward - for the vast majority of the duration, it's not the CPU that is being tested. Extending that to the non-gaming benches - imagine something like a WinRAR test on actual files hosted on a 5400rpm mechanical drive. You'd get similarly inaccurate results for the same reason.

    Simple test for Anand: change your GTA settings so all the advanced settings are maxed. That creates a CPU load not even a 5960X can handle when paired with a 980 Ti. Then try some other games that do actually max the CPU. They're not difficult to find, but they don't have canned benchmarks.
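
    To make the measurement point concrete, here's a minimal sketch (my own illustration, nothing from the article) of what a repeatable, CPU-bound test looks like in miniature. Point the same timing at files on a 5400rpm drive instead and you're mostly benchmarking the drive, which is the WinRAR trap above:

    import time
    import zlib

    # A repeatable CPU-bound workload: compress a fixed in-memory buffer.
    # The input is already in RAM, so the timing reflects the CPU,
    # not a storage device.
    data = bytes(range(256)) * 4096   # ~1MB of fixed input

    runs = []
    for _ in range(10):
        start = time.perf_counter()
        zlib.compress(data, 9)        # pure CPU work
        runs.append(time.perf_counter() - start)

    print(f"best of 10: {min(runs) * 1000:.1f} ms")  # best-of-N is repeatable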
  • TheinsanegamerN - Wednesday, March 30, 2016 - link

    In games, the difference is much smaller. The problem is that benchmarks do not equal real-life performance.

    I agree that Skylake is impressive, but when Sandy Bridge can play anything out today, and Core 2 Duos are more than enough for people who don't play games or do intensive computations, Intel is in trouble.
  • Ubercake - Wednesday, March 30, 2016 - link

    My sentiments, exactly.

    I used to like upgrading my CPU every few years, but I'm finding fewer compelling reasons to do so. I have no reason to get rid of my 3930K any time soon, and there's still plenty of bandwidth and then some for my GTX 980 on PCI-e 2.0. I do find I'm running out of SATA 3.0 connections as time passes, but larger SSDs are coming down in price and replacing my older, smaller-capacity ones.

    I think Intel's lack of big performance jumps is due to AMD's lack of a competing product.

    The good thing is it's saving me hundreds of dollars every few years. There's really no reason to upgrade any time soon. Maybe the good ol' 3930K will be a 10-year processor? That would be insane. I have never gone this long without feeling the need to upgrade a processor/motherboard combo.
  • SunLord - Wednesday, March 30, 2016 - link

    I'm still rocking a Phenom II X6 1055T and two 7970s, and I've not upgraded because I'm waiting for the high-end 14nm GPUs from AMD and Nvidia to ship. I figure by the time those arrive I can pick between Intel's Kaby Lake, AMD's Zen, or maybe Broadwell-E. It's been almost 6 years and the X6 is really starting to show its age, though the 7970s were fine until I moved to a 21:9 monitor.
  • Gunbuster - Tuesday, March 29, 2016 - link

    The 6700K is $339.99 at Micro Center brick and mortar, but you have to pay sales tax in your locality...
  • nathanddrews - Tuesday, March 29, 2016 - link

    But you also get $20-50 off a compatible motherboard, so it usually still works out.
  • AnnonymousCoward - Tuesday, March 29, 2016 - link

    Yep, Microcenter rules.
  • ddarko - Tuesday, March 29, 2016 - link

    Anyone looking for an i7-6700K who can get to a Microcenter should head to one; it's on sale for $340 as of the time of this post.
  • JoeyJoJo123 - Tuesday, March 29, 2016 - link

    No. Don't even bother.

    Online price at B&H:
    $363.95 base + $0 tax + $0 shipping = $363.95, delivered to your doorstep.

    In-store price at Microcenter:
    $339.99 base + $28.05 tax (8.25% in my area) + $??? gas = $368.04 + gas to get it to your doorstep.

    Microcenter is only worthwhile when they're running a big loss leader on CPU prices (example: a Black Friday sale price on an Intel unlocked processor). Besides that, the 6700K is an awful value proposition for anyone building a PC solely for gaming: at $230 for a 6600K vs $360 for a 6700K, you pay over 50% more, but you only get a 10-20% performance increase, and only on heavily multithreaded tasks.
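
    Sanity-checking the out-the-door math (the 8.25% is my local rate, as above):

    # Delivered price comparison, gas not included.
    bh_price = 363.95      # B&H: no tax, free shipping
    mc_price = 339.99      # Microcenter in-store
    tax_rate = 0.0825      # my local sales tax

    mc_total = mc_price * (1 + tax_rate)
    print(f"B&H:         ${bh_price:.2f}")
    print(f"Microcenter: ${mc_total:.2f} + gas")
    # B&H:         $363.95
    # Microcenter: $368.04 + gas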

    I still don't understand why so many people make the mistake of going for a 6700K + GTX 970 for gaming when they'd be much better off with a 6600K + GTX 980.
  • bigboxes - Tuesday, March 29, 2016 - link

    Maybe gaming isn't their only interest.
  • AnnonymousCoward - Tuesday, March 29, 2016 - link

    Because the CPU needs to have a much longer lifespan, and Pascal will destroy the 970 or 980 equally.
  • TallestJon96 - Wednesday, March 30, 2016 - link

    Exactly. My reasoning behind my new i7-6700 + GTX 970 combo is that I will have fantastic gaming performance for years by only upgrading the GPU. I'll go through what, 3 GPUs before an i7 with DDR4 isn't enough? It's worth the money in the long run for sure.
  • Impulses - Tuesday, March 29, 2016 - link

    If it's solely for gaming, yeah, it's a dubious allocation of your budget... I use my PC for Lightroom and other photography tasks though; they're multithreaded, but not particularly well...

    A fast quad that OCs better and comes with a more modern platform ends up being the sweet spot compared to a hex, or to a part without HT. Given the long upgrade cycles, I expect this will be the last time that happens, though.
  • Nottheface - Thursday, March 31, 2016 - link

    You should have got it on 3/14-3/16 like I did, for $314.15 at Microcenter. It prompted me to finally give in and build a new PC.
  • Arnulf - Tuesday, March 29, 2016 - link

    How many different Skylake dies does Intel manufacture? (including mobile/SoC variants)
  • hojnikb - Tuesday, March 29, 2016 - link

    I bet they split them up according to iGPU/core count:

    2-core/HD 510-530 gets its own die
    4-core/HD 510-530 its own
    2- or 4-core/Iris 550 its own
    4-core/Iris 580 its own

    Something like that would make sense.
  • Shadowmaster625 - Tuesday, March 29, 2016 - link

    It is still an insane ripoff. It costs $6 per square millimeter of usable silicon (not counting the area consumed by the IGP, because it is useless). Since when is $6 per square millimeter a fair price?
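
    For what it's worth, here's roughly how a number like that falls out. The die size and IGP share below are my assumptions (commonly cited estimates), not official Intel figures:

    # Rough reconstruction of a dollars-per-mm^2 figure.
    price = 350.0       # assumed: rough street price of a 6700K
    die_mm2 = 122.0     # assumed: commonly cited Skylake 4+2 die size
    igp_share = 0.5     # assumed: the GPU takes about half the die

    usable_mm2 = die_mm2 * (1 - igp_share)
    print(f"${price / usable_mm2:.2f} per usable mm^2")   # ~$5.74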
  • bigboxes - Tuesday, March 29, 2016 - link

    Don't buy it then. We all have to do a self-evaluation on price/performance that matches our need(s).
  • AnnonymousCoward - Tuesday, March 29, 2016 - link

    Are you buying blank silicon, or did Intel happen to spend $50 billion to develop the technology in it?
  • willis936 - Tuesday, March 29, 2016 - link

    Intel happened to dump that money into it so people will happily accept it when the dies get half the size and twice the cost overnight.
  • Kutark - Monday, April 4, 2016 - link

    As usual, you are fundamentally misunderstanding how these businesses operate. When you're talking about something like, say, spatulas, then yes, materials and labor are the majority of the production cost. When you're talking about something like pharmaceuticals, the vast majority of the cost is R&D recoupment. That's why an Oculus Rift, which has maybe $100 in materials, costs $800: they spent a ton on R&D.
  • jwcalla - Tuesday, March 29, 2016 - link

    Too expensive for what you get.
  • rm19 - Tuesday, March 29, 2016 - link

    Best Buy, you suck.
  • BrokenCrayons - Wednesday, March 30, 2016 - link

    I'd much rather see more 128MB Iris Pro CPUs on the market for desktops than see time wasted on unlocked multipliers, which have far less of a point. CPU performance hasn't been as relevant or important since Sandy Bridge, but GPU performance remains a significant concern. In turn, that means an unlocked and overclockable CPU offers far fewer rewards, in a smaller number of use cases, than additional GPU power would. Intel needs to dedicate a lot more die area to graphics as desktop computers become progressively smaller.
  • Pissedoffyouth - Wednesday, March 30, 2016 - link

    >I'd much rather see more 128MB Iris Pro CPUs on the market for desktops

    Yup, this. I want to build a new APU gaming PC to replace my AMD APU.
  • AnnonymousCoward - Wednesday, March 30, 2016 - link

    Why don't you just get a discrete GPU?
  • BrokenCrayons - Thursday, March 31, 2016 - link

    For the moment, many of us are obliged to do just that. I have a low-profile GDDR5 GeForce GT 730 that suffices for any games I currently play, but I'd prefer fewer additional PCBs, smaller motherboards and cases, and fewer cooling fans. My current gaming PC has a standard-sized ATX motherboard in a big clunky Lian Li case on wheels. It's shoved in a corner next to my router where I don't have to look at it, and I game on it exclusively through Steam in-home streaming. If I had my way, a NUC-sized system would do that work, so instead of being in the way of my cat's food dish, it could be placed adjacent to my router on the wall where network appliances belong. That way inattentive children, lovable but inattentive spouses, and pets are less likely to blunder into my candy crusher machine.
  • AnnonymousCoward - Thursday, March 31, 2016 - link

    A quick check shows a reasonable discrete GPU is 8 times faster than the Intel HD 530 iGPU. Today's discrete cards are just fast enough to run 4K@60Hz. 1080p60 is 1/4th the pixel rate, so an iGPU at 1/8th the performance couldn't even handle that for a lot of games. Then there are faster refresh rates like 144Hz, or the Rift's 90Hz. iGPU performance isn't even close.
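
    Working that out with the figures above (the 8x ratio and the just-manages-4K60 discrete card are from my quick check):

    # Pixel throughput demanded at each target, in pixels per second.
    fps = 60
    px_4k = 3840 * 2160 * fps
    px_1080 = 1920 * 1080 * fps
    print(f"4K60 needs {px_4k / px_1080:.0f}x the pixels of 1080p60")   # 4x

    # If a discrete card that just manages 4K60 is ~8x the iGPU,
    # the iGPU delivers about half of what 1080p60 demands.
    igpu_fraction = 1 / 8
    print(f"iGPU vs 1080p60 demand: {igpu_fraction / (px_1080 / px_4k):.2f}x")
    # iGPU vs 1080p60 demand: 0.50x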

    Plus a CPU and iGPU need to share their thermal budget.

    Unless you live in one room, I wouldn't think one cubic foot of difference between an ATX tower and a NUC would be that big of a deal. But for an HTPC, I get it.
  • BrokenCrayons - Thursday, March 31, 2016 - link

    A huge number of people don't care about 4K in the slightest or have any interest in VR. I understand that it's important, and people who want to play games at high resolutions should be free to continue buying the hardware necessary to do so. I'd also love to see the tech improvements needed to bring high-resolution entertainment and VR down to mid- and low-end graphics processors. However, for someone like me who is perfectly happy playing the occasional older title at 1366x768, an iGPU like the 540 or 530 is actually a pretty effective solution, regardless of whether or not a thermal budget is being shared.

    While I have a lot of square footage available to me, I personally don't want an unsightly desktop lying around in plain view. My home is nicely decorated, since I like to dabble in interior design as a hobby, but desktop computers in particular are not easy to work into an overall room design. They're usually large, monolithic objects with an octopus of wires jutting from the backside that need to run places in plain view. Unless you decorate in darker colors to match cables and cases (which looks too gothic and dreary in my opinion), computing components tend to stand out. A laptop can be closed, unplugged, and put into an end table drawer easily, and a headless NUC-sized PC can be put someplace out of the way more easily than a full or even micro tower. While I have enough money and leisure time to purchase or even paint a case, it's not something I want to invest time into when there are other options looming on the horizon that would give me more time to do the things I love. After all, saving us time is one of the things computers were intended to do from the outset. It was one of the main pitches when I bought a Tandy RL... those time savings... at least until getting a multimedia computer was a big deal and that became the main selling point.
  • AnnonymousCoward - Friday, April 1, 2016 - link

    My tower sits on the opposite side of my desk, near the wall, so it's not visible unless you go over there. It also has slow 140mm fans in an R5 case, making it virtually silent. It's invisible to the environment. That's better than a NUC in plain sight, or a NUC you have to put away when you're not using it.

    It also makes no compromises, with an optical drive, a 4.4GHz Skylake chip, and a discrete GPU (which is also upgradable for many years to come).

    Games at 1366x768 must look awful at full screen, or tiny if displayed at 1:1 pixels. No thanks.
  • Kutark - Monday, April 4, 2016 - link

    Gaming APU, that's cute.
