76 Comments
palladium - Wednesday, December 13, 2017 - link
Nice, but the display isn't true 10-bit? Don't know how video/photo professionals will react to that...

Flunk - Wednesday, December 13, 2017 - link
I dunno, but it would be fine for developers, web designers, anyone whose target market doesn't include people with 10-bit screens.

nevertell - Wednesday, December 13, 2017 - link
Web developers should be forced to use a MacBook from 2011, not something with up to 128 gigabytes of memory.

lmcd - Wednesday, December 13, 2017 - link
Web developers include people working on projects consisting of WebGL, WebAssembly, WebMIDI, Web Workers, and IndexedDB. Not the same world anymore.

nevertell - Thursday, December 14, 2017 - link
I'm a web developer myself, I run loadsa containers, I can easily make do with 8 gigs of memory - your dev environment should resemble your production one, but you don't need to run Elasticsearch with a 20-gig index in dev. What I'm really trying to point out is that more web developers should have empathy towards RAM - there is no legitimate reason why Slack needs upwards of a gigabyte of memory to deliver the same functionality WeeChat or irssi deliver using less than 10 megabytes of memory.

ddrіver - Thursday, December 14, 2017 - link
I think he meant you should be using an older system to have some motivation to actually bother optimizing your code.

Yeah, I'm sure none of the self-proclaimed developers that comment here are the kind of lazy bums who expect everybody to put 8 cores and 64GB of RAM in their machines just to run their crappy app or website. I'm just talking about every single other developer who does this. :)

WinterCharm - Saturday, December 16, 2017 - link
Yeah, those are testing systems, not development systems.

Develop on a fast/good machine. Test on cheap shit.

SeenQuiteALot - Friday, December 15, 2017 - link
Having loads of RAM is not exactly necessary to browse a website. Rather, there should be plenty of RAM on the server machine, to handle multiple requests at the same time.

Of course, this applies to sites not overloaded with JavaScript libraries. But then again, good libraries don't take gigabytes of RAM in the client's browser...

ddrіver - Friday, December 15, 2017 - link
Oh, plenty of websites somehow consume enormous amounts of memory without delivering any special functionality. Also, the fact that browsers are not optimized doesn't help. Yes, it may be unfortunate that all developers are put in the same bucket, but there's some truth to that.

Developers are using overpowered machines to make their work faster, but many will expect that the apparent performance they see will be the same for the user. How do you think browsers ended up needing that kind of memory usage? It's because both the browser and the web developers lazied out and said "the user can just slap another stick of RAM in there".

Zingam - Sunday, December 17, 2017 - link
There is this one web site that eats up 25% CPU even if you just look at the page!

XZerg - Wednesday, December 13, 2017 - link
Obviously you have a very limited understanding of what it takes to be a good web developer... For the short stint I did as a front-end architect/web dev, I needed multiple concurrent Docker containers to host various things, including a database, a caching database, a web server, and WordPress (for client embeds). Then there is compiling code - CSS generation, framework code, ... I have personally pushed 16GB+ of RAM getting end-to-end services running locally - not a feat many web developers do or have to. 18C/128GB seems over the top for web dev, but I presume this system isn't limited to web devs.

I can imagine small-time video/photo editors benefiting from such power. I can also imagine 3D graphics editors too.

All of these use cases can do without a high standard for color reproduction and rather just want the horsepower in a small footprint with higher resolution.
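As a rough illustration of the kind of local stack described above, here's a minimal sketch using Docker's Python SDK (the `docker` package); the image choices, names, and port mappings are illustrative assumptions, not anyone's actual setup:

```python
# Hypothetical local web-dev stack: database, caching database, web server,
# and WordPress, each in its own container. Requires `pip install docker`
# and a running Docker daemon.
import docker

client = docker.from_env()

services = {
    "db":    ("mysql:5.7",  {"3306/tcp": 3306}, {"MYSQL_ROOT_PASSWORD": "dev"}),
    "cache": ("redis:4",    {"6379/tcp": 6379}, {}),                    # caching database
    "web":   ("nginx:1.13", {"80/tcp": 8080},  {}),                    # web server
    "wp":    ("wordpress",  {"80/tcp": 8081},  {"WORDPRESS_DB_HOST": "db"}),
}

for name, (image, ports, env) in services.items():
    # detach=True returns immediately, leaving the container running
    client.containers.run(image, name=name, ports=ports,
                          environment=env, detach=True)

for c in client.containers.list():
    print(c.name, c.status)
```

Four containers like these, plus compilers and watch tasks, make the 16GB+ figure easy to believe, though an 18-core/128GB machine is still far beyond it.
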
deil - Thursday, December 14, 2017 - link
Just push your env to 6 clients with the same codebase and multi-language support and you will go up to 64GB of RAM. Makes life easier...

nerd1 - Thursday, December 14, 2017 - link
Okay, I understand you need a powerful PC for developing web pages... then why on earth are you using a Mac? There is no such thing as a powerful Mac.

jimjamjamie - Thursday, December 14, 2017 - link
Did you scroll directly to the comments without reading the article?

StevoLincolnite - Sunday, December 17, 2017 - link
He's not wrong though. The Mac pales in comparison to a well-equipped PC.

Only having a single 18-core processor? Come on. ;)

Flunk - Wednesday, December 13, 2017 - link
Yeah, the company really wants the developer sitting there waiting for it to build for 15 minutes every time. That makes financial sense.

gerz1219 - Wednesday, December 13, 2017 - link
Most professional users would be hooking up a second monitor to this anyway, which they likely already own. The definition of a "professional monitor" varies pretty widely depending on the user's exact profession. A colorist has very different needs than an editor. At a certain point they can't go too nuts on the monitor because they'll be driving the price up on lots of features that most buyers don't need.

I would argue the professional market really wants a user-upgradable Mac Pro that doesn't suffer from the heating and performance limitations of the trash cans, but this looks like a decent stopgap solution until that happens, especially for creatives who just hate using Windows but can't find enough horsepower in the older iMacs.

nerd1 - Wednesday, December 13, 2017 - link
But I think you need a FirePro/Quadro GPU to drive a 10-bit monitor...

Flunk - Wednesday, December 13, 2017 - link
Gee, it's a good thing this has that...

Radeon Pro replaced FirePro as AMD's professional graphics branding last year.

Alexvrb - Thursday, December 14, 2017 - link
Yeah, I was thinking that as I was reading this... professionals who are diehard Mac would probably be better served with a non-integrated, upgradeable setup. Especially since these supposedly start at $5,000.

nerd1 - Wednesday, December 13, 2017 - link
Wonder how much this hot thing throttles under load *grabs popcorn*

notashill - Wednesday, December 13, 2017 - link
The GPU is already underclocked relative to the normal desktop versions, so there probably won't be much if any throttling.

"The company says that Mac Pro’s cooling system can cope with up to 500 W of heat, so it cannot use a 140 W CPU and a 295 W GPU in order to avoid overheating."

140 + 295 = 435 W, and they don't seem to think it can actually cool the fully clocked Vega, so I kind of doubt the 500 W number.
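One way the numbers reconcile, if the 500 W figure is a whole-system limit rather than CPU+GPU alone (an assumption, not something Apple has stated), with the panel and overhead wattages here being rough guesses:

```latex
% CPU and GPU TDPs are the quoted figures; the 5K panel backlight and
% the RAM/SSD/VRM/fan overhead are assumed, not published, numbers.
\[
\underbrace{140\,\mathrm{W}}_{\text{CPU}} + \underbrace{295\,\mathrm{W}}_{\text{GPU}} = 435\,\mathrm{W},
\qquad
435 + \underbrace{\sim\!70}_{\text{5K panel}} + \underbrace{\sim\!40}_{\text{rest of system}} \approx 545\,\mathrm{W} > 500\,\mathrm{W}
\]
```

On that reading the CPU+GPU pair alone fits with room to spare, but the full chassis would not, which would explain the downclocking.
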
nerd1 - Wednesday, December 13, 2017 - link
And again I wonder who the target audience of this machine is - it cannot run CUDA-based programs, it cannot run the latest games in 4K, it cannot run professional graphics software (no Quadro/FirePro GPU).

It still has a Xeon CPU and ECC RAM... I'm curious who?

fred666 - Wednesday, December 13, 2017 - link
Hipsters with a lot of money

samsonjs - Wednesday, December 13, 2017 - link
People like me. I’m a developer and do a lot of iOS work these days, so I need a Mac. I don’t actually want ECC or the big GPU, but I need all the cores I can get. I’ll bide my time and see what the Mac Pro offers in 2018-2019 and then pick one of those up, or one of these. At a price, sure, but it’s how I earn my livelihood.

(I’ve done the Hackintosh thing and the cost is enticing, but I need more reliability, and flexibility to upgrade software.)

fred666 - Wednesday, December 13, 2017 - link
"I don’t actually want ECC or the big GPU"That's the problem with Apple. You don't get to choose.
Most people who will end-up buying this machine could do just fine with a half the price one, if it was offered.
MonkeyPaw - Wednesday, December 13, 2017 - link
Wouldn’t that just be the 5K iMac then?

samsonjs - Thursday, December 14, 2017 - link
Yeah, it sucks sometimes. I don’t mind ECC; for me it’s like a bonus, though, since I don’t need it. Comes with Xeon in my mind (not sure if Xeon requires it or not). However, the GPU bugs me because I know it’s actually a large part of the price.

Ultimately I don’t truly need an 8+ core Xeon though, I only want it. My quad-core 2014 iMac is still pretty decent.

nerd1 - Wednesday, December 13, 2017 - link
Then you may be just fine with the current or next (Coffee Lake-equipped) iMac.

samsonjs - Thursday, December 14, 2017 - link
Oh yeah, I’m pretty good with my quad-core iMac really. Just obsessed with speed and love having a fast machine. I mean, who here doesn’t, really.

TEAMSWITCHER - Wednesday, December 13, 2017 - link
I did the Hackintosh thing too for a couple of years... I finally came to the conclusion that even a slower "REAL MAC" is more productive. A Mac is a requirement for an iOS developer, but even if you are merely an iPhone user, a Mac is a better PC than a Windows PC... especially if you do all your gaming on a console with a BIG SCREEN TV experience.

nerd1 - Wednesday, December 13, 2017 - link
How the hell is a Mac a better PC than a Windows PC?

Old_Fogie_Late_Bloomer - Wednesday, December 13, 2017 - link
Since he qualified his statement with "iPhone user", I'm guessing he means the handoff tech, a less awful iTunes, the ability to use Messages on your desktop, things like that.

It's not enough to convince me to spend the money, but if you don't game and basically just do casual user stuff, the added convenience might tip the scales for some.

MonkeyPaw - Wednesday, December 13, 2017 - link
Yeah, when Windows Mobile crashed and burned, I eventually landed on iPhone. I find the Mac is a very nice complement to it, as I can send and receive text messages (not just iMessages) from my desktop. Browser syncing is very good, and no Google is needed. Considering Office 365 and Lightroom keys work on either platform, there's really not much for me to miss from Windows. I game on a console, while the Mac is for my desktop work. It's an old 4,1 Mac Pro that I've flashed and upgraded. I don't really ever use iTunes--it's just not required anymore in the land of streaming music.

Manch - Thursday, December 14, 2017 - link
In the interim I would suggest a Dell XPS 15 or an ASUS UX5XX series. Put OS X on a Samsung T5 and run it as a VM. Works great, very stable, no issues. Saves you a boatload of money you can use to get a 4K monitor, a docking station, and some Arby's. Do a couple of searches and you can find all kinds of info about it. If you want to know the exact setup, let me know.

Zarniw00p - Wednesday, December 13, 2017 - link
Radeon Pro = FirePro

ddriver - Wednesday, December 13, 2017 - link
Not quite, unfortunately. Radeon Pro doesn't sport the high FP64 throughput of the FirePro, and it remains to be verified whether it is end-to-end ECC.

For most third-party software it will do, like CAD and stuff. Even consumer cards do, as long as you don't push the limits of what the software can handle.

Most content production software also has OpenCL codepaths, so the lack of CUDA is not a showstopper.
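To make that concrete, here's a minimal OpenCL sketch (via the pyopencl package; the kernel and variable names are illustrative): the same kernel source runs unchanged on an AMD, Nvidia, or Intel device, which is why an OpenCL codepath frees software from CUDA-only hardware.

```python
# Minimal vendor-neutral OpenCL example: double every element of a buffer.
# Requires `pip install pyopencl numpy` and any installed OpenCL runtime
# (macOS shipped one out of the box in this era).
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()      # picks whatever OpenCL device is present
queue = cl.CommandQueue(ctx)

kernel_src = """
__kernel void scale(__global float *buf, const float k) {
    int i = get_global_id(0);
    buf[i] *= k;
}
"""
prog = cl.Program(ctx, kernel_src).build()

host = np.arange(16, dtype=np.float32)
mf = cl.mem_flags
dev = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=host)

prog.scale(queue, host.shape, None, dev, np.float32(2.0))
cl.enqueue_copy(queue, host, dev)
print(host)   # doubled on whatever GPU (or CPU) the runtime chose
```
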
tipoo - Wednesday, December 13, 2017 - link
If the Radeon Pros in the MacBooks are any indication, they're pretty much lower-clocked consumer cards with no ECC.

Manch - Thursday, December 14, 2017 - link
1/16th FP64 and no end-to-end ECC, according to AnandTech.
https://www.anandtech.com/show/11717/the-amd-radeo...

"Meanwhile it’s interesting to note that while Vega 10 is a replacement for Fiji, it is not a complete replacement for Hawaii. 2013’s Hawaii GPU was the last AMD GPU to be designed for HPC duties. Which is to say that it featured high FP64 performance (1/2 the FP32 rate) and ECC was available on the GPU’s internal pathways, offering a high reliability mode from GPU to DRAM and back again. Vega 10, on the other hand only offers the same 1/16th FP64 rate found on all other recent AMD GPUs, and similarly doesn’t have internal ECC. Vega 10 does do better than Fiji in one regard though, and that’s that it has “free” ECC, since the feature is built into the HBM2 memory that AMD uses. So while it doesn’t offer end-to-end ECC, it does offer it within the more volatile memory. Which for AMD’s consumer, professional, and deep learning needs, is satisfactory."
ddriver - Thursday, December 14, 2017 - link
I am still rocking a set of W8100s - there is simply no replacement for them. They currently go at about $900 for 2.1 TFLOPS, which is the best-value product that has been available for a few years.

Although if the Titan V lives up to the hype I will be tempted to replace them with a single card; value is almost identical compute-wise, but I will get a nice bump in compute and especially in graphics performance, and a serious power-efficiency improvement. Quite frankly, the only thing still holding me back is the lack of finalized OpenCL 2.x support.
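The back-of-the-envelope dollars-per-TFLOP (FP64) check behind the "value is almost identical" claim, assuming roughly $900 per W8100 (2.1 TFLOPS FP64) and the commonly cited ~6.9 TFLOPS FP64 for the $2,999 Titan V:

```latex
\[
\text{W8100:}\quad \frac{\$900}{2.1\ \text{TFLOPS}} \approx \$429/\text{TFLOP}
\qquad
\text{Titan V:}\quad \frac{\$2999}{6.9\ \text{TFLOPS}} \approx \$435/\text{TFLOP}
\]
```
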
PaulStoffregen - Thursday, December 14, 2017 - link
Video editors using Final Cut Pro X are almost certainly a huge part of the target audience.

Apple has so far given only one live demo of this new machine, at the FCPX Creative Summit last October. Among the demos was the upcoming 10.4 version of FCPX editing unrendered 8K video.

tipoo - Wednesday, December 13, 2017 - link
It already only runs at 83% of a desktop Vega 64's core speed and memory bandwidth, so hopefully it at least never throttles from there.

Flunk - Wednesday, December 13, 2017 - link
I guess this kills the chances of an updated Mac Pro that actually allows upgrades then? How long until the garbage can is discontinued?

samsonjs - Wednesday, December 13, 2017 - link
They’re making another Mac Pro. Announced in March, shipping in 2018 or 2019.

samsonjs - Wednesday, December 13, 2017 - link
According to Geekbench results, the 10-core CPU model is a W-2150B... a custom part with lower TDP?

fred666 - Wednesday, December 13, 2017 - link
All-in-one is the wrong form factor for professionals. It would have been much better as a mid-size box.

iwod - Wednesday, December 13, 2017 - link
Do all developers and designers use 27"?

Is there any reason why they can't fit the dual-heatsink system in a 21" iMac?

nerd1 - Wednesday, December 13, 2017 - link
I used to use triple 24" monitors and now using one 27" and one 34" ultrawide.cfenton - Wednesday, December 13, 2017 - link
I would think at least 27". I imagine most use as big as they can reasonably fit in their working space. More screen space you can see clearly without squinting is always valuable.Zdigital2017 - Wednesday, December 13, 2017 - link
None of our developers use less than two or three monitors in the 27"+ category. Between Xcode, Terminal.app sessions, Console.app, the (Watch/iPhone/iPad) Simulator, an e-mail client, web browser and a Git/SVN client, a single display just doesn't cut it, not even close.TEAMSWITCHER - Wednesday, December 13, 2017 - link
5K at 27" is the sweet spot for a productivity focused machine like this. I have a 5K iMac at work and absolutely love it! Retina displays of this size are an absolute joy!Old_Fogie_Late_Bloomer - Wednesday, December 13, 2017 - link
Once you go 27" with a(n effective) resolution of 2560x1440, you won't want to go back. Do I prefer 16x10 rectangles aesthetically? Yes, but that ship has sailed.hbsource - Wednesday, December 13, 2017 - link
Not really. I use the 30" Dell at 2560 x 1600. My secondary monitor is 2560 x 1440 and it's definitely inferior.The 38" LG is next on the shopping list. It's ultra wide but retains the 1600 vertical.
SkiBum1207 - Wednesday, December 13, 2017 - link
If you walk into Facebook, Google, Microsoft (sans Apple products below), or Dropbox - 99% of all developers have pretty much the following setup:

Laptop: usually an MBP or occasionally a ThinkPad
27in display: used to be a Cinema Display; now the Dell UltraSharp 27 InfinityEdges are becoming popular

27 is a nice size, economical (not an obscure size to contend with, and Intel GPUs can handle 2560x1440), and a sizable bump up from 24in.

I can't speak to designers' needs, but I'd imagine that unless the designers are doing super color-specific work (photos), much of the modern design work could be done on the above UltraSharps (vector art, web media, etc.)
Source: I used to be in the consulting industry and regularly worked with those companies.
tipoo - Wednesday, December 13, 2017 - link
The 21" 4K with a dedicated GPU has the dual heatsinks, just not the dual fans.

HStewart - Wednesday, December 13, 2017 - link
It's nice to see the Mac Pro using higher-performance Intel CPUs, but for this type of computer it really needs Quadro GPUs for graphics needs.

My last Mac was a 2011 MacBook Air - I also have a 2010 MacBook Pro with an Nvidia GPU.

It is interesting these have Xeon W and not the desktop chips - it'd be interesting to see the performance difference - my guess is the Xeon W has better IO.

peevee - Wednesday, December 13, 2017 - link
Beautiful inside, just as Steve Jobs ordered.

Memory seems to be user-upgradable.

tipoo - Wednesday, December 13, 2017 - link
No RAM door like the regular 27", but it does seem to be socketed.

tipoo - Wednesday, December 13, 2017 - link
I'd really love to see AnandTech tear into this. Giving the X to YouTubers first was meh, giving the iMac Pro to them first is a bit silly lol.

asendra - Thursday, December 14, 2017 - link
It may seem silly, but those YouTubers that got the iMacs record their videos at 8K and regularly use Mac Pros, sooo, not so silly?

ddrіver - Friday, December 15, 2017 - link
Those YouTubers bring more customers and promotion for the product than AT will anytime soon. AT is almost exclusively targeted at the US, with basically zero presence in Europe or Asia, while vloggers have a worldwide reach.

And the vast majority of AT traffic comes from searches, people looking for a review long after the novelty of the product is gone. If promotion is what you're after, a website like AT is the last place to do it.

tipoo - Wednesday, December 13, 2017 - link
The breakout slide from the 'stuff we didn't have time to show' thing they do said dual SSD modules; wonder what that's about. Just for the 4TB, 2+2? RAID or simple spillover?

lilmoe - Wednesday, December 13, 2017 - link
Yay for disposable workstations!

lmcd - Wednesday, December 13, 2017 - link
If the A-series SoC is confirmed, it could be in order to drive a potential proprietary keyboard with a similar touch-sensor top row like the MacBook Pro's (forgetting the name of it, but that thing).

nedjinski - Thursday, December 14, 2017 - link
At over $13,000 I think I would be inclined to look elsewhere - but maybe it can cook, clean, and do my yard work too? Otherwise it's just more overpriced Apple shenanigans that, of course, is cool now but will soon be relegated to the designer boneyard just like all their previous miracles.

Glaurung - Thursday, December 14, 2017 - link
"otherwise it's just more overpriced apple shenanigans"$5000 seems pretty reasonable considering what's inside.
The components of the base system add up to $4,300, and that's without the AIO case, the wireless keyboard and mouse, the OS, or any warranty:
8 core Xeon W, $1100.
Xeon system Board, $550
1 TB PCIe SSD, $625.
Radeon Vega 56: $400
32gb ecc ram: $440
5k 27" monitor: $1200
Glaurung - Thursday, December 14, 2017 - link
As usual, Apple adds some hefty margins for RAM, CPU, and GPU upgrades, but the storage upgrade options are really quite reasonable.

name99 - Thursday, December 14, 2017 - link
"While it looks like Apple is going to use standard memory modules, the iMac Pro does not seem to be user-upgradeable, unlike regular iMacs."

This is not completely true [use of "standard" memory modules]. People inside Apple have claimed that:
- many DIMMs on the market are somewhat out of spec
- this matters when you are trying your utmost to run the machine at lowest power and lowest fan noise, because you rely on the DIMM specs to calibrate how low you can go.
Assuming this to be true, it's possible that they have a reason (you decide whether it's a "good" reason or not) for not allowing 3rd-party DIMMs, simply because to do so would require the machines to burn a few percent more power and/or to run the fan a few percent faster to cover the unknown quality of those DIMMs.

Now this won't calm down the crowd that insists everything Apple does is a conspiracy, but it may clarify the issue for the more rational readers.
I can't attest to the full truth of this, but I can say that in my experience
- on one laptop to which I added third-party RAM, I needed to add a menulet that drove the fan faster than the OS wanted to drive it, because otherwise the machine was on the edge of crashing from overheating if I ran an extended period of heavy-duty computation
- when I added what was supposedly 1600MHz DDR3 to my iMac 27", the Mac downgraded the previous 1600MHz memory speed to 1333MHz.

Both of these suggest some truth to Apple workers' claims about the dodginess of 3rd-party RAM.
(And yes, I would be the first to agree that having a hardware team that is working so hard to ensure that the HW is bulletproof at the same time that the SW team is doing their best to ensure that the OS and graphics stacks crash at least once a day [certainly on older macs] is very depressing. All true, but orthogonal to issues of what the hardware is and why it is that way.)
name99 - Thursday, December 14, 2017 - link
"The company says that Mac Pro’s cooling system can cope with up to 500 W of heat, so it cannot use a 140 W CPU and a 295 W GPU in order to avoid overheating."OK, to add to the above. One problem Apple has (again, not trying to make excuses here, trying to explain the situation) is that Apple gets the reputation failure when sub-components go bad, not the sub-component manufacturer. So you can ask why does Apple not provide a 600W cooling system and run the GPU at full power?
Apple would not say so in public, but they may well believe that AMD's claims that it is OK to run the GPU at 300 W are simply BS. The graphics cards in a large number of iMacs of around the 2007 vintage went bad after about five years because of the heat they produced. AMD may be more or less correct in saying that the part can run for 5 years at 300 W, while Apple can ALSO be more or less correct in feeling that, regardless of warranty issues, they want people to feel that Macs (especially EXPENSIVE Macs) last as long as they are used, and it's bad for the brand reputation if there are a large number of stories in five or six years about how so many iMac Pros are dying because their graphics cards are going bad.

SaolDan - Friday, December 15, 2017 - link
(no word on the controller or its developer)

davide445 - Friday, December 15, 2017 - link
I don't understand exactly why anyone other than someone using only Apple software would need this thing.

Here's confirmation, with double the power, high-quality components, and the same price:
https://pcpartpicker.com/user/davide445/saved/fPTN...
stevielee - Friday, December 15, 2017 - link
davide445: Great alternative iMac Pro setup you chose there. But your particular configuration is way more powerful (CPU & GPU) than the entry-level $5K+ iMac Pro. Your setup would probably match (sans total RAM & SSD size) the top-of-the-line $10K+ iMac Pro. And with the iMac Pro's CPU & GPU both being custom low-power parts - "B" versions of the W-series Xeons, plus a custom AMD GPU underclocked to satisfy Apple's 500 W power envelope - even against the best iMac Pro you can buy at up to $15K+ (with tax and an obligatory AppleCare warranty), your Nvidia Titan will still easily outperform the iMac's downclocked Vega 64 by a relatively large margin.

And the best thing about your pcpartpicker rig is that almost all of the parts you selected can be easily upgraded as your needs and/or financial means change in the next couple of years. The iMac Pro is basically: what you purchase, you are forever stuck with - regardless of how much money you might have paid, or what your future equipment requirements may be.

Apple has done it again: cramming and throttling and soldering all of the internals to fit into a sealed (non-user-upgradable) package. They will probably have to do another mea culpa all over again in 3 years' time to try and explain to everyone who bought the now-obsolete AIOs that they just didn't foresee the hardware issues that arose from insisting that everything in Apple's ecosystem conform to their extreme anorexic design regimen.

ddrіver - Friday, December 15, 2017 - link
What's the point of your exercise? Buying parts and assembling a system will always be cheaper than buying an OEM one. Your calculation is junk anyway; I can buy it on eBay for less than $4K ;).

Do you think a company or anybody buying 5 or 10 or more of these iMacs would ever bother assembling a system by hand? Turn on your brains before you post. This BS idea that you can build it yourself and it's cheaper than the preconfigured system needs to die already. Laptops and servers fall into the same category. Maybe you're going to start wondering why companies buy HP or Dell or Cisco servers instead of assembling them with parts from Newegg.

ddrіver - Friday, December 15, 2017 - link
And BTW, if you look at offers from HP or other OEMs you'll see the same kind of prices: they start at ~$2500 for 4-core Xeons, 16GB of RAM and a 512GB SSD. And that doesn't include the 5K display and the clean design.

Whiners gonna whine.

stevielee - Friday, December 15, 2017 - link
And trollers gotta troll.

YOU are not the last word in everything tech, or what "most" others do, or want, or buy: Apple, or otherwise.

And I am unanimous in that!

Focher - Friday, December 15, 2017 - link
Sorry, but nope. You already show your hand by comparing a Xeon with an AMD CPU. I guess you just look at the number of cores and GHz to compare. That doesn't cut it. And the memory bandwidth is much lower. And the SSD bandwidth is lower.

If you don't understand what the iMac Pro was designed for, and what it can do, you'll just continue to post such ridiculous comparisons.