"with different workload complexities depending on wether you're running it on desktop platforms, or mobile devices" Pretty much assures an apples to oranges comparison. The benchmark should be identical regardless of platform.
You are looking at hardware with a complete order of magnitude power budget difference, however. What aligns really well is completely useless for the other, and something in the middle serves neither.
Right. I don't see much benefit in comparing a laptop and a desktop in this manner. You already know the laptop is going to perform worse. What is much more useful is how one laptop compares to another or how one desktop GPU compares with another.
Many people have the opinion that mobile processors/gpus have gotten to the point that they can challenge desktop processors and desktop gpus. Maybe that's the case and maybe it's not - I'd bank on the latter for the same reasons you stated (power budget). The benchmark should clearly tell what's what so people can see the performance differences between desktop and mobile platforms.
Reread the whole paragraph. The more demanding workload is for comparing desktop GPUs against one another. Meanwhile you can set the desktop version to run the exact same workload as mobile. Different APIs can still be a factor, but I'd bet this benchmark is still going to be better than any of the other cross-platform graphics benches out there.
If there are SoCs out there utilizing their higher-end configurations, yes. Personally I can't think of any using their current best now that Apple went in-house.
Not very likely. After going to all the cost and trouble of designing their own GPUs, why on earth would they ever switch back?
If you look at how long AMD has been nursing GCN or how long Intel has been milking their HD Graphics (and why is it both those metaphors involve lactation?), GPU architectures tend to have a pretty long shelf-life. So, if Apple's is any good, they'll probably be using some derivative of it for at least 5 years.
No they're not... though to be fair that's mostly an indication of how bad Intel's iGPUs are for the die space they consume. That may change though in a year or two if they're serious about uh, getting serious.
I maintain that they are - you just haven't been following recent smartphone GPU developments. There's little daylight in the specs of the latest Mali or Adreno and that of Intel's UHD 630.
The irony of having this debate in this very thread is that Basemark GPU *should* make it very easy to settle. Unfortunately, their results browser doesn't let you see results for the same settings between PC and Phone. Maybe Tero @ Basemark can fix this?
I don't believe we can ever have a bench mark that can truly for all device. It hard enough on the PC include PC Mobile platforms - but to include smart phones with smaller screens and ARM processors.
Just take x86 CPU's alone 1. Core difference between vendors and versions can be huge - especially with advance operations like AVX2 and AVX512 2. Differences of how chip is designed - does not mean that same chip frequency equals same power 3. Number of cores - does not always means it faster - it depends on how the core performs.
GPU's have similar types of difference - but a big concern - because the architecture is not base on the same common base as with x86 based cpus.
Then throwing ARM based smartphones and tablets in mix - with totally different designed and purpose messes up tests a lot.
Then there the human factor of the developer writing the tests - are they possibly bias toward a product and not completely taking advantage of a certain product. For example this can be done just by mere factor of trying to make a test so generic across platforms that it does not take full advantage of specific platform.
I see two types of tests out there
1. tests based on generic applications, games and such - which subject to speculation based on each implementation. For example, it will be extremely hard for some one tell me that ARM based CPU can run 3DMax faster than x86 workstation CPU or even x86 Tablet.
2. Test base on sheer computer power of machine. Test like Spec are the close as we get to this - tests using .net or Java are subject to speculation depending on the runtime used. I believe the ideal performance benchmark - would be done in assembly from the manufacture and all designed to specific process as fast as possible.
3. having multiple cores in the mix - actually complicates things a lot - as a software developer for many years - multi-core performance can be difficult thing to measure and implemented because in essence the screen is still single threaded.
Hi, it's Tero @ Basemark. One can run the benchmark with identical complexities and workloads in both desktops and smartphones. The higher complexity is meant to adequately stress a desktop but you can also run the less complex workload with a desktop.
I hope you guys are planning on eventually compiling the Windows version for ARM64. Maybe when you release the DX12 update. It would be nice to make a native DX12-to-DX12 comparison on the same OS, but it only makes sense if you've got native versions for both platforms. Even better if they used the same compiler.
Many people get paid for their good ideas. Instead I get to crack a joke about it, and get criticized by some random hotdog on the internet (free bonus!!). So when you see tests of a Snapdragon running Windows on ARM with a natively compiled ARM64 version of this benchmark, you can feel free to thank me.
Oh, and FYI, it's to THEIR benefit to show up in these threads to DEFEND their product. If you look at the OP he was replying to, it was a negative (and uninformed) post and he was attempting to cancel it out by setting the record straight. They wouldn't do it if there was no benefit to them. The fact that it is mutually beneficial is just a nice bonus.
I like the demo mode - just wish it were longer. I have no problem paying like $5 or so for something like the 3D Mark installments.
BTW, it'd be nice if it said how much space the textures used.
Finally, I don't know if you guys do this (haven't submitted mine, yet), but it might be interesting to try and log temperature data, along with people's benchmark scores.
What kind of scores are people getting for desktop GPUs? There are no results listed on their website, if you can get to it. I got a 1600 for an R9 290 on their standard test, which is significantly lower than their scores for cell phones. Default scores are more than a little misleading. And super buggy on both Windows 10 and Android.
JackTheBear - Wednesday, June 20, 2018 - link
"with different workload complexities depending on wether you're running it on desktop platforms, or mobile devices"Pretty much assures an apples to oranges comparison. The benchmark should be identical regardless of platform.
Ian Cutress - Wednesday, June 20, 2018 - link
You are looking at hardware with a full order of magnitude of difference in power budget, however. A workload that aligns really well with one is completely useless for the other, and something in the middle serves neither.

jordanclock - Wednesday, June 20, 2018 - link
Right. I don't see much benefit in comparing a laptop and a desktop in this manner. You already know the laptop is going to perform worse. What is much more useful is how one laptop compares to another or how one desktop GPU compares with another.

JackTheBear - Wednesday, June 20, 2018 - link
Many people have the opinion that mobile processors/GPUs have gotten to the point that they can challenge desktop processors and desktop GPUs. Maybe that's the case and maybe it's not - I'd bank on the latter for the same reasons you stated (power budget). The benchmark should clearly tell what's what so people can see the performance differences between desktop and mobile platforms.

Alexvrb - Wednesday, June 20, 2018 - link
Reread the whole paragraph. The more demanding workload is for comparing desktop GPUs against one another. Meanwhile you can set the desktop version to run the exact same workload as mobile. Different APIs can still be a factor, but I'd bet this benchmark is still going to be better than any of the other cross-platform graphics benches out there.

lucam - Thursday, June 21, 2018 - link
Do you think PowerVR will rock this bench?

Alexvrb - Thursday, June 21, 2018 - link
If there are SoCs out there utilizing their higher-end configurations, yes. Personally I can't think of any using their current best now that Apple went in-house.

lucam - Friday, June 22, 2018 - link
Very true and sad... I hope Apple will still consider PowerVR in the future, though.

mode_13h - Sunday, June 24, 2018 - link
Not very likely. After going to all the cost and trouble of designing their own GPUs, why on earth would they ever switch back?

If you look at how long AMD has been nursing GCN or how long Intel has been milking their HD Graphics (and why is it that both those metaphors involve lactation?), GPU architectures tend to have a pretty long shelf life. So, if Apple's is any good, they'll probably be using some derivative of it for at least 5 years.
mode_13h - Thursday, June 21, 2018 - link
Eh, high-end mobile GPUs and Intel iGPUs are actually not so far apart.

Alexvrb - Thursday, June 21, 2018 - link
No they're not... though to be fair that's mostly an indication of how bad Intel's iGPUs are for the die space they consume. That may change though in a year or two if they're serious about uh, getting serious.

mode_13h - Friday, June 22, 2018 - link
I maintain that they are - you just haven't been following recent smartphone GPU developments. There's little daylight between the specs of the latest Mali or Adreno and those of Intel's UHD 630.

The irony of having this debate in this very thread is that Basemark GPU *should* make it very easy to settle. Unfortunately, their results browser doesn't let you see results for the same settings between PC and phone. Maybe Tero @ Basemark can fix this?
https://powerboard.basemark.com/home
Alexvrb - Monday, June 25, 2018 - link
You said "They are actually not so far apart"I said "No, they're not."
As in, I agree with you. They're NOT (so far apart).
mode_13h - Tuesday, June 26, 2018 - link
Cool, thanks. I *do* hope we see improvements to their comparison tool, soon.
FreckledTrout - Wednesday, June 20, 2018 - link
I could see a place for both, but really, if you are going to deviate by platform you must keep a common benchmark that runs across platforms.

HStewart - Wednesday, June 20, 2018 - link
I don't believe we can ever have a benchmark that truly works for all devices. It's hard enough on the PC to include PC mobile platforms, let alone smartphones with smaller screens and ARM processors.

Just take x86 CPUs alone:
1. Core differences between vendors and versions can be huge - especially with advanced operations like AVX2 and AVX-512 (see the feature-detection sketch after this list).
2. Chip designs differ - the same clock frequency does not equal the same performance.
3. A higher number of cores does not always mean faster - it depends on how each core performs.
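To illustrate point 1 above (a generic sketch, not Basemark's code): a cross-platform benchmark typically has to probe the CPU it lands on and dispatch to a matching kernel, so two chips at the same clock can end up running quite different code. A minimal check using GCC/Clang builtins on x86 might look like this:

```cpp
// Minimal sketch, assuming GCC or Clang on an x86 CPU; not from Basemark.
// The same benchmark binary can pick very different code paths depending
// on which vector extensions the CPU actually reports.
#include <cstdio>

int main() {
    const bool avx2    = __builtin_cpu_supports("avx2");
    const bool avx512f = __builtin_cpu_supports("avx512f");

    std::printf("AVX2:     %s\n", avx2    ? "yes" : "no");
    std::printf("AVX-512F: %s\n", avx512f ? "yes" : "no");

    // A real benchmark would dispatch here: an AVX-512 kernel, an AVX2
    // kernel, or a plain scalar fallback, each with different throughput.
    return 0;
}
```

Two machines at the same frequency can answer differently here, which is part of why "same clock" does not mean "same performance".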
GPUs have similar kinds of differences - but they're a bigger concern, because GPU architectures don't share a common base the way x86-based CPUs do.
Then throwing ARM-based smartphones and tablets into the mix - with totally different designs and purposes - complicates the tests a lot.
Then there's the human factor of the developer writing the tests - are they possibly biased toward one product, or not taking full advantage of another? For example, this can happen simply by trying to make a test so generic across platforms that it doesn't take full advantage of any specific platform.
I see two types of tests out there:
1. Tests based on generic applications, games and such - which are subject to interpretation based on each implementation. For example, it will be extremely hard for someone to tell me that an ARM-based CPU can run 3DMax faster than an x86 workstation CPU, or even an x86 tablet.
2. Tests based on the sheer compute power of the machine. Tests like SPEC are the closest we get to this - tests using .NET or Java are open to debate depending on the runtime used. I believe the ideal performance benchmark would be written in assembly by the manufacturer, designed to run each specific task as fast as possible.
3. Having multiple cores in the mix actually complicates things a lot. Speaking as a software developer of many years, multi-core performance can be a difficult thing to measure and implement, because in essence the screen is still single-threaded (see the sketch below).
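As a rough illustration of that last point (again a generic sketch with made-up thread counts and timings, not anything from Basemark): however many worker threads prepare data in parallel, a single presentation loop ultimately submits the frames, so the measured frame rate is capped by that one thread.

```cpp
// Minimal sketch of the "screen is still single-threaded" point.
#include <algorithm>
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    std::atomic<bool> stop{false};
    std::atomic<long> unitsPrepared{0};

    // Worker threads simulate parallel CPU-side scene preparation.
    std::vector<std::thread> workers;
    const unsigned n = std::max(1u, std::thread::hardware_concurrency());
    for (unsigned i = 0; i < n; ++i) {
        workers.emplace_back([&] {
            while (!stop.load()) {
                std::this_thread::sleep_for(std::chrono::milliseconds(2)); // fake work
                unitsPrepared.fetch_add(1);
            }
        });
    }

    // A single "present" loop submits one frame per iteration (~60 Hz here),
    // so adding more workers raises unitsPrepared but not framesPresented.
    long framesPresented = 0;
    const auto end = std::chrono::steady_clock::now() + std::chrono::seconds(2);
    while (std::chrono::steady_clock::now() < end) {
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
        ++framesPresented;
    }

    stop = true;
    for (auto& t : workers) t.join();
    std::printf("work units prepared: %ld, frames presented: %ld\n",
                unitsPrepared.load(), framesPresented);
}
```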
Tero @ Basemark - Wednesday, June 20, 2018 - link
Hi, it's Tero @ Basemark. One can run the benchmark with identical complexities and workloads on both desktops and smartphones. The higher complexity is meant to adequately stress a desktop, but you can also run the less complex workload on a desktop.

Alexvrb - Wednesday, June 20, 2018 - link
I hope you guys are planning on eventually compiling the Windows version for ARM64. Maybe when you release the DX12 update. It would be nice to make a native DX12-to-DX12 comparison on the same OS, but it only makes sense if you've got native versions for both platforms. Even better if they used the same compiler.

Tero @ Basemark - Thursday, June 21, 2018 - link
Thanks, that's a good idea - our product team is now reviewing it for the next release.

Alexvrb - Thursday, June 21, 2018 - link
…I accept cash, PayPal, and graphics cards. :D

mode_13h - Friday, June 22, 2018 - link
For a (mostly) *free* benchmark? Wow... We're lucky even to get this, not to mention having them add the features you want. And you expect payment *on top* of THAT?
mode_13h - Friday, June 22, 2018 - link
P.S. I know you were joking, but I just think it's in poor taste. I appreciate merely having them chat with us in this thread.
Alexvrb - Monday, June 25, 2018 - link
Many people get paid for their good ideas. Instead I get to crack a joke about it, and get criticized by some random hotdog on the internet (free bonus!!). So when you see tests of a Snapdragon running Windows on ARM with a natively compiled ARM64 version of this benchmark, you can feel free to thank me.

Oh, and FYI, it's to THEIR benefit to show up in these threads to DEFEND their product. If you look at the OP he was replying to, it was a negative (and uninformed) post and he was attempting to cancel it out by setting the record straight. They wouldn't do it if there was no benefit to them. The fact that it is mutually beneficial is just a nice bonus.
mode_13h - Tuesday, June 26, 2018 - link
Eh, for the record I'm not a hotdog. Just a dog.mode_13h - Thursday, June 21, 2018 - link
Thanks for the cool benchmark!

I like the demo mode - just wish it were longer. I have no problem paying like $5 or so for something like the 3DMark installments.
BTW, it'd be nice if it said how much space the textures used.
Finally, I don't know if you guys do this (haven't submitted mine, yet), but it might be interesting to try and log temperature data, along with people's benchmark scores.
Tero @ Basemark - Thursday, June 21, 2018 - link
Thanks! :) I have forwarded your feedback to the product team and they will consider adding these for the next release.mode_13h - Friday, June 22, 2018 - link
I note that you guys are Finnish. Not that it should matter, but having grown up on PC demos, I'm always glad to see that legacy live on.
This app wrongly thinks my Snapdragon 845 S9+ is using an Exynos SoC, and that I have an EU phone, when I have a HK import.

Tero @ Basemark - Thursday, June 21, 2018 - link
Thanks for informing us, we're investigating this.

Tero @ Basemark - Thursday, June 21, 2018 - link
Is your phone Samsung G9650?

JackTheBear - Thursday, June 21, 2018 - link
What kind of scores are people getting for desktop GPUs? There are no results listed on their website, if you can get to it. I got a 1600 for an R9 290 on their standard test, which is significantly lower than their scores for cell phones. Default scores are more than a little misleading. And super buggy on both Windows 10 and Android.

El Sama - Thursday, June 21, 2018 - link
The rumor circulating is that this benchmark heavily favors Nvidia - maybe they optimized for Nvidia first, or maybe Nvidia sponsored Basemark heavily?