Last year Lucidlogix came to us with a rather amazing claim: we can do multi-GPU better than the guys who make the video cards in the first place. Through their Hydra technology, Lucid could intercept OpenGL and DirectX API calls, redistribute the objects being rendered across multiple video cards, and then combine the results into a single video frame. This could be done with dissimilar cards from the same company, or even cards from different companies altogether. It would be multi-GPU rendering, but not as you currently know it.
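To make the idea concrete, the intercept-distribute-composite scheme described above can be sketched as a toy model. This is not Lucid's implementation; the `DrawCall` and `Gpu` classes, the greedy load balancer, and the set-based "frame" are all illustrative assumptions standing in for real API interception and image composition:

```python
# Toy model of a Hydra-style interposer: draw calls are intercepted,
# distributed across two dissimilar GPUs by estimated cost, and the
# partial results are composited into one frame. All names here are
# hypothetical; real hardware composites pixels, not object IDs.

from dataclasses import dataclass

@dataclass
class DrawCall:
    object_id: int
    cost: int  # estimated rendering cost of this object

class Gpu:
    def __init__(self, name):
        self.name = name
        self.queue = []
        self.load = 0

    def submit(self, call):
        self.queue.append(call)
        self.load += call.cost

    def render(self):
        # Each GPU renders only the objects assigned to it,
        # producing a partial frame (modeled as a set of object IDs).
        return {c.object_id for c in self.queue}

def distribute(calls, gpus):
    # Greedy balancing: send each call to the least-loaded GPU.
    for call in calls:
        min(gpus, key=lambda g: g.load).submit(call)

def composite(partials):
    # Combine the partial frames from all GPUs into the final frame.
    frame = set()
    for p in partials:
        frame |= p
    return frame

calls = [DrawCall(i, cost) for i, cost in enumerate([5, 3, 8, 2, 7, 4])]
gpus = [Gpu("GeForce"), Gpu("Radeon")]  # dissimilar cards, per the claim
distribute(calls, gpus)
frame = composite(g.render() for g in gpus)
assert frame == {0, 1, 2, 3, 4, 5}  # every object lands in the final frame
```

The interesting design question, and the one Lucid's drivers have to answer per-game, is the cost estimate: split the work badly and one card idles while the other becomes the bottleneck.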
That was in August of 2008, when the company was first showcasing its technology in hopes of finding a suitor. In 2009 it found that suitor in MSI, which is anchoring its new high-end Big Bang line of motherboards with the Hydra. After some bumps along the way, Lucid and MSI are finally ready to launch the first Hydra-equipped board: the Big Bang Fuzion.
We’ve had the Fuzion in our hands for over a month now, as the hardware was ready well ahead of the software. Lucid has continued to develop the software side, and the two parties are finally ready to sign off on the finished product, although Hydra is still very much a work in progress.
The Big Bang Trinergy, the Fuzion's identical twin
As we’re currently in Las Vegas for CES (where MSI is launching the Fuzion), today we’ll be taking a quick look at the performance and compatibility of the Hydra, to answer the most burning of questions about the technology. Once we’re back from CES, we will be following that up with an in-depth look at image quality, edge cases, and other deeper issues. We’ve only had the newest drivers for a few days now, so we haven’t had a chance to give it a complete workover.
Finally, this is just a look at the Hydra technology itself. We’ll have a separate review of the Fuzion as a motherboard at a later time. However, it’s virtually identical to MSI’s other Big Bang board, the NVIDIA NF200-equipped Trinergy. The only significant difference between the boards is that the Fuzion has the Hydra chip, while the Trinergy has the NF200.
With that out of the way, let’s get started.
47 Comments
cesthree - Friday, January 8, 2010 - link
Multi-GPU gaming already suffers from drivers that suck. You want the < 3% who actually run multi-GPU setups to throw Hydra driver issues into the mix? That doesn't sound appealing at all, even if I had thousands to throw at the hardware. Fastest single GPU. 'Nuff said.
Although if Lucid can do this, then maybe ATI and Nvidia will get off their dead-bums and fix their drivers already.
Makaveli - Thursday, January 7, 2010 - link
The major fail is most of the posts on this article. It's very early silicon with beta drivers, and most of you expect it to be beating CrossFire and SLI by 30%, when the big guys have had years to tune their drivers and they own the hardware. I would like to see where this is by next Christmas before I pass judgement. Just because you don't see it in front of your face doesn't mean the potential isn't there. Sometimes a little faith will go a long way.
prophet001 - Friday, January 8, 2010 - link
i agree

Hardin - Thursday, January 7, 2010 - link
It's a shame the results don't look as promising as we had hoped. Maybe it's just early driver issues, but it looks like it's too expensive and no better than CrossFire as it is. It doesn't even have DX11 support yet, and who knows when they will add it.

Jovec - Thursday, January 7, 2010 - link
With these numbers, I wonder why they allowed them to be posted. They had to know they were getting much worse results with their chips than XF, and the negative publicity isn't going to do them any good. I suppose they didn't want to have another backroom showing, but that doesn't mean they should show at this stage.

jnmfox - Thursday, January 7, 2010 - link
As has been stated, the technology is unimpressive; hopefully they can get things fixed. I am just happy to see one of the best RTS games ever made in the benchmarks again. CoH should always be part of AnandTech's reviews; then I wouldn't need to go to other sites for video card reviews :P.

IKeelU - Thursday, January 7, 2010 - link
I was actually hoping AMD would buy this tech and integrate it into their cards/chipsets. Or maybe Intel. As it stands, we have a small company, developing a supposedly GPU-agnostic "graphics helper," that is attempting to supplant what the big players are already doing with proprietary tech. They need support from mobo manufacturers and cooperation from GPU vendors (who have little incentive to help at the moment due to the desire to lock users in to proprietary stuff). I really, really want the Hydra to be a success, but the situation is a recipe for failure.

nafhan - Friday, January 8, 2010 - link
That's the same thing I was thinking through the whole article. The market they are going after is small, very demanding, and completely dependent on other companies. The tech is good, but I have a hard time believing they will ever have the resources to implement it properly. Best case scenario (IMO): AMD buys them once they go bankrupt in a year or so, keeps all the engineers, and integrates the tech into their enthusiast NB/SB.

krneki457 - Thursday, January 7, 2010 - link
Anand, couldn't you use a GTX 295 to get approximate GTX 280 SLI figures? I read that Hydra doesn't work with dual-GPU cards, but couldn't you disable Hydra? You mentioned in the article that this is possible.

As for the technology itself, as a lot of comments have already mentioned, I really don't see much use for it. Even if it worked properly, it would be more at home in low- to mid-range motherboards.
Ryan Smith - Thursday, January 7, 2010 - link
I'm going to be seriously looking at using hacked drivers to get SLI results. There are a few ways to add SLI to boards that don't officially support it. It's not the most scientific thing, but it may work to bend the rules this once.