Why are there no screens with an integrated PCIe graphics enclosure on the market? I would imagine this to be a great combination, allowing easy docking of laptops with a single cable. For all those who are burdened with laptops, this would be enough to not have them get a desktop as well, for almost all use cases (given a sufficiently powerful CPU that can keep itself cool). I don't think this would add significant cost to the already expensive Thunderbolt displays, besides a higher-spec power supply, some circuitry, and an ugly case on the rear. Of course, as long as Apple is the ambassador of TB, this will not happen.
Thunderbolt is a big deal for Macs because you can't put anything inside an Apple computer; it's the only high-performance connection available.
Thunderbolt has been out for 3 years now, and it's going nowhere fast in the PC world. Intel would have had to integrate it into every Z77 and newer chipset to drive adoption.
It is hard to justify connecting your drives using a Thunderbolt enclosure when the internal connections are both free and faster. Thunderbolt has more latency than the built-in SATA ports. Intel needs to visit Redmond and smack them around some. But then Intel carries much of the blame themselves. The Thunderbolt chip costs 10 bucks in bulk? I could buy a really nice hardware RAID card for what the Thunderbolt enclosure I am going to buy costs, if only I could put a card in my iMac. But I have no choice.
It doesn't matter how cool the hardware is if there is no software to make it work. It doesn't matter how cool the hardware is if people can't afford it.
Right now I am looking at buying a Thunderbolt enclosure for 460 dollars because I love the Apple operating system that much. But I could build most of a PC for that money. And it doesn't come with a single HDD. And the saddest part is that no matter what, the performance will never match simply connecting the SSD I want to use to a SATA port on a motherboard. Thunderbolt doesn't have anything to offer to justify the cost; after three years, I think it's obvious that Thunderbolt has already gone the way FireWire has. I find myself asking why 10 Gigabit Ethernet has to be so expensive, because if those interfaces were affordable, a SAN would be a good alternative.
The following in this article is a blatant LIE: "However, the drivers have been updated to enable 'Thunderbolt Networking'. This involves linking multiple PCs / Macs with Thunderbolt cables. A 10 Gbps network is automatically created (in the form of a 'dummy' network adapter)."
This is not true. If you connect 2 PCs with Thunderbolt, NOTHING HAPPENS! Thunderbolt on PC is only good for certain storage devices. Please do not post lies like this.
ydeer - Sunday, September 14, 2014 - link
With regards to Thunderbolt, could you please do an article on external GPUs and the reasons why this technology is not taking off? (For example, because of Intel not licensing the working solutions from SilverStone.)
willis936 - Sunday, September 14, 2014 - link
There are a whole lot of OSI layers there, and bridging them costs a whole lot of money and a whole lot of performance. No one wants to spend the money to do it properly because the business people tell them it will lose the company money.
bernstein - Sunday, September 14, 2014 - link
No, that's total bullshit. I have it working here without any issues: a GTX 780 in a Sonnet Echo Express III-D, and I still get ~90% performance. The sad part of the story is that Intel has limited certification resources and prioritizes expensive manufacturers. From a Chinese vendor targeting consumers, the cost would be a tenth of that. But that's just not what Intel wants.
ganeshts - Sunday, September 14, 2014 - link
Working in one situation doesn't mean it will work everywhere. Have you tried:
1. Having the external GPU as part of a Thunderbolt chain?
2. Hot plug tests with frequent connect and disconnect of the GPU enclosure?
3. Ensuring the GPU enclosure works when the PC comes out of different power states?
Certification means that the TB device continues to operate (or fails gracefully) under these circumstances. The initial promise of TB was that it would work great under all these circumstances (and TB-certified peripherals do so). It wouldn't be fair for Intel to backtrack on those promises now. I am not defending Intel here - after all, driving up adoption would be ultimately beneficial to them. We have to look at the hard reality before pinning the blame completely on them.
I was pretty upset with Intel initially too when building the TB Windows testbed and discovering that I couldn't hang the AIC off the CPU's PCIe lanes for the best performance. After talking at IDF, I can see why they did that. The good thing is that these issues are getting resolved one by one. Eventually, we will see external GPU solutions pass certification too.
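To make the certification bar concrete: a soak test over those scenarios is essentially a loop of plug/unplug and sleep/wake events with an enumeration check after each one. A minimal sketch in Python - every helper here (enumerate_gpu, the event triggers) is a hypothetical stand-in for real lab fixtures and vendor tooling:
```python
# Hypothetical certification-style soak test for an eGPU enclosure.
# enumerate_gpu() and the event hooks are stand-ins for real lab tooling.
import random
import time

def enumerate_gpu() -> bool:
    """Stand-in check: does the OS still see the eGPU after an event?"""
    return random.random() > 0.01  # simulate a 1% per-event failure rate

def soak_test(cycles: int) -> int:
    failures = 0
    for i in range(cycles):
        event = random.choice(["hot_unplug_replug", "suspend_resume",
                               "chain_reorder"])
        # A real rig would trigger the event on hardware here.
        time.sleep(0.001)  # placeholder for device settle time
        if not enumerate_gpu():
            failures += 1
            print(f"cycle {i}: GPU lost after {event}")
    return failures

if __name__ == "__main__":
    print(f"{soak_test(200)} failures in 200 cycles")
```
A certified device has to come through a loop like this clean (or fail gracefully) in every position of the chain, which is a much higher bar than "works on my desk".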
TerdFerguson - Sunday, September 14, 2014 - link
100% agree, Bernstein. Intel is more interested in power plays with their licensing schemes than bringing useful products to bear - which is exactly why the industry has chosen to stick with USB.
willis936 - Sunday, September 14, 2014 - link
USB doesn't displace Thunderbolt in any world. In a perfect world they coexist, with your external storage and peripherals talking over USB and your fancy monitors, networking, and external GPUs over Thunderbolt. They talk to the CPU differently, their costs are different, and their throughput is different.
thewhat - Monday, September 15, 2014 - link
Maybe because pretty much nobody really wants external GPUs? Let's see:
1. Significantly more expensive.
2. Slower than actual PCIe.
3. The main use case (gaming on a laptop) would still be a sub-par experience, unless you add other external peripherals, like monitor, mouse and keyboard. And at that point you end up with a messy desk full of cables, throwing the simplicity of a laptop out of the window.
And that's assuming external TB GPUs would just work seamlessly, which is not a given at this point either.
I would never recommend something like this over a standard desktop PC for a hardcore gamer. For everyone else, the internal laptop GPUs are good enough anyway.
YoshoMasaki - Monday, September 15, 2014 - link
Alright, call me a niche case, but I'd love to have a moderately performing external GPU for my desktop *and* laptop. I don't need the heat, noise, and high electricity use of a GPU except when I game - yes, casually - but I would still like to use decent settings when I do so. iGPUs are getting better, but a dGPU over TB would be great ... if only the ports could be added with a standard PCIe card like, oh, every other interface known to humankind.
Relaxe - Monday, September 15, 2014 - link
I would like to add one very obvious case: professional cards on laptops.
Take a Lenovo W5xx, a useful laptop with a puny Quadro K2000 in it. It's fine for traveling around and doing SolidWorks presentations for clients, but it lacks the punch of a good Quadro K4000.
With an eGPU, you can stick a $1000 card on the desk and get the fluidity you need to develop. Those cards will get all the bandwidth they require from a TB cable. Maybe it's too much of a niche, but we would certainly get a couple of those in my office!
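For scale, a back-of-envelope link budget (raw signaling rates only; PCIe tunneling overhead would shave off a bit more) shows what an eGPU actually gets to work with - and why viewport and gaming loads, which are rarely bus-bound, can still feel fine over the narrower link:
```python
# Rough link-budget comparison: desktop slot vs. Thunderbolt.
GBPS = 1e9

links = {
    "PCIe 3.0 x16":  16 * 8 * GBPS * (128 / 130),  # ~126 Gbit/s payload
    "PCIe 2.0 x4":    4 * 5 * GBPS * (8 / 10),     # ~16 Gbit/s payload
    "Thunderbolt 2": 20 * GBPS,                    # both channels bonded
}
for name, bits in links.items():
    print(f"{name:14s} {bits / 8 / 1e9:6.2f} GB/s")
```
Roughly 2.5 GB/s versus 15.75 GB/s: a 6x gap to a desktop x16 slot, yet consistent with the ~90% performance figures reported above.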
ingwe - Monday, September 15, 2014 - link
I see that as the big draw. If I want to do some serious computation for work, I want a high-powered card. But what if I also need to travel? To me, external GPUs with a lightweight laptop would solve this problem for professionals (who would also be willing to put out the money for it). I would foresee that having three GPUs could be a problem (I don't think Intel GPUs can run SolidWorks, but I could be wrong). Of course, I also want Intel to have a "pro" designation for SolidWorks so that power consumption is reduced and it becomes a two-GPU solution.
Bruce Allen - Sunday, September 14, 2014 - link
Thank you for this article! Yes please - I would love it if someone with access to Intel asked them the hard questions re: Thunderbolt and graphics cards. If there were Thunderbolt external GPU options, there would be much higher adoption of Thunderbolt on PC (think of all of the laptop gamer folks). But Intel wants to hurt NVIDIA / AMD, so this is all dead in the water just because Intel wants to fight the GPU manufacturers and not serve customers, right? Also, why are there so many single-Thunderbolt-port devices? It's a serious pain that things like sound cards (e.g. MOTU's new Thunderbolt line) cannot be daisy-chained. If Intel cares about a quality consumer experience and wants to restrict Thunderbolt licensing for this reason (e.g. their excuse for not allowing GPUs), why do they license devices that you can't daisy-chain?
ganeshts - Sunday, September 14, 2014 - link
Can you give me an example of a device that can't be daisy-chained? AFAIK, every TB certified device can be a part of a daisy chain. What certification doesn't guarantee is whether the device can be anywhere in the chain. For example, a storage device with only one TB port can be part of a daisy chain if it is connected to a monitor's TB daisy chain port instead of the computer's port directly.
danjw - Sunday, September 14, 2014 - link
No. Unless Intel integrates Thunderbolt into its PCHs or CPUs, it will go the way of FireWire. It is currently a very expensive niche product. Unless Intel is willing to get the cost to be much more consumer-friendly, it won't ever hit the mainstream.
jeffkibuule - Sunday, September 14, 2014 - link
Thunderbolt will never be mainstream, as USB 3.0/3.1 already solves any potential issues the average person has with peripheral connectivity. Intel is currently battling over whether Thunderbolt will be relevant to anyone but Mac users, and if that becomes the case, it will remain only in high-priced peripherals for Mac users which the much larger PC market will simply ignore. And for Intel, that means fewer overall Thunderbolt chips sold and yet another boondoggle similar to FireWire.
hpglow - Sunday, September 14, 2014 - link
Two techs largely controlled by Apple. If Intel wanted TB to be mainstream, it should have kept Apple out of the fold. Second, they should not have allowed their partner (Apple) to name another product almost the same thing. Most non-techies don't even know what TB is; however, even amongst those who do, I often see and hear people confuse aspects of TB with the Lightning connector. Why can you call something with USB 2.0 speeds "Lightning" and get away with it?
Does TB have good prospects? Yes, I would love to get rid of the mess of cables on my desk. But Intel needs to hand it out more freely or it's not going anywhere. Then they need to call it something that sounds like a standard and not a women's roller derby team mascot. Eventually someone else will one-up it with PCIe over DP cabling (both semi-open standards), and TB will become another relic like Betamax.
danjw - Sunday, September 14, 2014 - link
My understanding is that Apple went to Intel and asked them to develop a new interface to combine displays and peripherals on a single bus. So Intel keeping Apple out of it wasn't feasible. But it is Intel carrying the ball now, so it is up to them to make it a success.
Intel made a really silly choice to use a daisy-chain topology. This adds to the cost of every device, with the requirement to have two connectors. I think hubs or switches would have been a better choice. Cable pricing is an issue; some cost reduction would come from greater adoption with economies of scale. And yes, Intel letting others develop controllers would also reduce costs.
ganeshts - Sunday, September 14, 2014 - link
Not all TB certified peripherals have two TB ports. It is optional. You can have one TB port and make a device the endpoint of a daisy chain.
As for USB 3.x vs Thunderbolt - they target different markets. In a way, we can term USB 3.x the poor man's Thunderbolt. Both interfaces can easily co-exist. USB 3.x will obviously have more adoption due to multiple vendors standing behind it.
bernstein - Sunday, September 14, 2014 - link
Exactly. At some point Intel will either have to integrate it into its PCH, make it mandatory for some sort of marketing branding (like it forced its WLAN radios on everyone with its Centrino branding), or accept it being another high-end interface facing extinction...
Or it might just be able to hang on, since we're now down to PCIe & USB for high-speed generic interconnects. BaseT Ethernet has now been overtaken by WLAN, so its consumer use will rapidly dwindle...
StormyParis - Sunday, September 14, 2014 - link
So Intel doesn't think... price... is an issue?
willis936 - Sunday, September 14, 2014 - link
As evidenced by the majority of their SKUs picking up in price where AMD leaves off while still being the biggest x86 manufacturer.
Murloc - Sunday, September 14, 2014 - link
People nerdy enough to know that their specific computer needs a better GPU to play a game are just a minority of the total, and they usually do not play on a laptop. That market is too small; it's probably not even worth it for Intel if they have at least one decent reason not to do it (like hurting NVIDIA/AMD).
Essence_of_War - Sunday, September 14, 2014 - link
What is the status of TB support for Linux distros, BSDs, etc.? Is it similar to what you mentioned for Windows, since they don't use custom hardware like Apple?
I'm interested in that too! :)
tuxRoller - Sunday, September 14, 2014 - link
Not ready yet. MG has said as much.
MartinT - Sunday, September 14, 2014 - link
"But, we already know it is not going to be the case till Skylake launches."If Skylake goes the way of Broadwell, we won't have to worry if Thunderbolt can hang off the CPU for simple lack of PCIe on there.
r3loaded - Sunday, September 14, 2014 - link
Price is still Thunderbolt's biggest issue. When cables cost $50, docks cost $200 and DAS costs a few hundred, consumers won't be interested. Especially when USB 3.0 already does 5Gbps and 3.1 will hit 10Gbps.
dgingeri - Sunday, September 14, 2014 - link
Two big problems with Thunderbolt on PCs:
1. It takes 4 PCIe lanes from the chipset, which only has 6 or 8, depending on the motherboard maker's design decisions. This significantly reduces what else can be done with the motherboard. If it is on a socket 1150 board, which is all there is so far, the Haswell architecture, with only 16 PCIe lanes from the CPU, restricts it even more. It's like they're trying to force an mATX spec, which doesn't work for most PC enthusiasts.
2. It's geared toward Intel's integrated graphics, which sucks. Sure, it can be done with an external GPU, but with the restrictions in the previous item, even that is difficult to do.
There are three major things they need to do to ramp up adoption: put 4 more PCIe lanes on the mainstream chipset, make a passthrough for external graphics available, and make it available on Haswell-E. It won't see adoption on PCs without these.
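To put numbers on the lane squeeze in point 1 (a sketch using the common configuration; board vendors carve the lanes up differently):
```python
# Lane budget on a typical Haswell (LGA1150) board.
chipset_lanes = 8   # PCIe 2.0 lanes off an 8-series PCH
tb_controller = 4   # a Falcon Ridge Thunderbolt controller wants an x4 link
print(f"PCH lanes left for NICs, extra SATA/USB controllers, M.2: "
      f"{chipset_lanes - tb_controller}")
```
Half the chipset's connectivity gone to one port is a hard sell on an enthusiast board.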
SirKnobsworth - Monday, September 15, 2014 - link
The first issue will be more or less solved in the future - Thunderbolt 3 will be allowed to hang off of an x2 interface, and the PCHs paired with Skylake have more PCIe lanes too.
The Hardcard - Sunday, September 14, 2014 - link
Why can't Thunderbolt off the CPU be implemented in UEFI? It wouldn't solve the hot swap or wake issues, but it would enhance Thunderbolt's killer applications, external graphics and 40 Gbps networking.
Yes, Thunderbolt 3 with 8+ lanes of PCIe 3.0, plus enough power for a top-end laptop discrete GPU. Users would accept plugging a desktop GPU separately into the wall.
That would make it worthwhile for PC and Mac users. Mac buyers have to buy Thunderbolt, but I'll bet the overwhelming majority of Mac Thunderbolt ports lie idle.
jameskatt - Sunday, September 14, 2014 - link
1. The majority of Mac Thunderbolt ports are used to attach a second monitor to the Mac. This may be a Thunderbolt, DVI, HDMI or DisplayPort monitor via adapters.
2. The other group of Mac users attach a Thunderbolt dock to the port, which gives the Mac extra USB3, FireWire 800, eSATA, Gigabit Ethernet, HDMI, or audio in & out ports.
3. And a third group of Mac users use Thunderbolt to connect to RAID storage devices - which can go much faster than USB3 devices can.
TEAMSWITCHER - Sunday, September 14, 2014 - link
Thunderbolt really isn't needed on desktop computers because they have PCIe, but for laptops it should be a standard. USB is an old and ugly standard, with 2.0, 3.0, and 3.1 ports, and soon a new reversible port... none of which can daisy-chain or carry a 4K video signal.
Laptops desperately need something to replace PCIe, and Thunderbolt is perfect. PC makers need to wake up and provide one port on ultrabooks and two on high-end notebooks, matching Apple's MacBook Air and MacBook Pro. 'Nuff said.
londedoganet - Sunday, September 14, 2014 - link
The counterargument is AMD/VESA's DockPort: multiplexed USB 3.0 and DisplayPort. It's cheaper to implement, free of Intel's licensing restrictions, USB 3.0 is already supported by Windows, and it's just "good enough" for most uses.
That said, you'll never get an external GPU with DockPort, and you still can't daisy-chain USB devices (though DockPort-enabled devices can conceivably act as USB hubs, and DockPort itself may support daisy-chaining).
Malih - Sunday, September 14, 2014 - link
I would be interested in Thunderbolt, particularly with an external GPU, but I imagine it would have more mass appeal once Thunderbolt connectivity hubs/docks, like the Belkin Thunderbolt Express Dock, were priced low enough - preferably sub-$100.
djvita - Sunday, September 14, 2014 - link
Wait, was there anything on Cherry Trail? Is it postponed?
surt - Sunday, September 14, 2014 - link
When will they support 4K/120p? I'm not planning to buy a new connector until that's supported.
DanNeely - Sunday, September 14, 2014 - link
You'll need to wait for DisplayPort 1.3 to have enough single-cable bandwidth to support that. 1.3 was supposed to have been finalized in 2014Q2 but appears to have slipped. Earlier this year, Asus said it would take a year or two from the spec being finalized to working hardware hitting the market. For the moment, I'd suggest not holding your breath.
http://www.tweaktown.com/news/38112/asus-reminds-u...
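The bandwidth math behind that (raw pixel data only, ignoring blanking overhead, which only makes DP 1.2 look worse):
```python
# Why 4K@120 needs DisplayPort 1.3: pixel data vs. link payload.
w, h, hz, bpp = 3840, 2160, 120, 24
needed = w * h * hz * bpp / 1e9   # ~23.9 Gbit/s of pixel data

dp12 = 4 * 5.4 * (8 / 10)         # HBR2: 17.28 Gbit/s after 8b/10b coding
dp13 = 4 * 8.1 * (8 / 10)         # HBR3: 25.92 Gbit/s after 8b/10b coding
print(f"needed {needed:.1f} Gbit/s | DP 1.2 {dp12:.2f} | DP 1.3 {dp13:.2f}")
```
DP 1.2 falls several Gbit/s short before you even count blanking; DP 1.3 clears it with a little headroom.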
SirKnobsworth - Monday, September 15, 2014 - link
TB3 will have enough bandwidth for that. Whether it will implement the necessary DP 1.3 is another question.
biostud - Sunday, September 14, 2014 - link
Basically, it's a product the homeowner doesn't need, and it costs an arm and a leg.
Pissedoffyouth - Sunday, September 14, 2014 - link
I just want a solution to the following:
Have a monitor with everything attached to it - power, keyboard, other USB, eSATA, etc.
Let me come home and plug a single cable into my laptop/tablet, which gives me my screen, keyboard, mouse, USB, and eSATA - but also charges my laptop.
And potentially daisy-chain a second or third monitor off it too.
USB 3.1 kind of fits the bill, doesn't it? Or is that more Thunderbolt?
DanNeely - Sunday, September 14, 2014 - link
3.1 could do it. It certainly has the power for a mainstream laptop; the challenge will be convincing the laptop OEMs to replace their current charging setup with a USB-based one. I suspect a lot of initial implementations will still use a separate (probably barrel) plug for power, and only support outbound smartphone/tablet charging levels of power on the 3.1 ports.
Longer term, dunno. Supporting 60/90W in from all ports would add to costs, while only having a single special port for charging would result in consumer confusion. Not to mention problems with people trying to charge their laptop with a ~10W phone charger, or their full-power laptop with a 30W netbook charger, because it's the same plug.
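The confusion is easy to see in numbers: USB Power Delivery negotiates voltage/current pairs, so physically identical plugs can source very different wattages. A sketch with representative pairs (the PD spec defines the exact profiles; the specific bricks here are illustrative assumptions):
```python
# Same connector, very different power budgets under USB Power Delivery.
sources = {            # representative (volts, amps) pairs
    "phone charger":  (5, 2),     # ~10 W
    "netbook brick":  (12, 2.5),  # ~30 W
    "laptop brick":   (20, 3),    # 60 W
    "max PD source":  (20, 5),    # 100 W spec ceiling
}
laptop_draw = 60  # watts a full-power laptop might pull while charging
for name, (v, a) in sources.items():
    w = v * a
    verdict = "keeps up" if w >= laptop_draw else "can't keep up"
    print(f"{name:14s} {w:5.1f} W -> {verdict}")
```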
Pissedoffyouth - Sunday, September 14, 2014 - link
Perhaps they could add power + USB/PCIe to the DisplayPort connector, or a future HDMI?
DanNeely - Monday, September 15, 2014 - link
Both do have an auxiliary data sub-channel capable of carrying USB 2. I don't think it ever got widespread adoption because it was an optional part of HDMI, so OEMs couldn't count on it being there; I suspect the same, possibly combined with the monitor (GPU?) vendor needing to provide a driver, scuttled implementation on the DP side.
USB 3/3.1 would have the same problems, combined with being enough of a bandwidth pig that only the newest versions of the ports have the bandwidth needed to support it at mainstream resolutions. None of them could do it at their headline resolutions, and not offering it there would lead to widespread wailing and moaning from a chattering class much larger than the few of us with 4K/5K displays (@120Hz for DP 1.3) who would actually be impacted.
repoman27 - Monday, September 15, 2014 - link
Intel slides regarding the Alpine Ridge Thunderbolt 3 controller indicate a new connector will be used, and a USB 3.0 signaling mode will be added to the mix along with DP 1.2, HDMI 2.0 and up to 100W power delivery. Looking over the USB 3.1, Power Delivery 2.0, and Type-C Cable and Connector specs, I get the feeling that the Type-C connector will also serve as the next Thunderbolt connector. One port to rule them all...
Zertzable - Sunday, September 14, 2014 - link
Where is Thunderbolt headed? Nowhere. In a few years TB and FireWire will throw a party together.
Morawka - Sunday, September 14, 2014 - link
It's sad you guys didn't bring up the price factor more in the article. When OEMs make anything Thunderbolt-related, they try to charge an Apple tax instead of a healthy 30% margin like everything else.
Thunderbolt external SSD? HAH, let's add $100 instead of 89 cents, which is the cost difference in going from USB 3.0 to Thunderbolt.
repoman27 - Monday, September 15, 2014 - link
The cost difference between implementing Thunderbolt and other consumer I/O interfaces is a *lot* more than 89¢. That being said, something clearly doesn't jibe with the retail pricing of Thunderbolt devices and the controller pricing Intel claims on ARK, and I'd love for someone to report on that in more depth.
JDG1980 - Sunday, September 14, 2014 - link
I just don't see Thunderbolt having a future. External video cards have never been officially supported, and Intel clearly doesn't want them to be supported. Video output can be handled with DisplayPort or mini-DP (1.2 can push 4K@60Hz), and everything else can be connected through USB. With USB 3.1 having reversible connectors, 10 Gbps throughput, and even charging capabilities, where's the niche for Thunderbolt?
The fact that everything related to TB costs an arm and a leg certainly doesn't help, either.
repoman27 - Monday, September 15, 2014 - link
Thunderbolt solves a very particular problem for one of Intel's better customers. Since Apple decided to entirely stop making PCs with user-accessible PCIe slots, they need Thunderbolt for the corner cases where USB alone cannot meet 100% of the expansion needs of their user base. Until Apple changes course or stops using Intel CPUs, Intel will continue to sell tens of millions of Thunderbolt controllers to them.
Thunderbolt, like FireWire before it, is actually quite popular with the professionals who can actually leverage the additional bandwidth. For the time-is-money set, there's a value proposition to be had with Thunderbolt devices despite sticker prices that appear outrageous to the casual observer.
kyuu - Monday, September 15, 2014 - link
Yes, but being popular with a small population of "professionals" isn't good enough. Without broader acceptance, Thunderbolt is doomed to die just like FireWire before it, as competing protocols catch up with its capabilities while boasting the very important advantages of cost and compatibility with what everyone else is using.
Hell, USB 3.x is already going to have a significant chunk of Thunderbolt's bandwidth. Combine that with the ability to supply higher wattage, and there's little reason for people to even consider the wildly expensive and poorly supported Thunderbolt standard. Outside of a small group of professionals, of course.
repoman27 - Monday, September 15, 2014 - link
Actually, it is. FireWire didn't just die because not enough people used it. It was used extensively over the past 15 years, became an essential technology for many market segments, and played a pivotal role in the advancement of certain product categories. In 2011-2012 it was largely displaced by both USB 3.0 and Thunderbolt. There will always be customers for technologies that offer more than the lowest common denominator, and they will generally provide way better margins for the OEMs that cater to them. They will also unceremoniously ditch old standards as soon as something comes along that helps them get their job done faster.
The whole point of Thunderbolt is to offer the fastest form of serial I/O possible in a mass-market PC. Most people will never use it, but then again, the majority of consumers have also never bought a PCIe add-in card or ExpressCard either. Thunderbolt has been offering 5x the bandwidth of USB 3.0 and 2x that of USB 3.1 since 2011. It will double to 40 Gbit/s per port and inherit USB PD 2.0-style power delivery modes with Alpine Ridge, which will accompany Skylake. USB 3.1, on the other hand, won't see Intel integration and thus ubiquity until Cannonlake arrives.
USB will keep getting faster, but it will always be the kind of fast that is also saddled with the requirement to remain cheap and ubiquitous. There will always be a way to design a faster interface if you don't care as much about price and compatibility, and there will always be people who are willing to pay for that kind of performance right now. You can't will away high-margin niche markets. It's not like NVIDIA and AMD are going to stop making $1000+ high-end GPUs or workstation cards just because most people don't buy them.
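Those 5x/2x multipliers check out on the back of an envelope if you treat Thunderbolt's marketed 20 Gbit/s as usable payload and subtract line-code overhead from USB:
```python
# Sanity-checking "5x USB 3.0, 2x USB 3.1" with encoding overhead.
usb30 = 5.0 * (8 / 10)       # 8b/10b line code -> 4.0 Gbit/s payload
usb31 = 10.0 * (128 / 132)   # 128b/132b -> ~9.7 Gbit/s payload
tb2 = 20.0                   # two bonded 10 Gbit/s channels per port

print(f"TB2 vs USB 3.0: {tb2 / usb30:.1f}x")  # ~5x
print(f"TB2 vs USB 3.1: {tb2 / usb31:.1f}x")  # ~2x
```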
Spunjji - Wednesday, September 17, 2014 - link
"USB 3.1, on the other hand, won't see Intel integration and thus ubiquity until Cannonlake arrives."Call me curmudgeonly, but I wouldn't call Intel's continued conflict-of-interest based delays on implementing USB technologies an argument /for/ Thunderbolt so much as yet another one against it.
I broadly agree that this just isn't meant to be a mass-market technology though, so decrying it for not being one is not a great argument to make. I do wish they'd start catering to the external GPU market, though.
Hlafordlaes - Monday, September 15, 2014 - link
There is so much potential in TB for creating the PC equivalent of a home stereo setup, with dedicated components in separate cases. I'd love to be able to do a mini-ITX build with an IGP, then be able to add a dedicated GPU, storage, professional audio, etc. and build a system over time.
I see this as a way for Intel to really leverage its strengths and drive things like the NUC forward, even allowing daisy-chaining off of AIO monitor+PCs. I just don't understand why they are not driving TB adoption more aggressively to create a modular approach that benefits them.
AppleCrappleHater2 - Monday, September 15, 2014 - link
Exactly, my thoughts as well.
Spunjji - Wednesday, September 17, 2014 - link
This is a very neat idea.
Nexing - Friday, September 19, 2014 - link
Thunderbolt is very much needed by those (like me) who want to connect one of the many existing FireWire devices (soundcard, mixer, daisy-chain-something, etc.) to a modern laptop that offers light weight, better screen resolution, all-day battery, touch, or better all-around specs.
///You may now buy a simple TB to FireWire adaptor for US$35
Therefore, the adoption problem resides at both ends of the manufacturing chain: high up at Intel and Microsoft headquarters, and also at the last stage, the laptop manufacturers.
The former has been extensively explained in these comments (thanks, fellows - this and every other TB article simply fail to gather what matters and where it is happening). The latter (the manufacturers' side) has not yet come out with a lightweight (below 2 kg) machine with a powerful CPU (above 2.5 GHz) sporting Thunderbolt connectors. The closest I've seen is HP's ZBook 15, but being 15" it does not quite fit the mobile bill. 2014's ZBook 14 (yet to be released) could finally bring TB to the workstation masses, but there's little hope that it sports a >2.8 GHz-capable CPU... not until Skylake brings those higher clocks at lower temperatures.
Interestingly, this is ANOTHER chapter where new laptop sales have actually been discouraged by Intel vs/+ MS quarrels and lack of joint vision.
Not only for the Pro Audio, Studio, Sound Recording and the plethora of newcoming DJ markets*1, but also for the several commenters here describing our waiting state regarding eGPUs, daisy-chaining of modular devices, the need for 3 instant-plug monitors, the need to reduce cable clutter, etc.
*1: Just add the many millions of sound/audio spectacles held every weekend worldwide, needing mobile workstations, many of them relying not on Apple but on FireWire/glitchy USB.
Nexing - Tuesday, September 30, 2014 - link
To clarify the main point above: in the studio and pro audio world you need to invest in outboard gear (soundcards, dedicated processors, etc.) - very specialized, with a long life cycle - that needs to be connected via a low-latency, high-bandwidth connector.
So far FireWire, PCIe and ExpressCard have been the answer, and thus there is a myriad of audio equipment out there, usually bought at high prices, that needs to be interfaced.
Thunderbolt functions under these conditions and has accessible adaptors suited for the task.
Mobile, powerful laptop solutions with Thunderbolt are the missing link so far.
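To put rough numbers on those low-latency, wide-bandwidth requirements, here is a back-of-the-envelope sketch; the buffer sizes, sample rates, and channel counts below are illustrative assumptions, not figures from any particular interface:

```python
# Rough audio-interface math: latency is set by the buffer size,
# while the raw audio payload is tiny compared to the link bandwidth.

def buffer_latency_ms(buffer_samples: int, sample_rate_hz: int) -> float:
    """One-way latency contributed by a single audio buffer."""
    return buffer_samples / sample_rate_hz * 1000.0

def stream_bandwidth_mbps(channels: int, sample_rate_hz: int, bit_depth: int) -> float:
    """Raw (uncompressed) audio payload in megabits per second."""
    return channels * sample_rate_hz * bit_depth / 1e6

# Example: a 64-sample buffer at 96 kHz -> ~0.67 ms per buffer.
print(f"{buffer_latency_ms(64, 96_000):.2f} ms per 64-sample buffer at 96 kHz")

# Example: 32 channels of 24-bit/96 kHz audio -> ~74 Mbps,
# a small fraction of FireWire 800 (~0.8 Gbps), let alone Thunderbolt.
print(f"{stream_bandwidth_mbps(32, 96_000, 24):.1f} Mbps for 32ch 24-bit/96kHz")
```

The takeaway is that the pro audio constraint is consistent low latency rather than raw throughput, which is part of why FireWire- and PCIe-based interfaces have held on for so long.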
sorten - Monday, September 15, 2014 - link
Outside of the external GPU use case, I have zero interest in Thunderbolt.

Narg - Monday, September 15, 2014 - link
Don't worry. There will be a newer standard when the need for this becomes high enough. Thunderbolt was a nice try, but still doesn't meet real future needs.

Seated - Monday, September 15, 2014 - link
There are definite polar ends to people's like/dislike of Thunderbolt. I for one am looking forward to Thunderbolt's implementation in the Skylake chipset on the Intel platform. One of the reasons I have not moved to Thunderbolt yet on the PC platform is the inability of the Thunderbolt docking stations to support more than one monitor (I have three 1920x1200 monitors). Yes, you can have two monitors, but one of them must be Thunderbolt. I have checked with two different vendors - Sonnet and Startech - and both have told me that I cannot do a triple-monitor setup. I have wondered if a DisplayPort MST hub would solve the problem, but I doubt it.

Anyway, thank you for keeping up on Thunderbolt, and I look forward to hearing definitively whether Thunderbolt 3 will support three monitors (I would think it would with 40 Gbps of available bandwidth). The notes from last April said that it would support two 4K monitors, so I would assume so. I'm just sorry that it is a year out until processors using the Skylake chipset will be available. Like others have said, I trust the price premium for Thunderbolt cables will come down, especially Corning's cables - ouch!
repoman27 - Tuesday, September 16, 2014 - link
A DisplayPort MST hub such as the Startech MSTMDP123DP (which claims Thunderbolt compatibility) should work in conjunction with Thunderbolt 2 and a GPU that supports DP 1.2 MST output, and should have no problem handling 3x 1920x1200 displays. However, OS X support may be entirely lacking at this juncture, if that matters to you.

The other alternative is to use a dual-head adapter, such as the Matrox DualHead2Go Digital ME Graphics eXpansion Module, for two of the displays. It presents multiple displays to the OS as a single surface so that only one display output stream is required. And if you really need to drive all three of your displays from a single Thunderbolt port, you could daisy-chain a Thunderbolt dock with a display output, a second dual-port Thunderbolt device, and then a Matrox DualHead2Go. Although that would be at least $520 worth of hardware by my calculations just to drive three 1920x1200 displays.
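As a rough sanity check on the bandwidth side, three 1920x1200@60 streams fit comfortably inside a single DP 1.2 HBR2 link. This is a back-of-the-envelope sketch: the ~20% blanking overhead is an assumed conservative figure (actual CVT-RB timings are lower):

```python
# Back-of-envelope check: can a single DP 1.2 (HBR2) link feed
# three 1920x1200 displays at 60 Hz via MST?

def display_gbps(width, height, refresh_hz, bpp=24, blanking_overhead=1.20):
    """Approximate link bandwidth for one display, in Gbps.
    blanking_overhead ~1.2 is a conservative assumption; CVT-RB is lower."""
    return width * height * refresh_hz * bpp * blanking_overhead / 1e9

per_display = display_gbps(1920, 1200, 60)
total = 3 * per_display

# HBR2: 4 lanes x 5.4 Gbps, minus 8b/10b encoding -> 17.28 Gbps of payload.
HBR2_PAYLOAD_GBPS = 4 * 5.4 * 8 / 10

print(f"~{per_display:.1f} Gbps per display, ~{total:.1f} Gbps for three")
print(f"HBR2 payload capacity: {HBR2_PAYLOAD_GBPS:.2f} Gbps")
assert total < HBR2_PAYLOAD_GBPS  # plenty of headroom
```

So the MST hub approach is not bandwidth-limited; the sticking points are GPU MST output support and OS drivers, exactly as described above.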
StrangerGuy - Tuesday, September 16, 2014 - link
A cable standard with virtually no consumer demand, because consumers simply don't care how many separate cables are hooked up to stationary boxes, and no industry support, because it reminds the industry of Intel's RDRAM strong-arming tactics back in the late 1990s? Nope, won't fail at all.

Just look at how spectacularly Intel failed to stealth-push TB to OEMs with the Sandy Bridge non-USB3 chipsets, when everyone just gave them a big middle finger and went with third-party USB 3.0.
Spunjji - Wednesday, September 17, 2014 - link
Or, as in the server market, just skipped USB 3.0 capability altogether for 4 painful years.

repoman27 - Wednesday, September 17, 2014 - link
"Just look at how spectacularly Intel failed to stealth pushing TB to OEMs with the Sandy Bridge non-USB3 chipsets when everyone just give them a big middle finger and went third party USB3.0."Considering Apple negotiated an exclusive on Thunderbolt for the entire Sandy Bridge era, your conspiracy theory is a bit laughable. Basically, if you weren't Apple, your options for SNB platforms were to ship with USB 2.0 only or throw in a discrete USB 3.0 controller which was a very reasonable ~$5 adder. When IVB rolled around, everybody got the most performant USB 3.0 solution available included in the chipset at no additional cost, and also had the option of including Thunderbolt as a brutal ~$20 adder with zero ecosystem of devices approved for use under Windows. And you're telling me that was Intel's big play to push Thunderbolt to OEMs?
Intel was responsible for creating the xHCI spec, and while it annoyed many people that they refused to finalize the xHCI 1.0 spec until the tape-out of the 7-series chipsets was completed, it's far more likely that they took their time because shipping 200+ million chipsets with a show-stopping bug due to incomplete validation was something they simply couldn't risk. Given the competitive climate, or complete lack thereof, it wasn't like Intel really needed to take much in the way of risks, period. Only 70 million USB 3.0-enabled devices total shipped in 2011. Nobody else had the kind of exposure Intel would have had if they'd attempted to get USB 3.0 integrated in time for Sandy Bridge.
Rick83 - Tuesday, September 16, 2014 - link
Why are there no screens with an integrated PCIe graphics enclosure on the market? I would imagine this to be a great combination, allowing easy docking of laptops with a single cable.

For all those who are burdened with laptops, this would be enough to spare them from getting a desktop as well, for almost all use cases (given a sufficiently powerful CPU that can keep itself cool).
I don't think this would add significant cost to the already expensive Thunderbolt displays, beyond a higher-spec power supply, an ugly case on the rear, and some circuitry.
Of course, as long as Apple is the ambassador of TB, this will not happen.
Shiitaki - Thursday, September 18, 2014 - link
Thunderbolt is a big deal on the Mac because you can't put anything in an Apple computer; it's the only high-performance connection available.

Thunderbolt has been out now for 3 years, and it's going nowhere fast in the PC world. Intel would have had to integrate it into every Z77 and newer chipset to drive adoption.
It is hard to justify connecting your drives through a Thunderbolt enclosure when the internal connections are both free and faster: Thunderbolt has more latency than the built-in SATA ports. Intel needs to visit Redmond and smack them around some, but then Intel carries much of the blame themselves. The Thunderbolt chip costs 10 bucks in bulk? I could buy a really nice hardware RAID card for what the Thunderbolt enclosure I am going to buy costs, if only I could put a card in my iMac. But I have no choice.
It doesn't matter how cool the hardware is if there is no software to make it work. It doesn't matter how cool the hardware is if people can't afford it.
Right now I am looking at buying a Thunderbolt enclosure for 460 dollars because I love the Apple operating system that much. But I could build most of a PC for that money, and it doesn't come with a single HDD. The saddest part is that no matter what, the performance will never match simply connecting the SSD I want to use to a SATA port on a motherboard. Thunderbolt doesn't have anything to offer to justify the cost; after three years I think it's obvious that Thunderbolt has already gone the way of FireWire. I find myself asking why 10 Gigabit Ethernet has to be so expensive, because if those interfaces were affordable, a SAN would be a good alternative.
soapbox - Wednesday, September 24, 2014 - link
The following in this article is a blatant LIE: "However, the drivers have been updated to enable 'Thunderbolt Networking'. This involves linking multiple PCs / Macs with Thunderbolt cables. A 10 Gbps network is automatically created (in the form of a 'dummy' network adapter)."

This is not true. If you connect two PCs with Thunderbolt, NOTHING HAPPENS! Thunderbolt on the PC is only good for certain storage devices. Please do not post lies like this.
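For anyone who wants to test this claim on their own hardware, here is a minimal sketch, assuming the Thunderbolt Networking driver (where present) exposes the link as an ordinary network adapter, as the article describes; psutil is a third-party package (pip install psutil):

```python
# List network adapters and their link speeds; if Thunderbolt Networking
# is active, a ~10000 Mbps adapter should appear after connecting the cable.
import psutil

for name, stats in psutil.net_if_stats().items():
    state = "up" if stats.isup else "down"
    print(f"{name}: {state}, {stats.speed} Mbps")
```

Whether such an adapter shows up at all depends on having the updated Thunderbolt drivers installed on both machines, which may explain the discrepancy between the article and this commenter's experience.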