NVIDIA Analyst Day: Jen-Hsun Goes to the Mat With Intel
by Derek Wilson on April 11, 2008 12:00 AM EST - Posted in
- GPUs
Introduction
Hot on the heels of the launch of their 9800 series products, NVIDIA is holding a Financial Analyst Day. These are generally not filled with the normal technical glitz and glitter an Editors Day has, but the announcements and material covered are no less important to NVIDIA as a company. NVIDIA has an unusually large institutional ownership rate at 84% (versus 79% and 66% for AMD and Intel respectively) so the company holds these Analyst Days in part to keep its institutional investors happy and well informed about the company’s progress.
As far as we members of the press are concerned however, Analyst Days are a valuable chance to learn about the GPU market, and anything that could impact the bottom line can help us understand NVIDIA's direction, motivation, and even the reasoning behind some of the engineering decisions they make. Today saw a lot of posturing for battles to come, and we were not disappointed.
Waking up the Beast
Most of the morning was dedicated to NVIDIA taking some time to do a little PR damage control. They've stepped out to defend themselves against the doom and gloom statements of other players in the industry. With Intel posturing for a move into the graphics market and proclaiming the downfall of rasterization and discrete graphics at the same time, NVIDIA certainly has reason to address the matter.
And we aren't talking about some standard press release boilerplate filled with fluffy marketing speak. This time, Jen-Hsun Huang, the man himself, stepped out front and addressed some of the concerns others in the industry have put forth. And he was out for blood. We don't get the chance to hear from Jen-Hsun too often, so when he speaks, we are more than happy to listen.
One of the first things that Jen-Hsun addressed (though he didn't spend much time on it) is the assessment by Intel's Pat Gelsinger that rasterization is not scalable and won't suit future demands. He largely just threw this statement out as "wrong and pointless to argue about," but the aggregate of the arguments made over the day all relate back to it. The bottom line seems to be that Intel's current approach to graphics can't scale fast enough to meet the demands of future games, but that says nothing about NVIDIA's and AMD's solutions, which are at least one if not two orders of magnitude faster than Intel graphics right now. In fact, at one point Jen-Hsun said: "if the work that you do is not good enough … Moore's law is your enemy."
This seems as good a time as any to address the tone of the morning. Jen-Hsun was very aggressive in his rebuke of the statements made against his company. Many times he talked about how inappropriate it is for larger companies to pick on smaller ones through the use of deceptive marketing tactics (ed: Intel is 11.5 times as large as NVIDIA by market cap). To such attacks, he says "It's just not right!" and "we've been taking it, every single fricking day… enough is enough!" NVIDIA, Jen-Hsun says, must rely on the truth to carry its message in the absence of massive volumes of marketing dollars.
Certainly, things can be true even if they paint a picture slightly different from reality, but for the most part what Jen-Hsun said made a lot of sense. Of course, it mostly addresses reality as it is today and doesn't speculate about what may be when Larrabee hits the scene or if Intel decides to really go after the discrete graphics market. And rightly enough, Jen-Hsun points out that many of Intel's comments serve not only to spread doubt about the viability of NVIDIA, but will have the effect of awakening the hearts and minds of one of the most tenaciously competitive companies in computing. Let's see how that works out for them.
43 Comments
segerstein - Saturday, April 12, 2008
I read the article, but I wasn't wholly convinced by the arguments made by the CEO. As we have seen with the Eee PC and other low-cost computers, the current technology was about serving the first billion people. But most people still don't have computers, because they are too expensive for them.
Nvidia, not fully addressing even the first billion because of its expensive discrete solutions, will see its market share shrink. Besides, there are many consumer electronics devices that would benefit from a low-powered "system-on-a-chip".
Intel has Atom+chipset, and AMD bought ATI precisely because it wants to offer a low-powered "system-on-a-chip" (but also high-performing multicore parts).
It would only make sense for Nvidia to buy VIA. The VIA Isaiah processor seems promising. This way they could cater to a smaller high-end market with discrete solutions and to a growing market for low-cost integrated solutions.
BZDTemp - Saturday, April 12, 2008
Seems Nvidia does not like to be on the receiving end. I do remember Nvidia spreading lies about PowerVR's Kyro 3D cards back when it looked like they might have a chance to be the third factor in 3D gaming hardware.
With ATI/AMD in crisis I think it's great that Nvidia and Intel are starting to compete, even though I sincerely hope ATI/AMD comes back strong and kicks both their asses. After all, I can't recall the red/green guys using unfair tactics, and I like to see integrity rewarded.
Finally, I would like Anandtech to be more critical when reporting from such venues. Try googling Kyro, Nvidia, and pdf to find the old story, or just check out the PDF directly: ftp://ftp.tomshardware.com/pub/nvidia_on_kyro.pdf
duron266 - Saturday, April 12, 2008
"Jensen is known as a very passionate, brilliant and arrogant guy but going against Intel on a frontal full scale might be the worst thing that they ever decided. Nvidia went from close to $40 to current $19.88 which means that the company has to do something to fix this but this is simply too much."duron266 - Friday, April 11, 2008 - link
NVIDIA...too high-profile,if they were going to vanish,
Jen-Hsun would be the number one to blame...
anandtech02148 - Friday, April 11, 2008
There's a huge difference between audio being processed on a many-core CPU like Intel's and on a standalone PCI card. Putting the PCI card in, you can feel the CPU is less bogged down, and the motherboard chipsets generate less heat.
An integrated GPU, audio, and many cores doesn't solve the problem; there will be bandwidth issues too.
Nvidia should hit Intel hard with a low-powered, high-performance GPU to prove a point.
epsilonparadox - Friday, April 11, 2008
NVidia will never be able to compete in the low-power arena with Intel. Intel just has a better process and fabs for that process. NVidia has other companies building its chips. Plus, graphics chips don't move to a new process the way CPUs do.
poohbear - Friday, April 11, 2008
Very nice article, but how many of us are gonna understand the analogy: "someone's kid topping off a decanted bottle of '63 Chateau Latour with an '07 Robert Mondavi."
wtf is that?!? im guessing he's talking about wine, but whatever.
kb3edk - Friday, April 11, 2008
Well of course it's a wine reference; consider the audience: institutional investors. These are people who are much more likely to spend $500 worth of disposable income on a bottle of Chateau Something-Or-Other than on a GeForce 9800 GTX.
Also note the Mondavi reference, because they're in Napa, just on the other side of San Fran from nVidia HQ in Silicon Valley.
And it's still a bit odd seeing such strong words from nVidia against Intel considering that nVidia/Intel is the main enthusiast platform out there these days (as opposed to an all-AMD solution).
Khato - Friday, April 11, 2008
Really quite enjoyed this; it makes me all the more confident in the products Intel is currently developing.
I mean really, how similar does NVIDIA's ranting sound compared to AMD's back when they were on top? No question that they're more competent than AMD, but they've done just as good a job of awakening a previously complacent beast in Intel. Heh, and they've never had to compete with someone that has a marked manufacturing advantage before...
tfranzese - Sunday, April 13, 2008
Intel is no beast in these parts. Their track record in the discrete segment and in drivers to this day is one of complete failure. Until they execute on both the hardware and the software, both monumental tasks, they'll continue to be right where they are in the discrete market (i.e., nowhere).