Someone just sent me an email and asked if I thought Intel might buy Nvidia now that Larrabee is dead. I would have just answered it and moved on if I hadn't then gotten a phone call asking the same dumb question.
Intel won’t buy Nvidia for the following reasons:
Larrabee isn't dead: there will be a Larrabee graphics chip, based on the x86 architecture, and there will be a whole family of Larrabee chips. Wishful thinking won't make Intel or its ambitions go away. The company has made, and continues to make, huge investments in graphics technology and the graphics space.
Intel believes that all esoteric architectures, among which it counts the GPU ASIC, will fade away and that only the x86 architecture, which has endured for the past 40 years, will prove to be universal. As recent proof, Intel points to the fate of the Cell processor.
The cultural differences, acrimony, and belligerence between Intel and Nvidia run so deep that it would be impossible to blend the organizations without a few homicides.
It’s unlikely, regardless of how big Intel’s checkbook is, that the two companies could ever agree on the price.
Nvidia's board and shareholders would never approve a friendly acquisition by Intel, and Nvidia has a multi-voting structure that would delay any hostile attempt for over a year.
If Intel could buy Nvidia, one of the first things it would do would be to dump the ARM-based Tegra product, just as it dumped the ARM-based XScale product, because it believes x86 has a more promising and scalable future. Given the huge goodwill Intel would have to pay to get Nvidia, selling off an asset at breakeven at best would hardly endear the company to Wall Street or its shareholders.
Intel doesn’t need Nvidia
But most important is the fact, and it is a fact, that Intel doesn't think it needs Nvidia. The company has all the graphics IP it needs from Imagination Technologies, plus its own labs. It's not that Intel couldn't build a GPU, but rather that the company doesn't see today's GPU architecture as having long legs; they don't think it will scale, and it certainly can't do MIMD. To Intel it's a dead end, so why invest in it?

I need to say this again because it really is a critical difference in the basic philosophies of the two companies. None of the events of the past week have had anything to do with hardware design. Yes, GPUs are hard to design. So are CPUs. So is any billion-transistor part. Intel simply doesn't see a future for the conventional SIMD GPU architecture. Right or wrong, that's where its analysis leads, and you can huff and puff about it all you want; Intel is not going to change its mind on that matter.
It's naïve speculation
It's naïve to evaluate the computer industry as though it were a chessboard and say that if White takes the bishop, then Black has to take the queen. It just doesn't work that way, and it never has. Remember the rumors and speculation floating around when AMD bought ATI? Back then the smart folks all knew for certain that Intel had to buy Nvidia. Most of the same reasons that made the idea absurd then apply just as well now.
Then remember a year or so later, when AMD's fortunes looked bleak and the smart people knew for certain that Nvidia would buy AMD. Those were uninformed, unsophisticated, historically unfounded conclusions based on a bowling-alley scorecard. The PC industry isn't a sport. If you want to forecast the industry, you'd better understand its working parts, the history of its people, and the technologies within it.
Never say never
Another thing you learn if you've been in this industry a while is to never say never. So with that caveat, I guess I can't say Intel will never buy Nvidia, but if it does, it won't be the Nvidia we know today.