What an incredible week this has been for Nvidia. It started out with them introducing the most revolutionary new architecture in a decade and ended with stories about them getting kicked out of the chipset business and giving up the gaming market.
Fermi is an amazing piece of technology—period. The Nvidia haters (they are legion I’m told) have been quick to point out that it is overloaded with silicon not needed for gaming, and is big, and won’t be available in time for this year’s holiday madness. And actually much of that is true—but wait—there’s more.
Back in the days when I built hot rods, we would buy the car with the biggest engine we could afford—and then strip off everything we possibly could to focus its horsepower on the issue at hand—being first off the line and first to the finish line. We didn’t start with a tiny engine and try to build it up. So off came the air conditioning, out went the power steering, the bumpers were pulled, the back seat thrown out, and we’d even pull the gas tank and replace it with a can—this car only had to go 1,350 feet.
Fermi is the same kind of thing. If there’s stuff in it that’s not needed for gaming, don’t you think Nvidia is smart enough to know how to make a stripped-down derivative? But before you can start stripping down, you gotta have something big and powerful to work with.
Getting out of the game business—I don’t think so. Getting out of the limelight since they won’t have a new product for the holiday—you betcha. Don’t read too much into those tea leaves.
Chipsets
And then there’s the sad news about them being driven out of the chipset business. This story has more spin on it than a quantum precession processor. The first words about the rift between Intel and Nvidia over chipsets began to appear in January 2008. When Intel revealed their high-speed (25.6 GB/s) HT-like QPI CPU serial interface, they told Nvidia to not even think about making an X58 clone part because Nvidia didn’t have a license to the new interface. The two companies flustered and blustered in the press and on the web until Intel filed in May for a declaration that Nvidia could not claim it had a license. That was an unusual move, but it seemed to be effective, and the noise level from Nvidia quieted down except for a few “yes we can too” murmurs.
After Computex, in August 2008, “Nvidia hater” Charlie Demerjian, who was working for The Inquirer at the time, declared that Nvidia was exiting the chipset business, had shut down development in its Shanghai lab (acquired from ULi in 2005), and had reassigned the troops there.
In December Nvidia renamed the FSB-based GeForce 9400M as Ion and went after Atom sales.
Last week statements from various Nvidia people, mostly PR, set the web hive buzzing; every Webster was trying to be the first to post the most shocking story about Nvidia’s abandonment of its chipset business. Boy, I’d hate to be an inside PR person at AIN (AMD-Intel-Nvidia), running around all the time stomping out fires, issuing corrective statements, trying to get the spin right—an awful, ulcer-creating job. But I digress.
Somebody at Nvidia was asleep at the switch, that’s for sure. Letting word get out about halting any business or product is the same as the Osborne effect (announcing a new product before you have it, effectively killing your current sales). What’s the first thing consumers do when they hear such a story? Avoid buying the product because it’s dead. What’s the first thing OEMs do when they hear such a story? Stop ordering parts. Gwak! So if you’re going to EOL a part, you do it quietly and in private with your OEMs—not in public forums where the story gets multiplied by the minute.
After Nvidia realized the size of the flap, the hard-working and underappreciated PR folks leapt into action and started issuing corrective statements:
On Intel platforms, the NVIDIA GeForce 9400M/ION brands have enjoyed significant sales, as well as critical success. Customers including Apple, Dell, HP, Lenovo, Samsung, Acer, ASUS and others are continuing to incorporate GeForce 9400M and ION products in their current designs. There are many customers that have plans to use ION or GeForce 9400M chipsets for upcoming designs, as well.
We will continue to innovate integrated solutions for Intel’s FSB architecture. We firmly believe that this market has a long healthy life ahead.
We expect our MCP business for both Intel and AMD to be strong well into the future.
We expressed our view on this in March. And to anyone reading this who’s interested, I’ll let you have the report at a 50% discount; give me a call (415-435-9368) or send an email to [email protected].
Nvidia just needs a little time, and a little less press exposure (they could help that by shutting up for a while). Things are in motion: Fermi is coming along, and it will be a good game engine. Nvidia will still make chipsets for as long as AMD or Intel make them. And they are going to take some lumps until then.
Epilog
Ion 2 should launch in a few months, or Q1 2010. It should have much faster graphics and many more shaders; since the current Ion has 16 shaders, the new one may have more than 32. It should also be faster and may be built on a 40 nm process. This will improve gaming for netbooks and nettops, and CUDA applications will run faster. Apple will no doubt use it with CULV notebooks.