Jon Peddie Blogs

Larry’s Bee - Part Two

Posted by Jon Peddie on August 7th 2008
Categories: The Market

Last week I put forward the postulate that Intel could justify the investment in Larrabee (Larry’s Bee) on the basis of obtaining some level of parity with the incumbents, mainly ATI and Nvidia. And in my rush to post and then catch an airplane, I included the total discrete graphics semiconductor market, not just the desktop discrete market – sigh. OK, I’ll eat a little humble pie and give you the right numbers, but more importantly I’ll suggest something bigger.

First the numbers

I mistakenly used the total GPU market (desktop and notebook) in my calculation of Intel’s TAM – although the charts revealed the real numbers.

In any case, the desktop potential in 2010 is 100 million units. I’m not including notebooks because I don’t think Intel will have a mobile version of the Bee by 2010.

So, same speculation: if they hit parity by the end of 2010, that gives them a TAM of $3.3 billion, a little less than the $4.6 billion I mistakenly quoted. But read on….

Larry’s Bee may be more

Some of you have seen my forecasts on the expansion of the discrete GPU market due to expanded use of Crossfire and SLI down to the mainstream level, the deployment of hybrid architectures (with integrated and discrete GPUs), and GPU-compute.

Now I want to suggest that Larry’s Bee will have a synergistic impact, and not be just a spoiler for ATI, Nvidia, and others.

Not just a spoiler

Intel has been steadily leaking information about Larry’s Bee for almost two years. Some might call that salting the mine. Others suggest it is a deflection and FUD program designed to defer purchases of current parts and make the consumer wait for the Bee. Not that many, if any, would put such tactics past Intel, but I think there’s more going on than even Intel may realize. Intel, by the very nature of its marketing muscle and brand strength, will attract new attention to the 3D world and to gaming in particular.

Regardless of the quality or performance of the Bee, Intel’s brand will raise awareness of PC gaming and give the industry increased credibility as a viable entertainment vehicle and a safe bet for a consumer’s purchase. No doubt my friends at ATI and Nvidia are not going to be happy with this legitimization of the Bee as I’m suggesting it will happen, but they should be.

Why?

Because as the incumbents they (ATI and Nvidia) have made the industry what it is today. Others, like 3Dfx, Matrox, and S3, also contributed in the past, but it was ATI and Nvidia who kept at it, had the passion, and made the investments. Their work has brought new gamers to the fold, and that has paid off handsomely for them. In fact, the numbers show why Intel, the only company with the resources needed to enter the market, wants a piece of the action.

Intel also wants a piece of the action because, as the graphics vendors expand their influence on users and increase the GPU’s ability to accelerate functions beyond gaming, Intel is in danger of seeing the CPU overshadowed by the GPU.

At Jon Peddie Research we think there are still more people who are curious about PC gaming and would like to try it, but are just a little timid. They’ve tried the free stuff from the web and what comes bundled with a new PC, and while entertaining for a while, it usually doesn’t hold their interest. However, FPS, racing, RPG, and RTS games would. The action, the cinematic quality of the images, and in some cases the stories are attractive. Something is needed to push them over the edge. Something to get them to invest the extra $100 to $300 for a graphics board good enough to handle the richness of today’s games. Intel will be spending a lot of money to convince them that they want to invest in a good, gaming-capable machine.

And the Nintendo

And lest we forget, look at how the Wii resonated with people who never thought they’d be interested in games.

The vixen Vista

We think Vista is a model for such consumers. Why Vista? Because Vista introduced the notion that to use it you had to have good graphics. Rather than being a turn-off that scared people away, it was a catalyst for new PC sales and, to a certain extent, aftermarket sales of graphics AIBs.

If a boring operating system that does nothing entertaining can have that kind of effect (largely due to Microsoft’s marketing effort combined with that of some of its OEMs), imagine what the impact of an Intel could be on PC gaming.

So we did

We put our heads together, looked at all our models, especially the ones in the new Gaming PC Market reports we’re introducing soon, and decided that Intel’s entrance, combined with the support of the game publishers and developers (who will lavish praise and testimonials on the Bee and Intel), plus the support of Microsoft and the OEMs (Dell, HP, etc.), will make the market jump a full five percentage points, or something north of 1.5 million new users, growing to 10 million in three to five years.

Newbees not loyal

I don’t think these new PC gamers, once they take the plunge, will remain loyal to Intel, or anyone else. The Bee is going to have to earn and keep the loyalty of these newly hatched gamers. As they get involved with PC gaming and learn the ins and outs, they will change allegiances annually at the least. Some new gamers, in fact, will enjoy tuning their machines, joining communities, and debating the merits of various hardware rigs, and they’ll be the first to jump.

And yeah, Intel is going to steal some market share from the incumbents; that always happens, and sticking your head in the sand, or putting too much effort into discrediting the Bee, isn’t going to have much if any effect on its impact. It’s at best a distraction, and at worst a drain.

Discuss this entry

If Larrabee were a traditional GPU, ATI and NVIDIA would have some reason to worry, as Intel has great manufacturing and fabrication technology.  If Larrabee were from some company other than Intel, just because of its novel and flexible substrate, ATI and NVIDIA would have some reason to worry.

But the real reason for NVIDIA and ATI to worry is that Intel also makes the x86 chips that sit right next to today’s GPUs.  In its first incarnation, Larrabee is a discrete GPU.  But in its next iteration, some of the Larrabee cores will be integrated on the main processor die, replacing Intel’s integrated graphics.  In that domain, every PC would have reasonable graphics (say, within 5x or 10x of the best GPU available).

This pushes NVIDIA into the very high-end discrete GPU market and the low-end embedded GPU market for mobile devices.

ATI/AMD has the same advantage that Intel has, but they haven’t embraced the opportunity as Intel has.

By ArchitectureProfessor on 2008 08 08

JON - That’s a good point, and Intel has said we (the industry) should expect processors with thousands of cores. So if some of those cores can run some graphics functions, then that is the ultimate integrated CPU.

But Intel won’t be alone - AMD will beat them to market with an integrated CPU & GPU, and as for adding more x86 cores, AMD beat them to market on that too.

Needless to say, all this proves is what ATI and Nvidia have been saying for the last 12+ years - graphics is where it’s at.

By jon peddie on 2008 08 08