How would you measure it?
In my travels I’ve been in various discussions of late about the value of graphics—are they important?
The short answer: it depends a lot on the content. The example I use in my university lectures is the beautiful Aki in Final Fantasy: The Spirits Within (2001).
The graphics rendering in the first full-length CG film broke new ground in realism, and the characters looked fantastic—even more so given it was 13 years ago. But the graphics couldn’t save the movie because it had no story, or at least no story anyone cared about. The movie industry learned not to waste its budget on imagery at the cost of other important storytelling elements, and shifted to cartoon-like characters in The Polar Express (2004), Monsters vs. Aliens (2009), and the string of animations from DreamWorks and Pixar. In 2014, Appleseed Alpha was released, based on a game, and here again it looked great but had no story unless you were a fan of the game.
However, the graphics are now really great, and that wasn’t lost on the game developers and engine builders. The engines were there, so let’s exploit them.
Those game developers exploited the machines available to them: the mighty PC, the less mighty but surprisingly good consoles, and even tablets and phones. However, like the movies (and the music market), it is a “hits” business, and if you bring out a crappy game, it doesn’t matter how much graphics power, screen space, or CPU power you have—bad is bad. Unless you happen to be making Transformers movies, in which case all logic goes down the tubes.
The game developers try to mitigate their costs and risks by bringing out games for all, or at least many, platforms. And then something happened—users, in particular console users, voted with their dollars. Not a new phenomenon, but this time it was really striking. The PS4 started to pull ahead and then ran away from the pack. What was going on? Many of the games available for the PS4 were also available for the Xbox One and the Wii U. What was different?
It’s the graphics, Jake. Da graphFix.
All three units use a GPU design from AMD, and all three were introduced within weeks of each other. The Xbox was more expensive, and the Wii U was a lot less expensive; Microsoft dropped its price to be more competitive, and yet the PS4 continued to pull ahead. What was different? I’ll tell you what. The Wii U has a 320-core GPU, and it has sold the fewest units. The Xbox One has 768 cores, and the PS4 tops the chart with 1,152 cores—one and a half times as many as the Xbox, and 3.6 times as many as the Wii U. Hmmmm.
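For readers who want to check the arithmetic, here is a quick sketch that works out those ratios from the core counts quoted above (the counts are the only inputs; the snippet is just illustrative):

```python
# GPU shader-core counts as quoted in the text (all three are AMD designs)
ps4_cores = 1152
others = {"Xbox One": 768, "Wii U": 320}

# Ratio of the PS4's core count to each competitor's
for console, cores in others.items():
    ratio = ps4_cores / cores
    print(f"PS4 has {ratio:.1f}x the GPU cores of the {console}")
# prints:
# PS4 has 1.5x the GPU cores of the Xbox One
# PS4 has 3.6x the GPU cores of the Wii U
```

Both ratios come out exactly: 1,152/768 = 1.5 and 1,152/320 = 3.6.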
I’m sure there are a hundred other factors involved, and they will surely be pointed out to me by the Wii U and Xbox One folks, fans, and apologists, and I welcome those insights. But for a simple-minded guy like me, who is easily distracted by bright, shiny things, it’s really hard to ignore the correlation between good graphics and great sales.
The Core M GPU discussion in this issue makes the point that Intel has done a U-turn and given more of its precious 14-nm silicon to graphics than to its crown-jewel x86 processor—what’s that all about? Simple: we look at screens, so if you want to win our eyeballs and therefore our wallets, you had better put something great on the screen in front of us.
Apple and Samsung get this, and you can see it in what they lead with—screen size, resolution, and, gosh, graphics power—imagine that (no pun intended).
So I’m going to go out on a limb here and suggest graphics really are important. And yes, yes, of course, story and content are too—the most important elements, in fact. But if you have a great story, game, movie, whatever, and it looks cheesy, clunky, or compromised, we’re out of here, ya hear. So if you want to sell stuff, you had better get your graphics game on.