Jon Peddie Back Pages - It's all about the pixels

From sharing to virtualizing in just under 60 years

Posted by Jon Peddie on November 12th 2014 | Discuss
Tags:

A technology whose time has come has been with us for a long time

With its introduction in the 1960s and emergence as the prominent model of computing in the 1970s, time-sharing represented a major technological shift in the history of computing: many users able to share one computer. The sharing was done by dividing processor time into units called slices. Later developments modified that approach with load balancing and priority interrupts, but the basic concept of sharing a single resource was unchanged. (The concept was first described publicly in early 1957 by Bob Bemer as part of an article in…
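The time-slice idea is simple enough to sketch in a few lines. Here is a minimal, purely illustrative round-robin simulation; the job names and the quantum value are invented for the example, not drawn from any historical system:

```python
from collections import deque

def round_robin(jobs, quantum):
    """Simulate time-sharing: each job gets a fixed slice (quantum)
    of processor time, then yields the machine to the next job."""
    queue = deque(jobs)               # (name, remaining_time) pairs
    clock = 0
    while queue:
        name, remaining = queue.popleft()
        run = min(quantum, remaining)  # run for at most one slice
        clock += run
        remaining -= run
        if remaining > 0:
            queue.append((name, remaining))   # back of the line
        else:
            print(f"{name} finished at t={clock}")

# Three users sharing one machine, 4-unit time slices
round_robin([("alice", 10), ("bob", 3), ("carol", 7)], quantum=4)
```

Each user perceives a machine of their own; in reality they are all getting turns a few milliseconds at a time, which is the whole trick the column describes.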

Future visions in the third quarter

Posted by Kathleen Maher on October 29th 2014 | Discuss
Tags: idf ptc q3 google io

The third quarter this year was apparently a good quarter for looking at future technologies. Many of the companies we follow held either their large event of the year or strategic meetings in which they unveiled secret plans. In many cases we were treated to extravagant technology promises. Google I/O of course was wild with magical watches and instant 3D environments. But the company to watch is PTC, which is steadfastly reinventing itself. The company is building tools for developing IoT products and, you know, that’s just brilliant. Everyone is talking about the millions and millions of these things out there…

It’s time to get excited about 4K

Posted by Jon Peddie on October 22nd 2014 | Discuss
Tags: tv hd 4k plasma uhd tv lcd

The days of being a denier are over

There are three things needed to get a market going: demand, supply, and affordability. When a new market segment is opened, the world pretty much splits into two main camps: naysayers and enthusiasts. Within each camp are the curious and the fearful. When 4K was introduced, the naysayers were quick to announce it was too expensive, there was no content, and there was no content delivery system. They also might have said there was limited supply, and a lot of them said you can’t see the difference, or why do we need this…

Made any 4K videos lately?

Posted by Jon Peddie on October 8th 2014 | Discuss
Tags: 4k

Do you remember when YouTube first hit the scene? Do you remember people saying, “Who the hell wants that? Why would I want to look at some dopey home video? How will they ever make money?” and all the other usual tripe that narrow-minded people spout when confronted with something new and different? Today, of course, they’re wishing they had bought shares in the company then. Since then it’s gone from an $11 million startup to a company valued at $40 billion, and in the process made a lot of smart moves and bets. One of them is its 4K channel…

Are graphics worth it, do they matter?

Posted by Jon Peddie on September 23rd 2014 | Discuss
Tags: nvidia gpu amd apple samsung cg movie

How would you measure it?

In my travels I’ve been in various discussions of late about the value of graphics: are they important? The short answer is that it depends a lot on the content. The example I use in my university lectures is the beautiful Aki in Final Fantasy: The Spirits Within (2001). The graphics rendering in the first full-length CG film broke new ground in realism, and the characters looked fantastic, even more so given it was 13 years ago. But the graphics couldn’t save the movie because it had no story, or at least no story anyone cared about. The movie industry…

CG and CV are black holes for processing power

Posted by Jon Peddie on September 9th 2014 | Discuss
Tags: peddie flops mip black holes cg dreamworks

Too much, good enough? Nonsense

Some of you reading this may know I’ve postulated a few axioms over the years, all of them about scale in one way or another. One of my favorites is my first: in computer graphics, too much is not enough (1981). It was true then, and it’s true now. It’s also why I get so tired of the question, “But haven’t integrated graphics caught up?” No. There is no catchup. You can’t catch up. You’ll never catch up. A friend of mine more famous than I, Jim Blinn, also has an axiom, Blinn’s Law: as technology advances,…
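A bit of back-of-the-envelope arithmetic shows why there is no catching up: shading cost grows multiplicatively with resolution, samples, and frame rate. All the figures below are hypothetical, chosen only to show the scale:

```python
# Illustrative only: compute demand scales multiplicatively with
# resolution, samples per pixel, shader cost, and frame rate.
def shading_ops(width, height, samples_per_pixel, ops_per_sample, fps):
    return width * height * samples_per_pixel * ops_per_sample * fps

hd  = shading_ops(1920, 1080,  4, 1000, 30)   # a modest workload
uhd = shading_ops(3840, 2160, 16, 5000, 60)   # tomorrow's workload
print(f"HD:  {hd/1e12:.2f} TFLOPS")           # ~0.25 TFLOPS
print(f"UHD: {uhd/1e12:.1f} TFLOPS")          # ~40 TFLOPS
print(f"ratio: {uhd/hd:.0f}x")                # 160x
```

Every time the hardware delivers an order of magnitude, the content side happily absorbs two; that is the black hole in the headline.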

The PC isn’t dead—I told you so

Posted by Jon Peddie on August 27th 2014 | Discuss
Tags:

What’s next?

In terms of economic recovery and overall growth trends, the PC market is a lot healthier than many others (automobiles, for example). However, although the PC market recovered faster than autos, it got gob-smacked by the impact of tablets. Seeing the downturn in sales, the sharpshooters on Wall Street drew a straight line and cleverly predicted the PC would be dead and gone by 2020. Those were the same bright folks who created the derivatives that sent the world economy into a nosedive; definitely the folks we want to be listening to. Fortunately, the world beyond Wall Street wasn’t paying attention…
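The "straight line" method is easy to reproduce. A toy linear extrapolation makes the fallacy explicit; the shipment figures here are invented purely to land on the pundits' 2020 endpoint, not real market data:

```python
# Toy linear extrapolation, Wall Street style. Shipment figures are
# invented to reproduce the pundits' ~2020 "death of the PC" date.
y1, s1 = 2011, 350.0   # year, PC shipments in millions (hypothetical)
y2, s2 = 2013, 270.0

slope = (s2 - s1) / (y2 - y1)    # -40 M units per year
zero_year = y1 - s1 / slope      # where the straight line hits zero
print(f"Naive straight-line 'death of the PC': {zero_year:.0f}")
```

Fit a line through two points on a dip and any market "dies" on schedule; the model has no room for the market flattening out, which is exactly what the PC market did.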

We’ve crossed the line, and there’s no turning back

Posted by Jon Peddie on August 12th 2014 | Discuss
Tags: peddie augmented reality computer graphics ar cg

What’s in a name?

The Holy Grail in computer graphics is the suspension of disbelief: to tell such a convincing story with pixels that the viewer not only totally believes it but thinks he or she is in it, a participant, voluntary or not. We’ve had such experiences in the cinema for a long time. A story is told, and we become so engrossed in it that when there is a shocking moment, like the Alien popping out of an unexpected place or a FedEx airplane falling apart, we duck, scream, or worse. The images stay with us for decades, like the…

Work anywhere any time

Posted by Jon Peddie on July 30th 2014 | Discuss
Tags: nvidia amd intel hp dell ibm professional graphics engineers

Share your GPU, use your colleagues’

We truly have entered the era of ubiquitous computing. It started in the 1960s with time-share computers, expanded in the late ’70s with the commercialization of the ARPANET into the Internet, and developed further in the mid-2000s as the concept of the cloud became universal. The final step was virtualization, and specifically virtualization of the GPU. Virtualization of the GPU has had many fathers. The first commercially available example was 3Dlabs’ virtualization of code space in the GPU in 2004 in the Wildcat VP. Also in 2004, Imagination Technologies enabled a single core to…
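Conceptually, GPU virtualization closes the loop with 1960s time-sharing: many clients multiplexed onto one physical device. The sketch below is purely illustrative; the class and method names are invented and no real vendor API (NVIDIA GRID, AMD, or otherwise) is modeled:

```python
# Illustrative only: a toy hypervisor that time-multiplexes one
# physical GPU among several virtual clients. Names are invented.
import itertools

class VirtualGPU:
    def __init__(self, owner):
        self.owner = owner
        self.pending = []             # queued work items

    def submit(self, job):
        self.pending.append(job)

class GPUHypervisor:
    def __init__(self, vgpus):
        self.vgpus = vgpus

    def run(self, slices):
        # Round-robin across virtual GPUs, one job per time slice,
        # echoing the time-sharing model the column opens with.
        for vgpu in itertools.islice(itertools.cycle(self.vgpus), slices):
            if vgpu.pending:
                job = vgpu.pending.pop(0)
                print(f"slice -> {vgpu.owner}: executing {job}")

alice, bob = VirtualGPU("alice"), VirtualGPU("bob")
alice.submit("render frame 1"); alice.submit("render frame 2")
bob.submit("CAD viewport update")
GPUHypervisor([alice, bob]).run(slices=4)
```

Each remote engineer sees a GPU of their own; the hypervisor sees one board doing everyone's work in turns, which is what makes "work anywhere, any time" economical.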

Learning how to count

Posted by Jon Peddie on July 16th 2014 | Discuss
Tags: apple arm q1 pcs

It would be a lot easier if you weren’t such a skeptic

In mid-June, Intel said that, due to stronger-than-expected demand for business PCs, it expected second-quarter revenue of $13.7 billion, plus or minus $300 million, compared with its previous range of $13.0 billion, plus or minus $500 million. Intel now expects some revenue growth for the year, compared with the previous outlook of approximately flat, driven mostly by strong demand for business PCs. The company will provide additional commentary on all business segments when it reports second-quarter earnings on July 15. That, no doubt, helped Gartner…
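The size of the revision is easier to see as ranges; a quick check of the guidance figures quoted above:

```python
# Quick arithmetic on the guidance figures quoted above (in $B).
old_mid, old_tol = 13.0, 0.5    # previous: $13.0B +/- $500M
new_mid, new_tol = 13.7, 0.3    # revised:  $13.7B +/- $300M

print(f"old range: {old_mid-old_tol:.1f}-{old_mid+old_tol:.1f}")  # 12.5-13.5
print(f"new range: {new_mid-new_tol:.1f}-{new_mid+new_tol:.1f}")  # 13.4-14.0
print(f"midpoint raised by ${new_mid-old_mid:.1f}B "
      f"({(new_mid/old_mid-1)*100:.1f}%)")                        # $0.7B, 5.4%
```

The new floor nearly clears the old ceiling, and the tolerance band tightened; that is not a hedge, it is a statement, which is why the skeptics counting PCs had some explaining to do.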