This week’s issue of Tech Watch looks at the inflection point we are entering with the end of the IGP and the introduction of the IPG, or Integrated Processor Graphics. One important question is how the IPG will affect the discrete GPU. We also take a look at the latest, greatest GPUs from ATI and Nvidia, and we review some massive multi-monitor systems. All of these developments are pushing the edges of conventional computer componentry and our concepts of what is normal. It all adds up to what’s called an inflection point or disruptive technology by pundits who prefer popular platitudes to regular English.
These technological developments present consumers, scientists, enterprises, and suppliers with some interesting challenges to their current behaviors. Most of the new developments discussed in this issue were built because they could be, not because there was a serious unmet demand for such technology. In some cases it’s part of the normal evolution of technology, in others it’s an attempt at differentiation, and in the case of error-correcting memory in the GPU, an actual requirement to enter the supercomputer club.
In the case of the dying IGP, there are many who will not miss the durable, often maligned little processor. It presented its own disruption to the way things used to be. It also brought tremendous value and adequate satisfaction to hundreds of millions of users, and it forced the techno-priests to consider the concept of good enough, and the rate of change of hardware technology versus the adoption of that technology by software applications and consumers.
As for the IPG, it was, as proponents of Moore’s law profess, inevitable. Integration of adjacent functions into the most important device is not only to be expected but planned for. It is simply a race to see who can get there first and write some, if not all, of the rules for the followers. That’s called competitive differentiation, and so far no one has really claimed that flag of honor.
However, AMD is expected to capture the flag first when it formally introduces its Fusion architecture, embodied in its first instantiation as the Llano processor. If it is as powerful an integrated heterogeneous processor as expected, it will encroach further on the discrete GPU’s territory, just as the IGP has. The difference this time will be the level of performance and the range of applications. The irony of it is that AMD is doing exactly what is taught in MBA courses: be your own worst competitor.
Big viz changes the way we work
With the seemingly unstoppable march of technology’s evolution, particularly on the hardware side, we see amazingly inexpensive, massive display systems being introduced. And although there is a small cadre of large-scale visualization users who need and encourage such developments, they do not represent a large enough market opportunity to justify the development costs for these huge display surfaces. This is truly a case of doing it because you can, and hoping that if you build it, they will come. I think the developers are going to be pleasantly surprised.
Further support for that forecast comes from Samsung, which has invested heavily in developing monitors that can be tiled with the absolute minimum of interference from the monitors’ construction. Other approaches, using curved-surface displays, will challenge the common wisdom on the delivery techniques of stereoscopic imagery: perhaps we don’t need alternate views of the parallax; maybe high resolution and wide peripheral coverage will be like the old IGP and be good enough. If it is (and IMAX may have already proven that case, even before the widespread use of stereo 3D), then it becomes a universally attractive solution that just needs to be pushed down the economy-of-scale curve to make it affordable.
In short, large-scale images that more than fill a person’s field of view provide a sense of being surrounded and of depth. A six-monitor, single-board display system of edgeless monitors is a good start, and two boards driving twelve monitors could be the ultimate manifestation of such an implementation: no glasses required.
All of these developments are pushing the edge of our heretofore-comfortable computer commune. They will challenge our thinking about what’s needed and what’s normal. And slowly we will adopt these developments, just as we moved from CGA and monochrome displays to HD displays in our laptops, PCs, and TVs. It won’t happen nearly as fast as the hardware developers want, yet it will happen too fast to suit the software suppliers (who would like the entire industry to be like a game console: stable and unchanging for five to ten years so they can maximize their ROI), and it will be confusing for most consumers.
The evolution of the computer industry has never been smooth and orderly, and let us hope it never will be, because that’s when it will stagnate. So bring on the chaos; it’s good for us to get pushed to the edges and out of our comfort zones.