


Robert Dow

Where you are on the curve—depends on the product

You’ve all seen the classic technology adoption curve (top right).

It’s fine: easy to understand and easy to draw. But I think it’s wrong when it comes to computers (and I include smartphones in that category for this discussion). I think it’s more like a Poisson distribution (middle right).
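If you want to see the shape I have in mind, here’s a minimal sketch in Python (standard library only) that prints a Poisson probability mass function as a crude bar chart. The rate λ is an illustrative pick, not something fitted to real adoption data; the point is the early peak and the long tail of late adopters.

```python
import math

def poisson_pmf(k: int, lam: float) -> float:
    """Poisson probability mass function: lam**k * e**-lam / k!."""
    return lam ** k * math.exp(-lam) / math.factorial(k)

LAM = 3.0  # illustrative adoption rate; not fitted to any real data

# Print an ASCII bar chart of the PMF: early peak, long right tail.
for k in range(13):  # "years since introduction," loosely
    p = poisson_pmf(k, LAM)
    print(f"year {k:2d}  p={p:.3f}  {'#' * round(p * 200)}")
```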

We will all be on different parts of the curve depending on our need (for a faster computer, a higher-resolution screen, a more powerful graphics AIB, and so on), our desire (same list), our budget, and the availability of the product (which could be a hand-me-down).

It has driven me crazy to see higher-resolution monitors become available so slowly. The panel makers such as LG, AOC, and Sharp emphasized production of HDTV panels, and the PC industry got them as a stepchild. The monitor suppliers are offering a few higher-resolution monitors, but they are disproportionately priced on a PPI or screen-size basis; the large-volume production is mostly in HD. And it’s that volume that has driven down the price, increased the availability, and convinced consumers that what’s available is good enough. It is not.
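For reference, PPI is just the diagonal pixel count divided by the diagonal size in inches. Here’s a quick sketch of that arithmetic; the screen sizes are typical, and the price tags are made up purely to illustrate a price-per-PPI comparison, not market data.

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Hypothetical monitors for illustration only.
hd_ppi  = ppi(1920, 1080, 24.0)   # a typical 24-inch HD monitor
uhd_ppi = ppi(3840, 2160, 27.0)   # a 27-inch UHD (4K) monitor

print(f"24-in HD : {hd_ppi:6.1f} PPI")
print(f"27-in 4K : {uhd_ppi:6.1f} PPI")

# Price per PPI, using made-up price tags of $150 and $600:
print(f"HD : ${150 / hd_ppi:.2f} per PPI")
print(f"4K : ${600 / uhd_ppi:.2f} per PPI")
```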

The move to UHD (4K) will change that over time, but the panel makers don’t get any demand from consumers. The average consumer would be just fine using an XGA 1024 x 768 PC monitor. (Not only that, we’ve seen users with higher-resolution monitors running them at 1024 x 768.)

And even though we and others have proven time and again that productivity improves with higher-resolution screens, corporations allegedly interested in increasing productivity and thereby reducing costs can’t get past the investment part: you have to buy newer, more expensive monitors.

As some of you know, we’ve been living in 4K land here at JPR for the past few months. And as all of you know, I am an extremist when it comes to pixel irradiation: you can’t have too many. My current setup is a 1200 x 1920 portrait monitor on the left, a 4K in the center, and a 2560 x 1600 monitor on the right, for a grand and glorious glow of almost 14.7 megapixels. As soon as I can, I’m going to replace the right-hand screen with another 4K and move the 2560 to the left. I am getting a little tight on desk space, and boy, is it bright in here.
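For those keeping score, the arithmetic on that glow, assuming the center panel is a 3840 x 2160 UHD screen:

```python
# Sum the pixels in the three-monitor setup described above.
# Assumes the 4K panel is 3840 x 2160 (UHD); a DCI 4K panel would add more.
monitors = {
    "portrait (left)": 1200 * 1920,
    "4K (center)":     3840 * 2160,
    "2560 (right)":    2560 * 1600,
}
total = sum(monitors.values())
for name, pixels in monitors.items():
    print(f"{name:16s} {pixels / 1e6:5.2f} MP")
print(f"{'total':16s} {total / 1e6:5.2f} MP")  # about 14.7 megapixels
```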

So I, and many of my colleagues in JPR land, are on the bleeding edge, but that’s our job. What about the other folks, the real people? Monitors just aren’t that expensive, and they last forever, so an investment in one is probably spread over five years at the least.

And yet, we have a condition in the industry that looks like the chart at the bottom (monitor population and introduction dates).

This is not hard data, just the best estimate I could generate by surveying various websites. But I’m comfortable with it; at worst it’s off by 10%.

The slow uptake of higher screen resolutions, or of multiple monitors for that matter, is not due to the graphics engines in PCs. We just ran two 4K monitors on an Ivy Bridge system with HD 4600 graphics, and it was fine. We even benchmarked games on it.
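A back-of-the-envelope check on why that’s no strain: here’s the raw pixel-rate arithmetic, assuming 60 Hz and 24 bits per pixel and ignoring blanking overhead.

```python
# Rough per-display bandwidth for a 4K (3840 x 2160) monitor.
# Assumes 60 Hz refresh and 24 bits per pixel; ignores blanking overhead.
width, height = 3840, 2160
refresh_hz = 60
bits_per_pixel = 24

per_display_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"one 4K display : {per_display_gbps:5.2f} Gbit/s")
print(f"two 4K displays: {2 * per_display_gbps:5.2f} Gbit/s")
```

At about 12 Gbit/s each, a single 4K stream fits within a DisplayPort 1.2 link (roughly 17.3 Gbit/s of payload), so driving a desktop on two monitors over two outputs is routine; it’s sustained 3D rendering at that resolution that actually taxes a GPU.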

Discrete GPUs have been able to support ultra-high resolutions and multiple displays since 2006 (at the latest), and who has a GPU that is eight years old? I doubt anyone reading this does.

So where are you on this curve? (I’d truly like to hear from you.)

And where are you on the curve for your laptop, workstation, phone, tablet, game console, and so on: early or late, big or little, high or low?