Augmented reality showing us stuff that’s not there
Posted by Jon Peddie on January 15th 2013
Soon we won’t know the difference
When augmented reality (AR) was first introduced isn’t clear. (Damn, that’s such a great pun.)
But seriously folks, the idea of AR is pretty old, and can be traced back to L. Frank Baum's 1901 story, The Master Key, in which he introduced the idea of AR with the Character Marker: "It consists of this pair of spectacles. While you wear them every one you meet will be marked upon the forehead with a letter indicating his or her character." Baum, like Asimov, had many advanced ideas and is worthy of a read (or re-read).
Probably your first encounter with AR, the seeing of things that aren't there, was when you saw a weather map on TV (which in the early days was done with a green screen). The other popular AR implementation was (and still is) the virtual first-down line in American football. But implementations date back to 1956, when Morton Heilig created the Sensorama.
I’ve had the pleasure of communicating with Mrs. Heilig while doing research on my upcoming book on The History of 3D in Computers.
And although the term wouldn’t be coined until 1990 (by Tom Caudell, a researcher at Boeing), Ivan Sutherland is largely credited with introducing the concept of AR in 1968. Sutherland and his student Bob Sproull created the first Virtual Reality (VR) and Augmented Reality (AR) head-mounted display system at Harvard University.
By 1992 the concept had worked its way into the mainstream of computer graphics when Steven Feiner, Blair MacIntyre, and Doree Seligmann presented the first major paper on an AR system prototype, KARMA, at the Graphics Interface conference. That was followed in 1993 by a widely cited version of the paper published in Communications of the ACM's special issue on computer augmented environments, edited by Pierre Wellner, Wendy Mackay, and Rich Gold, and soon AR had become part of the Siggraph and CG lexicon.
More recently, last summer IBM unveiled an augmented reality mobile app that lets the user point their phone at store shelves and receive personalized product tips, recommendations, and coupons (http://ibmsmartershopping.tumblr.com/).
Today we can use our phone to translate menus and street signs, do real-time currency conversion, and see historical markers that don’t physically exist. Later this year we will have the opportunity to wear Baum-like glasses from Google and others, and who knows, maybe cloud data will be so good and accessible that we will be able to see a character’s personality, or at least his or her resume.
I have my own AR desires. Everyone has seen beautiful ray-traced pictures of cars (see example). What if you used the front-facing camera to superimpose the things in its view (you, the place you’re in, the lights, etc.) on the highly reflective surface of the car?
I want to lean in toward the screen and see my face reflected in the car. I want to take my tablet outside and see the car in my driveway. And, I want 3D control of the car so I can move it around in my driveway. And no jittering. OK, get to work.
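The effect described above is essentially real-time environment mapping: treat the front-facing camera frame as an environment texture and, for each point on the car's reflective surface, look up the camera pixel along the mirrored view direction. Here is a minimal sketch of the core math in Python, using a hypothetical tiny "camera frame" and a simplified latitude/longitude lookup; a real AR renderer would calibrate the camera's field of view and orientation rather than use this toy mapping.

```python
import numpy as np

def reflect(view_dir, normal):
    """Mirror the view direction about the surface normal: R = V - 2(V.N)N."""
    view_dir = view_dir / np.linalg.norm(view_dir)
    normal = normal / np.linalg.norm(normal)
    return view_dir - 2.0 * np.dot(view_dir, normal) * normal

def sample_camera_env(reflected, frame):
    """Map a unit reflection vector to a pixel in the camera frame.

    Toy latitude/longitude mapping; stands in for a calibrated
    camera-to-environment projection.
    """
    h, w, _ = frame.shape
    x, y, z = reflected
    u = np.arctan2(x, -z) / (2 * np.pi) + 0.5   # horizontal angle -> [0, 1]
    v = np.arccos(np.clip(y, -1.0, 1.0)) / np.pi  # vertical angle -> [0, 1]
    col = min(int(u * w), w - 1)
    row = min(int(v * h), h - 1)
    return frame[row, col]

# Hypothetical 2x2 "camera frame" standing in for the live feed.
frame = np.array([[[255, 0, 0], [0, 255, 0]],
                  [[0, 0, 255], [255, 255, 255]]], dtype=np.uint8)

# Viewer looking straight at a surface whose normal faces them:
# the reflection bounces straight back toward the camera.
r = reflect(np.array([0.0, 0.0, -1.0]), np.array([0.0, 0.0, 1.0]))
color = sample_camera_env(r, frame)
```

Doing this per pixel on the car's reflective surfaces, with the phone or tablet's live camera feed as `frame`, would put the viewer's face and surroundings into the rendered reflection.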