
From sharing to virtualizing in just under 60 years


Robert Dow

A technology whose time has come has been with us for a long time

With its introduction in the 1960s and its emergence as the prominent model of computing in the 1970s, time-sharing represented a major technological shift in the history of computing: many users able to share one computer. The sharing was done by parceling out the processor's time in small units called slices. Later developments refined that approach with load balancing and priority interrupts, but the basic concept of sharing a single resource was unchanged. (The concept was first described publicly in early 1957 by Bob Bemer in an article in Automatic Control Magazine.)


Time-sharing exposed a well-known, emperor’s-new-clothes kind of secret: even in the 1960s we had more computing power than we knew what to do with, so sharing it was efficient. However, the slow I/O devices were all text-based, whether paper or CRT.

By the 1970s we were getting graphical. Computers could draw pictures, and we wanted that. But it was a zoo: no two systems were the same in terms of operating system, terminal type, or language. To overcome those structural limitations, some bright people at MIT developed the X Window System in 1984. X provides the basic framework for a GUI environment: drawing and moving windows on the display device and interacting with a mouse and keyboard. Because it is an architecture-independent system for remote graphical user interfaces, anyone at a networked terminal can interact with a display running on another machine, using whatever input devices are at hand.
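
To make X's network transparency concrete, here is a minimal Xlib client in C. It is a sketch of my own, not code from the original article: the program only ever speaks the X protocol to whatever server the DISPLAY variable points at, so the same binary can put its window on a local screen or on a remote networked terminal.

```c
/* Minimal X11 client sketch (illustrative only).
 * Build with: cc xdemo.c -o xdemo -lX11 */
#include <X11/Xlib.h>
#include <stdio.h>

int main(void)
{
    /* Connect to the X server named by DISPLAY -- local or remote. */
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }
    int screen = DefaultScreen(dpy);

    /* Create and map a simple 320x240 window. */
    Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, screen),
                                     10, 10, 320, 240, 1,
                                     BlackPixel(dpy, screen),
                                     WhitePixel(dpy, screen));
    XSelectInput(dpy, win, ExposureMask | KeyPressMask);
    XMapWindow(dpy, win);

    /* Event loop: redraw on expose, quit on the first key press. */
    XEvent ev;
    for (;;) {
        XNextEvent(dpy, &ev);
        if (ev.type == Expose)
            XDrawString(dpy, win, DefaultGC(dpy, screen),
                        20, 30, "Hello from X", 12);
        if (ev.type == KeyPress)
            break;
    }

    XCloseDisplay(dpy);
    return 0;
}
```

Run it with DISPLAY pointing at another machine (for example, DISPLAY=workstation:0 ./xdemo) and the window, mouse, and keyboard all travel over the network, which is exactly the remote-GUI idea described above.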

Apple and Windows took over the market as we moved to personal computers, and the idea of sharing resources, along with the need for it, was largely forgotten, except in the supercomputer world, where resources were still precious.

But the need to collaborate and share information never went away, and the use of graphics only intensified. However, it was extremely difficult to share a highly rendered 3D model with someone else in real time. HP offered its Remote Graphics Software (RGS) capability in the early 2000s, and although it worked fine, the bandwidth just wasn’t there to make it viable for a large number of users. However, HP never gave up, and today its RGS is a viable contender.

While the industry struggled with limited bandwidth, we developed an alternative solution for getting an image from point A to point B: simply send the coordinate data and reconstruct the image at the client. But then a funny thing happened. The models kept getting bigger (lots and lots of coordinates and materials information), while at the same time bandwidth improved. The net result was that it became more economical to ship an image than to ship the model.
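
The economics are easier to see with a rough back-of-envelope comparison. The numbers in this sketch are illustrative assumptions of mine, not figures from the article: a large model of roughly 500 million triangles versus a compressed 1080p image stream.

```c
/* Back-of-envelope comparison: ship the model once, or stream rendered
 * frames?  All constants below are illustrative assumptions. */
#include <stdio.h>

int main(void)
{
    /* Assumed model: ~500 million triangles at ~36 bytes of geometry each,
     * before materials and textures are even counted. */
    double model_bytes = 500e6 * 36.0;

    /* Assumed stream: 1920x1080 frames compressed to ~0.1 bit per pixel,
     * at 30 frames per second, for a one-hour working session. */
    double frame_bytes   = 1920.0 * 1080.0 * 0.1 / 8.0;
    double stream_rate   = frame_bytes * 30.0;          /* bytes per second */
    double session_bytes = stream_rate * 3600.0;

    printf("Ship the model once : %5.1f GB (and render it locally)\n",
           model_bytes / 1e9);
    printf("Stream for one hour : %5.1f GB at about %.1f Mbit/s\n",
           session_bytes / 1e9, stream_rate * 8.0 / 1e6);
    return 0;
}
```

With those assumptions, the rendered stream costs a few gigabytes per hour at roughly 6 Mbit/s, while the model itself is an 18 GB download that still has to be rendered on the client: the crossover the paragraph above describes.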

That was the tipping point for remote graphics on the server and the virtualization of the GPU.

Remote graphics implemented via VDI and a virtualized GPU.

With virtualization of the GPU, the user now has the choice of one GPU per user, multiple GPUs per user, or multiple users per GPU, and that capacity can be allocated on an as-needed, per-workload basis. What’s more, VDI, the virtual desktop infrastructure, allows the server to send its results to almost any device that can get on the Internet, making it possible to work anywhere, anytime, on any machine.
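
As a way of picturing the allocation choices just described, here is a toy slot-based allocator in C. It is purely a sketch under my own assumptions (quarter-GPU "slots", greedy placement) and not how any particular hypervisor or vGPU product actually works.

```c
/* Toy GPU allocator illustrating the three sharing modes: one GPU per
 * user, multiple GPUs per user, multiple users per GPU.  The slot
 * granularity and greedy placement are illustrative assumptions. */
#include <stdio.h>

#define NUM_GPUS      4
#define SLOTS_PER_GPU 4            /* 4 slots = a whole GPU */

static int free_slots[NUM_GPUS] = {4, 4, 4, 4};

/* Greedily grab `slots` slots for `user`, spanning GPUs if needed.
 * Returns 0 on success, -1 if the pool is out of capacity. */
static int allocate(const char *user, int slots)
{
    for (int g = 0; g < NUM_GPUS && slots > 0; g++) {
        int take = free_slots[g] < slots ? free_slots[g] : slots;
        if (take > 0) {
            free_slots[g] -= take;
            slots -= take;
            printf("%-10s gets %d/%d of GPU %d\n",
                   user, take, SLOTS_PER_GPU, g);
        }
    }
    return slots == 0 ? 0 : -1;
}

int main(void)
{
    allocate("designer",  4);  /* one user, one whole GPU          */
    allocate("analyst",   1);  /* two light users share the next   */
    allocate("reviewer",  1);  /*   GPU between them               */
    allocate("renderjob", 8);  /* one heavy job spans several GPUs */
    return 0;
}
```

In a real deployment this kind of bookkeeping lives in the hypervisor, which can hand slots back and reassign them as workloads come and go; that is the as-needed allocation referred to above.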

Not only can users work anywhere, anytime, they can also give other people access to their machines and their data. Virtualization, as implemented via VDI, now makes all machines and all databases ubiquitous, universal, and utilizable. The load-balancing opportunities are astounding: all those cores we’ve been buying and letting sit idle can now be harnessed, each according to its ability and each according to the user’s need. This is the new wave of computing, and of computing democratization.