How and when did the GPU come to be? The answers to those questions, and others pertaining to the graphics processing unit, are revealed in a series of three books by Jon Peddie, president of Jon Peddie Research, who shares his extensive knowledge of the industry and of how this device has become a vital part of today’s world.
What is in a smartphone or smartwatch that is also in a supercomputer, a car, an airplane, a spaceship, a robot, a TV, every game machine, and almost any screen you’ve ever seen? A GPU: a graphics processing unit.
Jon Peddie has just written a three-volume series of books, published by Springer, that traces the history, the ecosystem and environment, and new developments of the GPU.
These are the first and only books to cover the entire history of the GPU, from the origins of its concepts in the 1950s to the supercomputers and smartphones it powers today.
The books show how companies started, failed, merged, and were acquired; how IP shifted around the world and passed through thousands of hands; and who many of the leading developers and managers were who made it all happen.
Eighty-four companies, from 3dfx to Zhixin and everyone you’ve heard of in between, are discussed, as well as many of the people who founded and ran them.
The GPU has gone from an esoteric semiconductor, of interest mainly to gamers and CAD designers, to a part of everyone’s life. It powers our smartphones, automobiles, virtual and augmented reality, video games, and AI training.
Where did these ubiquitous and amazing devices come from, and when? The concepts started in the 1950s. In the 1970s, very-large-scale integration (VLSI) was developed, giving us the microprocessor and new peripheral support chips to drive displays.
GPUs have two major components: the geometry processor and the rasterizer. The geometry processor traces back to floating-point processors developed in 1981. The rasterizer is where pixels are created, polished, and combined into the beautiful images we see on our screens. Graphics boards were built with separate geometry processor and rasterizer chips until 1999, when Nvidia merged the two into a single VLSI device and called it a GPU. It was a race Nvidia barely won; a half dozen other firms were after the same prize.
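To make that two-stage split concrete, here is a minimal sketch in Python, purely illustrative and not taken from the books or any real GPU API: a geometry stage that transforms vertices with floating-point math, followed by a rasterizer stage that turns the resulting triangle into pixels. All function and parameter names here are hypothetical.

    # Conceptual sketch only: a "geometry" stage (floating-point vertex
    # transform) feeding a "rasterizer" stage (triangle to pixels).
    # Names are illustrative; this is not a real GPU interface.

    def geometry_stage(vertices, scale):
        # Transform model-space (x, y) vertices into screen space.
        return [(x * scale, y * scale) for x, y in vertices]

    def rasterizer_stage(tri, width, height):
        # Fill the triangle into a character "framebuffer" using a
        # signed-area (edge-function) point-in-triangle test.
        def edge(a, b, p):
            return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])
        a, b, c = tri
        frame = []
        for y in range(height):
            row = ""
            for x in range(width):
                p = (x + 0.5, y + 0.5)  # sample at the pixel center
                w0, w1, w2 = edge(b, c, p), edge(c, a, p), edge(a, b, p)
                inside = ((w0 >= 0 and w1 >= 0 and w2 >= 0)
                          or (w0 <= 0 and w1 <= 0 and w2 <= 0))
                row += "#" if inside else "."
            frame.append(row)
        return frame

    # The two stages run back to back, as two chips once did on a board
    # and as one merged device has done since 1999.
    tri = geometry_stage([(0.1, 0.1), (0.9, 0.2), (0.5, 0.9)], scale=24)
    for row in rasterizer_stage(tri, width=24, height=24):
        print(row)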
The race has continued at breakneck speed. The GPU has grown in power, performance, size, and price, and it now powers most of the computational workloads in cloud servers. At the same time, it has shrunk to fit in our smartwatches and smartphones.
Come follow the journey from the Cold War to fantastic cinematic experiences, from driverless cars to spaceships, and from robots to smart home devices and appliances. Learn how and why you can play games on a smartphone, game console, arcade machine, or PC; no math or programming examples are included or required. This is a story about the technology and the people who brought it to us.
Contact [email protected] for more information.
Keywords:
AI
AMD
Card
Chips
Computer graphics
DisplayPort
Gaming
GDDR
GPU
HDMI
History
Intel
Nvidia
PCI
Processor
Semiconductors
SIMD
Simulations
VGA
VLSI
VRAM