The Origin and Evolution of the Modern Workstation
30 Jan, 2019 By: Alex Herrera
Herrera on Hardware: Where did today’s workstation come from, and who — besides CAD users — relies on it?
Through the 1980s and most of the '90s, the term workstation defined a very specific — and very different — purpose-built machine. Due to its evolution since, however, the platform now taps into much of the same technology DNA as the PC industry, making it harder to stand out based purely on physical specs. Still, during and despite its physical transformation, one constant has held true: the machine has always been shaped to best support the workloads and applications essential to visual professionals — first and foremost, CAD professionals.
Most people familiar with the marketplace around 1990 could point to a laundry list of generally accepted traits that separated workstations from mass-market PCs. Examples include CPUs that tapped proprietary RISC architectures, instead of the x86 designs that dominated IBM-compatible PCs; Unix operating systems versus the PCs’ DOS and Windows; and graphics hardware that was tailored to high-performance 3D modeling, rather than the 2D GUI accelerators that Windows spurred in PCs.
The machine that encapsulated those traits — based on proprietary RISC CPUs and operating systems, and homegrown 3D graphics accelerators — is what we have, in retrospect, termed the traditional proprietary workstation. Engineered, built, and marketed by workstation pioneers such as Sun, SGI, HP, DEC, and IBM, the traditional proprietary workstation ruled the roost when it came to supplying machines to drive rapidly expanding applications in CAD.
It held that top spot for good reason. Twenty-five years ago, it was ludicrous to think that a PC was a reasonable platform for handling professional applications such as CAD. Sure, buyers searching for a cost or price-to-performance advantage have always looked at alternatives, but there were simply too many dealbreakers: missing features and capabilities that made the job far too slow, unreliable, or, more often, simply impossible. But in the years since, the core silicon and technology supporting the PC market have infiltrated workstations.
The Rise of the PC-Derived Workstation
Over time, PC technology and components — driven by much higher revenue, much shorter product cycles, and many more engineer-hours in development — closed the gap with traditional proprietary UNIX-based workstations. And by the late 1990s, the differences in capability became small enough to overlook, while the advantages in price-to-performance ratios just got too big to ignore. For the bulk of applications, it began to make more sense to buy a workstation based on components either taken off the shelf from the PC world or derived from PC components. And the migration to the modern, PC-derived workstation was on.
Workstations sharing semiconductor DNA with the broader PC and server markets were gradually recognized not only as a valid platform for professional graphics and compute-intensive applications, but as the superior platform. Smaller-scale, in-house development teams at Sun, HP, SGI, and IBM couldn’t compete with the production and pricing that economy of scale enabled, or with the pace of progression of x86 CPUs from Intel and GPUs from NVIDIA and AMD.
The PC-derived model effectively exploits the monstrous investment in a common foundation of architecture and silicon technology. Rather than reinventing the wheel, the model recognized that the wheel worked pretty well already, and that for most purposes all it needed were a few enhancements (in some cases, very minor ones). At the entry level today, a PC-derived workstation may include nearly all the same components as a machine branded for corporate and consumer applications. And while the bulk of mid-range and high-end workstations today may not use components identical to those in the mainstream PC, they do use devices like Intel's Xeon and NVIDIA's Quadro — products that, to varying degrees, leverage the huge amount of technology and investment already in place to serve the mass market.
As the fixed costs for chip and fab development have climbed, the pricing benefit of high-volume semiconductors for PCs began undermining the business case for unilaterally developed, proprietary workstations beginning way back in the late ‘80s. The ‘90s claimed more than a few companies that hung onto their own architectures too long (e.g., DEC), though most had read the writing on the wall and were scurrying to make their platforms pervasive and/or open. By the turn of the century, any vendor making chips and operating systems based on its own architecture was hurting, as the competitive pressure from PCs and PC-derived hardware became overwhelming.
The economy of scale that began turning the tide in the late ‘80s: chip costs per unit volume (assumes $50 million fixed NRE). Image source: realworldtech.com