
Who Is Using Workstations, and Why?

14 Aug, 2019 By: Alex Herrera

Herrera on Hardware: CAD is the primary driver of today’s professional graphics hardware market, but not the only one. See how CAD users’ workstation needs compare to those of software engineering, digital media, geoscience, and other professionals.


In stark contrast to the mainstream consumer and corporate PCs that make up the broader computer markets, workstations continue to thrive. While the former struggles with contracting volume and waning demand, workstation sales continue to climb, outpacing not only PCs overall but GDP by a significant margin.

The broader PC market’s troubles stem from headwinds conspiring to depress demand for new machines. Simpler alternative devices, such as smartphones and tablets, have captured the minds of more casual consumers who find them adequate for minimal-demand applications like e-mail and web browsing. But more than that, what’s softening PC demand is the simple fact that many mainstream users can now go longer between machine replacements than ever before — much longer. If your four-year-old machine is doing the job just fine, you can forget the expense and hassle of buying and setting up a new machine every 18 months.

For workstations it’s a whole different story, one that benefits both the suppliers and users. The majority of professionals in manufacturing, design, architecture, and construction need the application-tuned performance and reliability of a workstation, and their insatiable demand for computation and visualization continues to justify a more frequent upgrade cycle (two to three years, on average). Unlike the mainstream market, the workstation’s traditional professional customer base continues to realize a positive return on investment (ROI) when upgrading to a newer model that completes current modeling, simulation, and rendering tasks more quickly and supports the latest features built on emerging technologies such as ray tracing and machine learning.

But while other professionals share the benefits that come with today’s wide breadth in workstation offerings, CAD users tend to reap the most rewards — and for a simple reason. While workstations serve all kinds of computing spaces, including digital media and entertainment (DME), finance, geoscience, and medical applications, CAD accounts for significantly more workstation sales than any other segment, at nearly half of the total. And given that relative dominance, vendors tend to spend more time and dollars optimizing products for CAD than for any other application. Still, it’s worth understanding who else out there is buying these things, why, and how different applications, priorities, and workflows all contribute to the evolving DNA of the modern workstation.


Workstations serve a range of professional client computing markets, but none bigger than CAD. Data source: estimates by Jon Peddie Research.


Digital Media and Entertainment Can Demand More Than CAD

When it comes to professional graphics applications, digital media and entertainment (DME) might come in behind CAD in volume (at around 15–20%), but the category is certainly first in mindshare. Responsible for everything from Hollywood computer-generated imagery (CGI) to the hottest 3D games, DME artists and animators look to workstations to provide the optimal balance of price, reliability, and most importantly, performance and compatibility with their mission-critical applications like 3ds Max, Maya, Houdini, Blender, and LightWave 3D.

Generally speaking, while DME pros rely heavily on much of the same hardware — fast, multi-cored CPUs and high-performance graphics processing units (GPUs) for modeling and 3D visualization — their content and workflows can put more stress on system components than typical CAD users might. Producing studio-quality CGI, for example, typically entails working with ultra-fine-grained geometry, meaning there are millions of polygons per scene to process instead of thousands. While CAD users will often appreciate photo-quality renderings, for DME types they’re more often a must-have, so all that finer-grained geometry gets lit and shaded with more esoteric, physically accurate models. And since artists go to great lengths to dial up the geometry detail, they’re not going to tolerate lackluster textures ruining the effect. With multiple channels of ultra-high-resolution textures, it can take hundreds of gigabytes of texture data — far more than typical manufacturing or architectural models would ever require — just to bring a single character to life.
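To make that arithmetic concrete, the short Python sketch below tallies the uncompressed texture storage for a single hero character. Every figure in it (the tile count, 8K resolution, channel list, and bit depth) is an illustrative assumption chosen for the exercise, not a number drawn from any particular production.

    # Back-of-the-envelope estimate of uncompressed texture storage for a single
    # hero character. All figures are illustrative assumptions.
    tile_resolution = 8192          # 8K x 8K texels per UDIM tile (assumed)
    udim_tiles = 40                 # tiles covering body, face, and clothing (assumed)
    channels = ["albedo", "specular", "roughness", "normal", "displacement", "subsurface"]
    components = 4                  # RGBA components stored per map (assumed)
    bytes_per_component = 2         # 16-bit precision per component (assumed)

    bytes_per_map = tile_resolution ** 2 * components * bytes_per_component
    total_bytes = bytes_per_map * udim_tiles * len(channels)

    print(f"Per map, per tile: {bytes_per_map / 2**30:.2f} GB")    # ~0.5 GB
    print(f"One character, total: {total_bytes / 2**30:.0f} GB")   # ~120 GB

Even with conservative choices, the total lands in the hundreds-of-gigabytes range once every asset in a scene is counted.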

Even rendering that’s not intended to be photorealistic can differ between DME and CAD applications. Where an architect may be looking for sketch-style rendering to use in early conceptual visualizations, an artist or animator might employ 2D toon-style graphics. Fortunately, while the shading algorithms will differ, a workstation-caliber graphics driver from a vendor like NVIDIA or AMD will optimize for both.

The overlap between DME and CAD doesn’t end with rendering of the scene, because just as lighting and shading techniques have grown more complex — and more compute-intensive — so too have the methods used to create physically accurate animation. Characters exhibit true-to-life motion, water flows and smoke rises naturally, hair and grass bend with the wind, and collisions and explosions look realistic. In that sense, DME animation tasks exploit much of the same workstation functionality as the computational fluid dynamics simulations an engineer would run to analyze turbulence coming off an airplane wing.


Particle and fluid simulation represent overlap between DME and CAD visual computing. Image source: NVIDIA.


Software Engineering’s Demands Are Less Visual

Software developers, who make up about one-tenth of all workstation buyers, share some of the same motivations as CAD users — but with one significant difference. While some buyers might place special value on the workstation’s prowess in 3D graphics and rendering, others have no visual computing demands at all. Rather, their goal is to secure other hallmarks of workstations, like scalability and reliability. A workstation purchased to develop database software, for example, might be configured to the hilt with CPU cores, memory, and storage, but end up paired with a relatively wimpy GPU — unlikely to compare to a machine dialed in for the typical CAD workflow, but perfectly equipped for the software engineer’s needs.

Finance Calls for More Hard-Core Computation Than Meets the Eye

Finance is not the first space that comes to mind when thinking of a showcase for high-performance professional graphics hardware. That’s not to say financial markets don’t have unique and mission-critical graphics requirements — they do, in the form of large displays, low power consumption, and small form factors. But while reliability, power efficiency, and a high-resolution, multi-monitor setup in a tiny package are valuable and valid reasons to choose professional graphics hardware, as in software development, most financial applications haven’t traditionally had a compelling need for a higher-performance GPU.

Even with limited visualization demands, workstations are a compelling buy for finance users tackling tasks whose performance and precision can have a dramatic effect on the bottom line of both a financial firm and its clients; examples include calculating pricing and risk models for options and other derivatives.

Because these non-trivial tasks have huge potential implications for profit and loss, competing firms are pushed to concoct ever-more-advanced formulas with the aim of gaining an edge on the trading floor. As a result, the models they employ are complex, math-intensive, and exhaustive, incorporating a range of methods including Monte Carlo simulations, stochastic modeling, and finite element analysis (similar in approach to the FEA employed in CAD applications). Each can require billions or trillions of compute cycles, multiplied by as many iterations as necessary to converge on a solution, or simply run for as long as time and available CPUs allow. And perhaps more so than in any other industry, getting the answer even fractions of a second earlier can make a big difference to the bottom line. So not only are a multitude of CPU cores, and even a second CPU socket, valued, but increasingly a high-performance GPU is as well: less for visual processing and more for GPU computation of algorithms that lend themselves to massively parallel execution.
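To illustrate why these workloads map so naturally onto many cores, here is a minimal Monte Carlo pricer for a European call option under geometric Brownian motion, sketched in Python with NumPy. The model and parameters are textbook assumptions rather than any firm’s production formula; the point is that every simulated path is independent, so the work parallelizes trivially across CPU cores or a GPU.

    # Minimal Monte Carlo pricer for a European call option under geometric
    # Brownian motion. Illustrative sketch only, not a production model.
    import numpy as np

    def mc_european_call(spot, strike, rate, vol, maturity, n_paths=1_000_000, seed=42):
        rng = np.random.default_rng(seed)
        z = rng.standard_normal(n_paths)                  # one random draw per simulated path
        drift = (rate - 0.5 * vol ** 2) * maturity
        terminal = spot * np.exp(drift + vol * np.sqrt(maturity) * z)
        payoff = np.maximum(terminal - strike, 0.0)       # call payoff at expiry
        return np.exp(-rate * maturity) * payoff.mean()   # discount the average back to today

    # Example: spot 100, strike 105, 2% risk-free rate, 20% volatility, 1 year to expiry
    print(mc_european_call(100.0, 105.0, 0.02, 0.20, 1.0))

Real desks layer path-dependent payoffs, correlated assets, and far more paths onto this skeleton, which is exactly where the extra CPU cores and GPU compute cited above earn their keep.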

Moreover, all that low-latency computation has to be performed reliably and with unquestioned precision. Bit errors are rare, but they can and do happen. Mess up one pixel, one scanline of a triangle, or even an entire frame in a game, and it’s quickly forgotten, if detected at all. But other professional applications don’t share the same tolerance for error. The impact of bad data in medical applications, for example, goes without saying. And while it’s not life or death in finance, the impact of bit errors can be quite severe: a permanent error propagating its way through a real-time pricing model could devastate a financial institution or trading exchange. No surprise, then, that error-correcting code (ECC) memory, supported only by workstation-class CPUs and GPUs, is a natural and valuable choice for mission-critical financial applications, as well as critical medical, scientific, and government operations.

Medical Imaging Mandates Multimodal Visualization with Unquestioned Integrity

Whether based on ray tracing or rasterization, both visualization approaches boil down to the rendering of 3D surfaces. But not all 3D content is composed of surfaces. In professional spaces such as medical and geoscience, it’s much more important to see what’s hidden beneath the surface: a CT scan in the hospital or a seismic reading of a potential oil field, for example.

Three-dimensional medical imagery isn’t modeled synthetically; it’s reconstructed from scans. Like ray tracing, volume rendering casts rays from the viewpoint into the volume, sampling along the way, with each sample absorbing or reflecting all or part of the light’s intensity and spectrum. And like ray tracing, volume rendering is extremely compute-intensive, stressing CPUs, GPUs, memory, and storage, which leads buyers of 3D imaging systems to choose workstations over conventional PCs. On top of that, multimodal medical records come in a wide range of visual types: 2D imaging with 10-bit precision in both grayscale and color, as well as conventional polygonal 3D and live video. Finally, a workstation-only feature like ECC is likely to be a must-have, given the need for the highest accuracy possible.
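For readers curious about what that ray-by-ray sampling looks like, the following Python sketch marches a single ray through a scalar volume (CT densities, say), maps each sample to a color and opacity through an assumed transfer function, and composites the results front to back. It is illustrative only; production volume renderers add trilinear interpolation, gradient-based shading, and early ray termination, running one such loop per GPU thread.

    # Ray-marching core of a volume renderer: step along a ray, sample the scalar
    # field, convert each sample to color/opacity, and composite front to back.
    import numpy as np

    def march_ray(volume, origin, direction, step=0.5, transfer=None):
        color, alpha = np.zeros(3), 0.0
        pos = np.asarray(origin, dtype=float)
        direction = np.asarray(direction, dtype=float)
        for _ in range(int(max(volume.shape) / step)):
            idx = tuple(np.clip(pos.astype(int), 0, np.array(volume.shape) - 1))
            density = volume[idx]
            sample_rgb, sample_a = transfer(density)      # transfer function: density -> color, opacity
            color += (1.0 - alpha) * sample_a * np.asarray(sample_rgb)
            alpha += (1.0 - alpha) * sample_a             # front-to-back alpha accumulation
            if alpha > 0.99:                              # stop early once the ray is effectively opaque
                break
            pos += step * direction
        return color, alpha

    # Toy example: a 64^3 volume with a dense core and a simple grayscale transfer function
    vol = np.zeros((64, 64, 64))
    vol[24:40, 24:40, 24:40] = 1.0
    rgb, a = march_ray(vol, origin=(0, 32, 32), direction=(1, 0, 0),
                       transfer=lambda d: ((d, d, d), 0.05 * d))
    print(rgb, a)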


A volume visualization of a human thorax assembled from CT scans. Image source: Mikael Häggström.


Geoscience Needs to Navigate Terabytes of Data

The costs of drilling new oil or gas fields are enormous, in both dollars and time, so knowing where to look is job one. Searching for new reserves demands detailed, exhaustive, and compute-heavy signal processing, modeling, and 3D visualization — and that’s just the computation. Factor in the overwhelming amount of data involved, and it’s easy to see why few businesses can match oil and gas exploration when it comes to its demand on IT.

Like 3D medical imaging, oil and gas surveys can produce mountains of volumetric data. Single data sets can push well beyond 1 TB, and that sheer magnitude invariably leads to system bottlenecks due to a lack of sufficient storage, bandwidth, or I/O (or most often, all of the above). Reducing data requirements — through decimation, coarse filtering, or severe constraints on scope — is an accepted practice, born not so much of desire as of sheer practicality, dictated by the limits of the available computing hardware. Striking the right balance between wider scope with less detail and a narrower view with more detail is one of the biggest challenges facing seismic interpreters as they scour haystack after haystack in the hopes of finding the needle: a multimillion- or billion-dollar reserve of natural resources.
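Some quick arithmetic, using survey dimensions assumed purely for illustration, shows how easily a single volume blows past the 1-TB mark and what coarse decimation buys back.

    # Rough size of a single full-resolution seismic survey volume, using
    # assumed (illustrative) dimensions, and the effect of 2x decimation per axis.
    inlines, crosslines, samples = 5000, 5000, 12000   # traces in x and y, samples per trace (assumed)
    bytes_per_sample = 4                               # 32-bit float amplitude

    full_size = inlines * crosslines * samples * bytes_per_sample
    print(f"Full-resolution volume: {full_size / 1e12:.1f} TB")      # ~1.2 TB

    decimated = (inlines // 2) * (crosslines // 2) * (samples // 2) * bytes_per_sample
    print(f"Decimated 2x per axis:  {decimated / 1e9:.0f} GB")       # ~150 GB, at the cost of detail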

Workstations remain the obvious go-to computing tool for geoscience, which demands maximum expansion capability across every hardware axis: CPUs, GPUs, memory, and storage. Complex models must be computed, involving advanced filtering of seismic waveforms and the integration of multiple data types for analysis. And in the specific case of the GPU, seismic analysis is one of those cases where there’s simply no substitute for brute force: massive visual data sets can stay resident (or at least closer to resident) in bigger, faster frame buffers, such as the 16+ GB of video memory attached to workstation-class GPUs from AMD and NVIDIA.


Seismic data visualization and interpretation are critical and fundamental tasks in geoscience computing. Image source: NVIDIA.


CAD Is Dominant, and That’s Good for Suppliers and Users

The core constituents of the workstation’s served markets have changed little over the past couple of decades, with CAD, DME, finance, medical, and geoscience applications representing the lion’s share of users. But new spaces are emerging that benefit from the workstation’s superiority in scalability, visualization, precision, accuracy, reliability, and professional computing performance. Among those, data analytics and machine learning are seeing rapid adoption of workstations for many of the same reasons as the historical application base. The demand for precise and efficient analysis of the exabytes of data acquired daily on the Internet is obvious, as the businesses that can best exploit that data will thrive and the ones that can’t will fade.

Still, in the context of all other existing and emerging workstation-served markets, CAD reigns supreme, in terms of both volume and breadth of support. And that’s a good thing for designers, engineers, architects, and product stylists — the best of both worlds, really. With the overall market driving more than twice the workstation and professional graphics volume that the CAD community could muster on its own, vendors and their products will be that much more competitive in the fight to win business. And with CAD the dominant served market, and by a large margin, CAD users know that their needs will tend to receive top priority from those providers, both today and for the foreseeable future.


About the Author: Alex Herrera
