
How Fast Is It? Assess Your Graphics Hardware

21 May, 2014 By: Alex Herrera

Herrera on Hardware: Although benchmark tests may be imperfect tools, they are a valuable source of performance data.


For many reasons, CAD is an indispensable technology for those in the businesses of design, engineering, architecture, and construction, but its fundamental value boils down to one thing: its capability to visualize in the virtual domain what has yet to exist in the physical one.

So it's not surprising when I hear users and software suppliers alike call out the graphics processing unit (GPU) as the most critical component in their hardware toolbox. The productivity boosts CAD promises can be realized only when your hardware delivers detailed, accurate renderings and snappy, interactive visuals.

The responsibility for delivering on that promise falls primarily in the lap of your workstation's GPU. Therefore, it's especially important that during your next hardware shopping mission, you choose a GPU that will stand up to all the rendering demands that you'll make of it.

But what's the best way to assess and compare the performance of GPU cards and graphics-focused workstations? There is no perfectly accurate method. Rather, it becomes an exercise in triangulating among several data points, while always keeping in mind the context and characteristics of your specific application, workflow, and compute loads.

Don't Expect a Best-Case Scenario

GPU spec sheets contain a dizzying array of numbers, often referencing esoteric metrics that most graphics hardware users don't understand. All of these numbers hint at where a card may fit on the performance curve. Unfortunately, there's a fundamental problem with virtually all of these figures: they are theoretical hardware limits, indicating performance levels that can be reached only under very specific circumstances, most of which are not realistic when running real-world applications.

Any one of these numbers taken alone means little, unless all the other salient architectural metrics can deliver at similarly capable levels. For example, a GPU that promises x gigaFLOPS (billion floating-point operations per second) of computing throughput may, in practice, deliver only a small fraction of x if its input/output, instruction pipeline, and memory subsystems can't feed the internal engines with data quickly enough.
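
To see why those peak numbers can be so optimistic, consider how a throughput spec is typically derived: core count times clock rate times operations per core per cycle. Here's a rough back-of-the-envelope sketch, using entirely hypothetical figures rather than any real product's specs:

    # Back-of-the-envelope peak-FLOPS arithmetic for a hypothetical GPU.
    # Every figure here is an illustrative assumption, not a real spec.
    shader_cores = 1536           # assumed core count
    clock_ghz = 1.0               # assumed clock rate, in GHz
    flops_per_core_per_cycle = 2  # one fused multiply-add counts as 2 FLOPs

    peak_gflops = shader_cores * clock_ghz * flops_per_core_per_cycle
    print(f"Theoretical peak: {peak_gflops:.0f} GFLOPS")   # 3072

    # If I/O, pipeline, or memory stalls hold sustained efficiency to,
    # say, 25% (an assumed, workload-dependent figure):
    sustained = peak_gflops * 0.25
    print(f"Plausible sustained: {sustained:.0f} GFLOPS")  # 768

The multiplication that produces the spec-sheet number is exact; it's the sustained fraction that real workloads determine, and that no spec sheet reports.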

Even more recognizable specs, such as triangles per second or pixels per second, aren't grounds for apples-to-apples comparison. They can hint at a rough performance level, but they are generally too subjective in definition and too specific in usage to be a broad indicator of speed.

A Benchmark Primer

For most CAD users in the market for new hardware, benchmarks represent the most widely accepted and reasonable way to get an idea of the performance a prospective product will deliver when running the applications and workloads that matter most: yours. You must tread carefully though, because benchmarks are just one piece of the performance prediction puzzle and, like marketing specs, are often misused and misinterpreted.


To avoid any possibility of vendor bias, I focus on benchmarks designed and maintained by independent third parties, such as the Standard Performance Evaluation Corporation (SPEC). SPEC is the longtime, independent torchbearer for workstation-caliber benchmarking, governing three sets of benchmarks that apply specifically to highly visual, professional-caliber applications: SPECviewperf, SPECapc, and SPECwpc. Find information, see product results, and download benchmarks at SPEC's website.

SPECviewperf. This benchmark isolates the stress on the graphics card exclusively, rather than the system as a whole. As a result, its scores reflect the capabilities of the GPU only and do not (or at least should not) materially reflect differences in other key system components such as CPU, memory, and storage. It streams predefined viewsets representing the typical visual demands of several widely used CAD applications, including PTC Creo, Dassault Systèmes CATIA and SolidWorks, and Siemens NX.

Because SPECviewperf probes the GPU's rendering limits for each viewset, it does not conclusively indicate what the actual performance of a complete system will be. It may do so, or it may not, depending on where other system bottlenecks appear during real-world processing. For example, if your memory subsystem tends to be the limiting factor when manipulating large models, the fact that your GPU's limits are much higher is essentially a moot point.
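
One way to picture that caveat is as a pipeline whose delivered rate is capped by its slowest stage. The toy model below uses entirely hypothetical stage limits, purely to make the point:

    # Toy bottleneck model: a system delivers frames only as fast as its
    # slowest stage allows. All stage limits below are hypothetical.
    stage_limits_fps = {
        "GPU rendering": 240.0,          # the limit SPECviewperf isolates
        "CPU / driver overhead": 95.0,
        "memory / model traversal": 60.0,
    }

    bottleneck = min(stage_limits_fps, key=stage_limits_fps.get)
    print(f"Delivered: ~{stage_limits_fps[bottleneck]:.0f} fps, "
          f"limited by {bottleneck}")
    # Here a 240-fps GPU limit is moot; memory caps the system at 60 fps.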

SPECapc. SPEC's other two sets of workstation-focused benchmarks extend the scope beyond theoretical GPU limits to how a whole system will perform on application tasks similar to those that compose your workflow. SPECapc tests run the actual application, executing a sequence of fixed tasks that SPEC has deemed typical and indicative of users' real-world workloads for that specific application. As such, the SPECapc benchmarks stress all critical system components — not just the GPU but CPU, memory, and storage as well — providing a better indication of how the whole system should behave when running your application.

Mechanical designers and engineers will most likely want to focus on SPECapc tests for 3ds Max 2011, PTC Creo 2.0, Siemens NX 8.5, or SolidWorks 2013. Be sure to pay attention to the application versions on each test, as SPEC refreshes these benchmarks periodically, although not on any regular schedule.

SPECwpc. Perhaps you don't see a SPECapc benchmark for your main workhorse application? Or maybe your typical workday includes a range of CAD-related tasks? Then you'll want to check out SPEC's newest workstation benchmark, SPECwpc. Whereas the SPECapc benchmarks (and Cadalyst's offering; read more below) each focus on performance in one specific application, SPECwpc delivers a wider perspective, incorporating a test suite for each of six popular workstation verticals: Media and Entertainment, Product Development, Energy, Life Sciences, Financial Services, and General Operations. The new benchmark reports a composite score for each of the six application suites. CAD professionals will be most interested in comparing scores for the Product Development suite.

Cadalyst Benchmark Test. Did you notice a key omission in the list of SPECapc application benchmarks above? There is currently no SPECapc test for AutoCAD, the most popular CAD application in the world. That's where another third party comes in: Cadalyst.

Cadalyst created and maintains the foremost benchmark for AutoCAD, with its current c2012 version supporting AutoCAD v2000–2013. (A new version of the benchmark will be released by summer 2014 and will support AutoCAD through version 2015.)

Like SPECapc, the Cadalyst Benchmark Test runs the full application to create, manipulate, and render predefined models. As such, the test stresses all components — not just the GPU — to indicate a level of performance your system should deliver when running basic AutoCAD. For more information and to download, visit the Cadalyst Benchmark Test page.

Weighing the Numbers

All of these benchmarks pump out a composite score (or scores), aggregating time-to-completion across a large number of tasks. The higher the score, the faster the hardware performed, on average, across all the tasks weighed by the composite score. Buyers with the most demanding performance expectations — and with generous budgets to match — may want to focus on the raw scores. But others, perhaps the majority, will want to weigh the price-to-performance ratio (score per dollar) and power efficiency (score per watt). Some benchmarks report the former, but both can be calculated after the fact.
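
Both derived metrics are simple division, so they're easy to compute yourself from published numbers. Here's a quick sketch; the cards, scores, prices, and power figures are made up purely for illustration:

    # Derive price-performance (score per dollar) and power efficiency
    # (score per watt) from published numbers. All figures are made up.
    cards = [
        # (name, composite score, street price in $, board power in W)
        ("Card A", 54.2, 499.0, 150.0),
        ("Card B", 61.7, 899.0, 225.0),
    ]

    for name, score, price, watts in cards:
        print(f"{name}: {score/price:.3f} score/$, {score/watts:.3f} score/W")

Note that in this made-up example, the pricier card posts the higher raw score, but the cheaper one wins on both ratios; that's exactly the kind of tradeoff a composite score alone won't surface.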

One Piece of the Puzzle

While benchmark results are worthwhile data points, avoid the temptation to base your hardware selection on them alone. In fact, it probably makes sense to neither start nor end your search with benchmarks, because the fact that one card or system comes in slightly ahead of another won't tell the whole story. Of course, in the extreme case where one product simply blows away a similarly priced competitor on the benchmarks most relevant to your workload, you'll want to avoid the laggard. But it's much more likely that the two will deliver similar scores, so you'll need to weigh other criteria, which can be just as important in determining which product will deliver best on your ultimate goal: getting more done in less time.

