Hardware Benchmarks for CAD, Part 2: Uses of SPEC and ISV Benchmarks
8 Jan, 2020 By: Alex Herrera
Herrera on Hardware: Benchmark relevance will vary depending on what application you use and whether you’re focusing on graphics throughput or whole-system performance.
In the first part of this article, we discussed reasons why benchmarks remain the best performance evaluation tools, and explored some of their limitations. In this part, we’ll explain more about benchmark offerings from SPEC and their best uses, and we’ll touch on those from independent software vendors (ISVs) as well.
Assessing 3D Graphics Performance and GPUs
The SPECviewperf benchmark from the Standard Performance Evaluation Corporation (SPEC) works by running through a series of canned 3D viewsets selected to be reasonably representative of the content and usage of designers and engineers running popular workstation applications. Several viewsets were extracted from digital media and entertainment applications, and a couple come from geoscience and medicine. But CAD gets the most prominent billing in SPECviewperf, as the most recent version (13) steps through drawing, zooming, panning, and rotating canned viewsets from SOLIDWORKS, Creo, Siemens NX, and CATIA. The content is representative of those applications’ usage, but it’s characteristic of other CAD packages as well, as a glance at the viewsets illustrates.
SPECviewperf 13 viewsets that were pulled from CATIA, Siemens NX, and SOLIDWORKS — but are certainly typical of project data from other applications as well.
The SPECviewperf benchmark intentionally focuses the stress on the graphics processing unit (GPU), rather than the system as a whole. Other system resources aren’t idle, of course, while running Viewperf; for example, the operating system (OS) and some Viewperf application overhead will consume CPU cycles. But for the most part, it’s the GPU hardware and driver that take the brunt of the computing stress. In some cases, the central processing unit (CPU) might be the bottleneck; in others, it might be input/output (I/O). In either case, you may end up realizing that the GPU you were so happy with running Viewperf is either a non-issue or overkill, depending on how the rest of the system performs. As such, Viewperf shines when the goal is to compare GPUs rather than complete workstations.
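If you want a quick sanity check on where your own workflow bottlenecks, you can watch utilization while a real session runs. Below is a minimal sketch, assuming Python with the third-party psutil package and, for GPU readings, NVIDIA’s nvidia-smi tool; the sampling window and interval are arbitrary illustrations, not part of any SPEC benchmark.

```python
# Rough bottleneck check: sample per-core CPU load and, on NVIDIA systems,
# GPU utilization while your real application workload runs.
# Assumes the third-party psutil package (pip install psutil) and, optionally,
# the nvidia-smi command-line tool; both are assumptions, not SPEC components.
import subprocess
import psutil

def sample(seconds=30, interval=1.0):
    for _ in range(int(seconds / interval)):
        cores = psutil.cpu_percent(interval=interval, percpu=True)
        try:
            out = subprocess.check_output(
                ["nvidia-smi", "--query-gpu=utilization.gpu",
                 "--format=csv,noheader,nounits"], text=True)
            gpu = out.strip() + "%"
        except (FileNotFoundError, subprocess.CalledProcessError):
            gpu = "n/a"  # non-NVIDIA GPU, or tool not installed
        # One core pegged near 100% while the GPU sits mostly idle suggests a
        # CPU-bound (often single-threaded) phase; the reverse suggests the
        # GPU is the limiter, and Viewperf comparisons matter more.
        print(f"max core: {max(cores):5.1f}%   gpu: {gpu}")

if __name__ == "__main__":
    sample()
```

Run it in the background while performing a typical modeling or rotation task; a few minutes of samples is usually enough to see which resource saturates first.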
So unless you’re solely focused on choosing a GPU (either to configure in a new machine or upgrade an existing one), you’ll want to get past measures that leave the rest of the system unstressed and look at a whole-system benchmark. That means a benchmark that can best simulate and measure how a complete CAD application with typical workloads will run.
Whole System Performance: Combining Heavy-Duty CAD Computation, I/O, and 3D Graphics
There are many general-purpose, third-party PC benchmarks available, such as PassMark. While they aren’t likely to favor one hardware vendor over another, they also won’t necessarily focus on the types of computing and 3D graphics workloads that CAD usage typically stresses.
SPECapc and ISV-supplied or endorsed benchmarks. Ideally, a user who spends the bulk of the day running one mission-critical application could turn to a benchmark that runs that very application, issuing a batch of the same types of tasks the user regularly performs. That’s precisely the idea behind SPEC’s suite of SPECapc benchmarks, which run popular workstation applications, including 3ds Max, PTC Creo, Siemens NX, and SOLIDWORKS.
What makes the SPECapc benchmarks so powerful is that, unlike other benchmark alternatives, they run the actual application, executing a fixed sequence of tasks that SPEC (or the ISV) has selected as typical and indicative of users’ real-world workloads for that specific application. As such, the SPECapc benchmarks stress all critical system components: not just the GPU, but the CPU, memory, and storage as well, providing a better indication of how the whole system should behave when running your application. The caveat with SPECapc is that the application in question must be installed with a valid license (although trial versions may work). Also, be sure to pay attention to the software versions of both the SPECapc benchmark and the application it runs, as SPEC refreshes these benchmarks periodically, though not on any set schedule.
Similar in use and approach to SPECapc, though neither independent nor broadly available, are performance tests that some ISVs may make available to their users. A popular example is the SOLIDWORKS Performance Test that Dassault Systèmes makes available for SOLIDWORKS users.
SPECwpc: An all-purpose, all-around professional computing benchmark. If your workflow depends predominantly on one or two applications represented in a SPECapc benchmark, SPECapc (or the similar ISV-authored test) should be your top choice in assessing hardware to drive that workflow. But what if you don’t see a SPECapc benchmark for your main workhorse application(s)? Or what if your typical workday includes a range of CAD-related tasks? In these cases, you’ll probably want to check out SPEC's most recently developed workstation benchmarking tool: SPECwpc.
Like Viewperf, SPECwpc doesn’t run a full application such as SOLIDWORKS, which again is both an advantage and a drawback. The one noteworthy drawback is that no single test stresses the entire system the way a real-world application session would. Rather, SPECwpc comprises individual tests that stress, one by one, the key subsystems CAD workflows rely upon. For example, it includes Viewperf viewsets to evaluate 3D graphics performance, the IOMeter test to assess the bandwidth and latency characteristics of the storage subsystem, and a range of relevant, multi-threaded, computationally intensive algorithms to stress the CPU and memory.
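To make concrete what a storage-focused subsystem test measures, here’s a crude sketch in Python: nothing like IOMeter’s configurable workload mixes, just a timed sequential write and read reporting MB/s. The file name, size, and block size below are arbitrary assumptions for illustration.

```python
# Crude sequential-throughput probe, in the spirit of (but far simpler than)
# a real storage benchmark like IOMeter: time a large sequential write and
# read, and report MB/s. Path, size, and block size are arbitrary choices.
import os
import time

PATH = "io_probe.bin"       # hypothetical scratch file
SIZE_MB = 512
CHUNK = 4 * 1024 * 1024     # 4-MB blocks, a common sequential block size

def throughput():
    block = os.urandom(CHUNK)
    t0 = time.perf_counter()
    with open(PATH, "wb") as f:
        for _ in range(SIZE_MB * 1024 * 1024 // CHUNK):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())  # ensure data actually reaches the device
    write_s = time.perf_counter() - t0

    t0 = time.perf_counter()
    with open(PATH, "rb") as f:
        while f.read(CHUNK):
            pass
    read_s = time.perf_counter() - t0
    os.remove(PATH)
    # Note: the read pass may be served from the OS file cache, inflating
    # the number; real storage benchmarks take pains to defeat caching.
    print(f"write: {SIZE_MB / write_s:.0f} MB/s  read: {SIZE_MB / read_s:.0f} MB/s")

if __name__ == "__main__":
    throughput()
```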
What are the advantages? You don’t need an application license, and your results will reflect a wider range of capabilities, making them relevant to broader computing requirements. Conveniently, SPEC bundles those individual tests into six suites, each representing common usage modes for a popular workstation vertical: Media and Entertainment, Product Development, Energy, Life Sciences, Financial Services, and General Operations. SPECwpc spits out a composite score for each of the six suites; CAD professionals will be most interested in comparing scores for the Product Development suite.
Raw scores may not be the most relevant … what matters most to you? All these benchmarks generate a composite score (or scores), statistically aggregating time-to-completion for a large number of tasks. The bigger the number, the faster the hardware performed, on average, across all the tasks weighed into the composite score. Buyers who place the highest priority on performance, and have generous budgets to match, may want to focus on the raw scores alone. But others, likely the majority, may prefer to focus on price-to-performance (scores per dollar), and some, particularly when it comes to a mobile workstation, on power efficiency (scores per watt). Some, but not all, benchmark publishers report the former, but you can calculate both easily from published prices and power specifications.
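As a concrete illustration of that aggregation and the two derived metrics, here’s a brief Python sketch. SPEC-style composites are typically weighted geometric means of per-test scores; the weights, scores, prices, and wattages below are invented for the example, not real results.

```python
# Sketch of composite-score aggregation plus the two derived metrics
# discussed above. All numbers here are hypothetical.
from math import prod

def composite(scores, weights):
    """Weighted geometric mean: prod(s_i ** w_i), with weights summing to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return prod(s ** w for s, w in zip(scores, weights))

systems = {
    # name: (per-test scores, price in dollars, power in watts) -- made up
    "Workstation A": ([148.0, 92.0, 210.0], 2400, 450),
    "Workstation B": ([120.0, 118.0, 165.0], 1700, 280),
}
weights = [0.5, 0.3, 0.2]  # hypothetical per-test weights

for name, (scores, price, watts) in systems.items():
    c = composite(scores, weights)
    print(f"{name}: composite {c:6.1f}  "
          f"scores/$ {c / price:.3f}  scores/W {c / watts:.3f}")
```

Note how a machine with the lower raw composite can still win on scores per dollar or per watt, which is exactly the trade-off budget- or mobility-constrained buyers weigh.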
Canned Benchmarks Are the Best Practical Option, Despite Limitations
Benchmarks are valuable tools, offering insight into which hardware is best suited to handle your workflow efficiently, thereby helping to optimize your productivity. But some are more appropriate than others, and their relevance will vary depending on what application you use and whether you’re looking specifically at graphics throughput or are more interested in whole-system performance.
Looking to upgrade your GPU? Then it may be best to check out both SPECviewperf results (posted on the SPEC site as well as by independent online publications) and SPECapc results for your application, if available. Bear in mind that Viewperf alone may be misleading: not because it won’t give you an idea of how well the GPU can rasterize your typical content, but because its merits may be moot if the GPU isn’t typically the bottleneck in your workflows. Pairing Viewperf with SPECapc for your application provides a more comprehensive perspective.
Are you looking instead to evaluate a whole machine to power your CAD workflow? If possible, run SPECapc or an ISV-authored test for your application, or check out submitted results for that application or a comparable one. Tap SPECwpc to complement an application-specific benchmark, or treat it as your primary indicator if the latter is unavailable or impractical (e.g., no installed application or license available).
And of course, for most buyers, a single performance measure like a benchmark result is unlikely to be the sole decision criterion; rather, it will serve as one piece of a multifaceted puzzle. Price, form factor, fit, power consumption, and special features (like stereo 3D or ray-tracing acceleration) may be as important as performance metrics, or more so. Above all, remember that benchmarks are inherently imperfect. They have their own objectives and strengths, as well as limitations, but taken in the right context, they remain the best options for ascertaining workstation hardware’s aptitude for modern CAD computing.