
Is Cloud-Based CAD Ready for Prime Time? Part 2

8 Jul, 2014 By: Alex Herrera

Herrera on Hardware: Centralized, server-side graphical computing technology is on the upswing — but will it stay that way?


Where should the concentration of power for high-performance computing reside? The first part of this series retraced the basic history of client/server computing, in which the prevailing wisdom at times pushed that concentration toward servers in the datacenter and at others swung it toward desktop workstations. Today, IT strategies are in flux again, with many headed in a very clear direction: toward server-based computing (SBC) models that pack not only computation and data on the server but — for the first time — high-performance graphics processing as well.

Manifested in a range of IT solutions tagged with hot buzzwords — such as virtual desktop infrastructure (VDI), desktops as a service (DaaS), hosted virtual desktops (HVD), and, of course, cloud computing — SBC approaches the promised IT nirvana: anytime, anywhere access enabling across-the-globe collaboration, along with the ultimate in security and ease of management. By keeping models and databases resident in a central resource, IT managers are far better equipped to manage everything in this age of "big data." And most relevant to those relying on visually intensive CAD applications, this latest thrust in SBC is promising one more feature: interactive, high-performance 3D graphics.

Past Obstacles to SBC Viability

Despite an appealing list of advantages over traditional client-side solutions, including workstations, SBC approaches today are virtually nonexistent in CAD and other professional computing spaces — and for good reason. Previous attempts to centralize data, computation, and rendering have failed to deliver the workstation-caliber experience a CAD user demands. Rather, past SBC solutions have required an unworkable compromise: Secure the benefits of centralized computing or enjoy high-performance graphics computing, but not both.

While attractive in theory, delivering high-performance interactive graphics from a single, centralized computing resource to a remote client turns out to be anything but easy. Historically, two fundamental problems have stood in the way: excessive latency and a bandwidth paradigm particularly unsuited to remote visualization.

High-performance computer graphics processing is a notorious bandwidth hog when it comes to both rendering and displaying the 3D image — and it's that attribute that has made SBC solutions a nonstarter for graphics-intensive computing. With a conventional client-side model, only model data is transmitted from server to client (and perhaps infrequently, as when a designer "checks out" a model from a project database). All pixel data stays local on the client. In an SBC model, by contrast, the server performs rendering and transmits only the resulting pixel images to the client. Model data never leaves the server.


With client-side rendering (left), model data is transmitted, but pixel data stays local to the client. With server-based computing approaches (right), model data remains on the central server, while pixel data traverses the network.


That fundamental difference between the two models raises an obvious question: Which is the bigger bandwidth burden, a CAD model or the images of that model? Minimizing network traffic has always been a critical goal, and it was especially so during the earlier stages of communications infrastructure build-out in decades past. Back in the 1990s and into the '00s, the answer was crystal clear: it was the pixels, and by a wide margin. Limited by computers' processing, memory, and storage, models were relatively small in detail and size.

The pixel streams, by comparison, were huge — to transmit one 1,280 x 1,024 resolution stream at 30 Hz, for example, you'd need roughly 1.2 Gb (gigabits) per second of raw (uncompressed) sustained bandwidth. Even with the modest-quality video compression feasible at the time, moving such streams from a central resource across local-area and wide-area networks was a tall order, and one that seemed wise to avoid, given the relatively modest size of datasets that could be copied to clients instead. When it came to addressing bandwidth demands, it made a lot more sense for high-demand computing applications such as CAD to stick with a desktop workstation that did it all: computation, rendering, and display.
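As a sanity check on that figure, here's a quick back-of-the-envelope calculation, sketched in Python (the 32-bits-per-pixel framebuffer format is our assumption):

```python
# Back-of-the-envelope check on raw pixel-stream bandwidth.
# Assumes a 32-bit-per-pixel framebuffer (24-bit color padded to 4 bytes).

width, height = 1280, 1024   # display resolution
refresh_hz = 30              # frames per second
bits_per_pixel = 32          # assumed framebuffer format

bits_per_second = width * height * refresh_hz * bits_per_pixel
print(f"Raw stream: {bits_per_second / 1e9:.2f} Gb/s "
      f"({bits_per_second / 8 / 1e6:.0f} MB/s)")
# -> Raw stream: 1.26 Gb/s (157 MB/s)
```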

That overwhelming demand on bandwidth made server-side rendering an unattractive technical proposition, but the bigger roadblock to past acceptance may have been excessive latency. Have you ever experienced a videoconference where excessive lag time between speaking and being heard gets everyone talking on top of each other? Now, imagine the same annoying lag between the time you move your mouse to change the view of a CAD model and the moment that the model actually rotates on the screen.

Interactivity demands a snappy machine response. Once the "round-trip" latency — the interval between user input and visual response — climbs over 150 milliseconds (ms) or so, interactivity becomes problematic, and much beyond 200 ms, the system becomes downright unusable. Combine the multiple sources of latency that the elements of an SBC approach introduce — high processing demand, slower hardware, and immature networks — and getting that round-trip latency down to a reasonable range has historically been very difficult or very expensive.
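To make that budget concrete, the sketch below tallies hypothetical per-stage delays for a single remote-rendered frame against those thresholds. Every stage name and millisecond value is an illustrative assumption, not a measurement of any real system:

```python
# Illustrative round-trip latency budget for one remote-rendered frame.
# Every stage and millisecond value below is a hypothetical example,
# not a measurement.

stages_ms = {
    "input capture + uplink": 20,
    "server render": 15,
    "video encode": 10,
    "network downlink": 40,
    "client decode": 10,
    "display refresh": 16,
}

total_ms = sum(stages_ms.values())
print(f"Round-trip latency: {total_ms} ms")   # 111 ms in this example

if total_ms <= 150:
    print("Within the ~150 ms interactivity threshold")
elif total_ms <= 200:
    print("Problematic, but arguably usable")
else:
    print("Downright unusable")
```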



Workstation-Caliber SBC May Be Ready for Prime Time

So it's no wonder that SBC technology never took hold in CAD in decades past, despite its theoretical benefits. But it's not 1994 or 2004 anymore. It's 2014, and a cast of the biggest names in client and server silicon, software, and hardware is pitching SBC once again, this time promising to deliver a workstation-caliber graphical computing experience. What's changed to allow this recycled approach to work now where it couldn't in the past? Quite a bit, especially with respect to those two key stumbling blocks of bandwidth and latency.

First, the volume of data in CAD models and related professional datasets has exploded over the past decade. In 1994, a complex 3D CAD model might have measured in the hundreds of kilobytes; today, one representing a car or airplane could be a gigabyte or more, and in other professional spaces, such as oil and gas exploration, datasets can consume hundreds of gigabytes. Moving models of that scale around a network — from server to client, or among clients — is no trivial feat anymore. Rather, there's a tremendous computing advantage to be had in not moving or copying that data at all, but instead keeping it in one central repository.

Meanwhile, that other big consumer of bandwidth, the pixel stream, has been on a very different trajectory, growing relatively modestly over the past twenty years — especially when you factor in the dramatic bandwidth efficiencies that advances in video codecs have delivered over the same period. Combine the two trends, and in 2014 the bandwidth burden has flipped: it's no longer necessarily more difficult to move pixels across a network than models, and in many cases it's the far lesser challenge.
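To put rough numbers behind that flip, here's an illustrative comparison; the 100-GB dataset, ten-minute transfer window, and 10-Mb/s compressed-stream bitrate are assumptions chosen purely for illustration:

```python
# Rough comparison of sustained bandwidth needs: copying a large dataset
# to a client vs. streaming compressed pixels from a server.
# Dataset size, transfer window, and stream bitrate are all assumptions.

dataset_gb = 100        # e.g., a large exploration dataset
transfer_minutes = 10   # acceptable wait to copy it to a client
stream_mbps = 10        # H.264-class compressed interactive stream

copy_mbps = dataset_gb * 8e3 / (transfer_minutes * 60)
print(f"Copying the dataset:  {copy_mbps:,.0f} Mb/s sustained")
print(f"Streaming the pixels: {stream_mbps} Mb/s sustained")
# -> Copying the dataset:  1,333 Mb/s sustained
#    Streaming the pixels: 10 Mb/s sustained
```

Even under these assumptions, streaming pixels demands roughly two orders of magnitude less sustained bandwidth than shipping the data itself.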

And what of that other thorn in remote visualization's side, latency? It's by no means an issue to be ignored, but given the dramatic maturation of network capabilities, combined with faster throughput on both clients and servers, delivering a solution with a modest and deterministic degree of latency has become far more manageable.


In 1994 and 2004, it made more sense to transmit models (versus pixels) over capacity-challenged networks, but the script has flipped in 2014. Image courtesy of Jon Peddie Research.


The Window of Opportunity Is Open

So while it made sense ten and twenty years ago to exclusively pursue a distributed computing topology that keeps data, computation, and rendering on clients, that conclusion is no longer the obvious one. Today, it makes more sense than ever to consider a server-based computing alternative instead: Leave the big data in a central datacenter, visualize it on a remote server, and ship only the pixels to a remote client.

However, a shift away from compute-heavy clients won't happen overnight, nor will it happen for all users. It's also not going to happen with the type of servers that compose the bulk of today's datacenters. Rather, it's going to take a new breed of server, one capable of handling the same compute-intensive, visually rich workloads that workstations always have. First and foremost, that means servers will need to adopt components they typically haven't carried before — most notably, a graphics processing unit (GPU) comparable in performance to those that workstations rely on today.

Enter a new generation of silicon, software, and systems designed to do just that, provided by some of the biggest names in modern computing — Nvidia, AMD, Intel, Microsoft, Citrix, VMware, HP, and Dell — with a few up-and-comers in the mix as well. If these vendors have their way, buyers, managers, and users of traditional workstations will find themselves on the cusp of a dramatic transformation in the way they address their computing and visualization needs. In the next part of this series, we'll look at these emerging solutions and how they'll likely fit into a landscape that has long been dominated by the workstation.


About the Author: Alex Herrera




