GTC 2014: A Virtual Showcase for Tomorrow's CAD Tools
27 Mar, 2014 By: Alex Herrera
Nvidia’s GPU Technology Conference demonstrates that the computer graphics industry can never have enough applications.
And if bringing simulation and visualization together is the ultimate goal, you can't do any better than the case where the simulation is the visualization. Ray tracing has always represented the ultimate in photorealistic rendering for product styling. Unfortunately, it's also been the most computationally demanding, and therefore the slowest to use. Although ray-tracing algorithms render a scene, they do so in a manner that's very different from the way GPUs typically render raster-based graphics.
Better characterized as a GPGPU application, ray tracing mimics how the rays of light in a scene travel in nature: bumping into objects and light sources, creating shadows, and imparting color to each object. Nvidia's own GPU-accelerated ray tracer, iray, is integrated in several CAD applications, including RTT, Autodesk 3ds Max, and Dassault Systèmes CATIA. Kick off a photorealistic visualization in these bread-and-butter modeling-and-viewing packages, and the Nvidia GPU in your system will automatically accelerate the rendering.
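To make the "mimicking light" idea concrete, here is a minimal ray-casting sketch (emphatically not iray, and with purely illustrative scene values): for each ray fired into the scene, find where it hits a sphere, then shade that point by how directly the surface faces a light source.

```python
# Minimal ray-casting sketch. All scene geometry and names are illustrative
# assumptions, not anything from Nvidia's iray.
import math

def intersect_sphere(origin, direction, center, radius):
    """Return distance t along a unit-length ray to the nearest hit, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4 * c  # quadratic discriminant; a == 1 for a unit direction
    if disc < 0:
        return None          # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def shade(origin, direction, center, radius, light):
    """Brightness (0..1) where the ray hits the sphere; 0 for the background."""
    t = intersect_sphere(origin, direction, center, radius)
    if t is None:
        return 0.0
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - c) / radius for h, c in zip(hit, center)]
    to_light = [l - h for l, h in zip(light, hit)]
    norm = math.sqrt(sum(v * v for v in to_light))
    to_light = [v / norm for v in to_light]
    # Lambertian (diffuse) term: brighter where the surface faces the light.
    return max(0.0, sum(n, ) if False else sum(n * l for n, l in zip(normal, to_light)))
```

A production ray tracer adds shadows, reflections, and many bounces per ray, which is exactly the workload that maps well onto thousands of GPU threads, since each pixel's rays can be traced independently.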
Can you tell which is the photo and which is the iray rendering? I couldn't.
Shared Server-Side GPUs: Extending VDI and Cloud Services to the Professional Ranks
What Nvidia once code-named Monterey Technology was on the GTC agenda as far back as 2010, at that time in the context of technology under development. Two years later, the company first positioned the technology as a product, introducing it as Virtual Graphics Technology, or VGX. Tack on another two years of maturation — and a new name, GRID — and Nvidia’s remote visualization technology is in 2014 poised to disrupt several markets, including CAD.
Virtual desktop technology offers enterprises a range of compelling benefits — benefits already enjoyed by many users and applications with more modest visual demands. With GRID, Nvidia is essentially aiming to deliver those proven benefits of a virtual desktop infrastructure (VDI) to an entirely new class of user: professionals who demand high-performance graphics computing.
With GRID, Nvidia is moving GPU processing from the deskside client to the backroom server, promising a high-performance, interactive visual experience, regardless of whether the server resides in the local campus data center or is part of some cloud service halfway around the world. Each virtual machine running on the host can share access to one or more server GPUs. Users at various points in product development, manufacturing, procurement, and even marketing workflows can then view and manipulate a singular, central database, rather than viewing multiple copies of the model stored on local clients. They can do so anywhere, anytime, and on virtually any device.
The GRID vGPU promises workstation-caliber VDI computing anywhere, anytime, on any device.
In the past year, Nvidia has been ramping up a supporting ecosystem for GRID, primarily with ISV partner Citrix, whose XenServer solution now supports GRID virtual GPU technology. At GTC '14, the company built out the GRID ecosystem substantially with the addition of arguably the most desirable of VDI partners: VMware. Alongside Nvidia CEO Jen-Hsun Huang, VMware CTO Ben Fathi announced that the company's ESX-based Horizon View VDI and DaaS (cloud-oriented desktop-as-a-service) platforms would be supporting GRID vGPU technology. With key VDI ISVs on board, along with major names in server infrastructure such as HP, Dell, and Fujitsu, Nvidia is looking at 2014 as the breakout year for GRID technology.
Visual VDI with desktop performance sharing server-side GPU resources … sounds perfect, especially to an IT manager tasked with maintaining a robust, productive, and dynamic CAD environment. So what’s the catch? Well, those utopian predictions must be tempered by one practical concern: latency. More than any other performance issue — including frame rate and display resolution — excessive latency could turn an otherwise productive interactive visual experience into a time-wasting horror show. If you’ve ever been in a videoconference where the time lag has everyone talking over each other, you know what I mean.
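To see why latency, not raw frame rate, is the make-or-break metric for remote rendering, consider a rough click-to-photon budget. Every figure below is an illustrative assumption for the sake of arithmetic, not a measured GRID value:

```python
# Rough interactive-latency budget for server-side GPU rendering.
# All stage timings are illustrative assumptions, not measured GRID figures.
budget_ms = {
    "input capture + network uplink": 20,
    "server-side render": 10,
    "video encode": 10,
    "network downlink": 20,
    "client decode + display": 15,
}
total_ms = sum(budget_ms.values())
print(f"click-to-photon latency: {total_ms} ms")
```

Even with a fast GPU doing the rendering in 10 ms, the surrounding encode, decode, and network stages dominate the total; push the round trip out to a data center halfway around the world and the interaction can turn sluggish no matter how high the frame rate.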