CAD Tech News (#125)
15 Apr, 2020 By: Cadalyst Staff
The annual conference — which was transformed this year from an in-person event to GTC Digital — made it clear that traditional 3D graphics is just one piece of a big and expanding puzzle.
By Alex Herrera
It's become a rite of spring: Every March we've come to expect Nvidia to put on its best dress, not only to impress its own customers, but also to inspire stakeholders invested in similar technologies and markets to follow suit. Once again, the GPU Technology Conference (GTC) went on as scheduled, but it looked nothing like it ever has. Amid the COVID-19 pandemic, Nvidia scrambled to transform the event — which typically hosts tens of thousands at the San Jose Convention Center — into a digital event streamed worldwide.
Although the format changed dramatically, as did the duration (Nvidia opted to release presentations, training sessions, and panels over a multiweek period), the ultimate goal did not. And as always, the conference yielded a range of new disclosures, proof points, and case studies smack-dab in the wheelhouse of today's CAD professionals. Most notably, this year's event yielded more than a few nuggets of particular interest to CAD in the areas of 3D rendering, virtual reality, and graphics processing unit (GPU)-accelerated computing.
Rendering for CAD
For all intents and purposes, rendering today equates to ray tracing. By emulating the true physical properties of lights and materials, ray tracing yields photorealism that conventional raster-based 3D graphics can't achieve (at least, not without a lot of cumbersome and time-consuming optimizations to approximate global illumination).
When it comes to photorealism, raster graphics can't match a physically based rendering technology like ray tracing. Image source: Autodesk.
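The "emulating the physical properties of light" idea can be made concrete with a toy sketch — this is an illustrative minimal ray caster written for this article, not code from any renderer mentioned above. It fires a ray at a sphere, finds the hit point geometrically, and shades it with Lambert's cosine law, the simplest physically based lighting rule:

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest sphere hit, or None.

    Assumes `direction` is a unit vector, so the quadratic's leading
    coefficient is 1.
    """
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def shade(origin, direction, center, radius, light_dir):
    """Lambertian shading: brightness follows the physical cosine law."""
    t = intersect_sphere(origin, direction, center, radius)
    if t is None:
        return 0.0  # background
    hit = [o + t * d for o, d in zip(origin, direction)]
    # Surface normal of a sphere points from center to hit point
    normal = [(h - c) / radius for h, c in zip(hit, center)]
    # Intensity is proportional to cos(angle) between normal and light
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

# Cast one ray straight down the z-axis at a unit sphere
brightness = shade(origin=(0, 0, 0), direction=(0, 0, 1),
                   center=(0, 0, 3), radius=1.0, light_dir=(0, 0, -1))
print(brightness)  # the surface faces the light head-on here
```

A production ray tracer repeats this per pixel, with secondary rays for shadows, reflections, and global illumination — which is exactly why the technique is so computationally expensive, and why raster graphics has to fake those effects instead.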
Now, not all ray-traced rendering is created equal, as implementations can vary a bit algorithmically and by level of detail. But the biggest differentiator today is not how it's computed, but where: on the CPU exclusively, or on the GPU (primarily, with some CPU handholding). The two camps have historically been separated by usage, both in terms of quality and throughput: GPU-accelerated for real-time, interactive use that delivers very good quality, and CPU for single-render, multi-playback use that promises the best possible quality (i.e., "final frame" studio film caliber).
Today, the line between those two camps remains generally the same, but the gap between them is diminishing. While performance on the CPU side has improved from generation to generation, those incremental gains have paled in comparison to what GPUs have managed to achieve in both performance and quality, particularly with the 2018 introduction of Nvidia's RTX technology. First seeded in the company's Volta generation and later unveiled in full fruition with the more broadly marketed Turing GPU, RTX is specifically architected to speed up ray-traced rendering, primarily via two means: accelerated ray casting (via dedicated hardware RT Cores) and image denoising (via machine learning accelerated on hardware Tensor Cores). (Read more about Turing and RTX here.) While "real-time performance" remains subjective (precisely how many frames per second are we talking, and at what level of detail, resolution, and quality?), I think it's fair to accept Nvidia's argument that RTX GPUs are the first that can legitimately claim to offer real-time ray tracing.
The pace of ray-tracing performance on GPUs has increased significantly with Nvidia RTX technology. Image source: Autodesk, based on OptiX performance.
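The denoising half of the RTX story rests on a basic Monte Carlo fact: a ray-traced pixel is an average of random light samples, so an image rendered with few rays per pixel is grainy, and a denoiser can smooth it without casting more rays. The sketch below is a toy illustration of that principle only — the simulated noisy pixels and the box filter are stand-ins invented for this article, not Nvidia's learned Tensor Core denoiser:

```python
import random

def noisy_pixel(true_value, samples):
    """Monte Carlo estimate of a pixel: the average of noisy light samples.

    The estimate's variance shrinks as 1/samples, so a low sample
    count produces a grainy ("noisy") image.
    """
    return sum(true_value + random.uniform(-0.5, 0.5)
               for _ in range(samples)) / samples

def box_denoise(row):
    """Crude stand-in for a learned denoiser: average each pixel with
    its immediate neighbors in a 1D row of pixels."""
    out = []
    for i in range(len(row)):
        window = row[max(0, i - 1):i + 2]
        out.append(sum(window) / len(window))
    return out

random.seed(42)
true_value = 0.6
# A 1-sample-per-pixel render is fast but noisy...
noisy_row = [noisy_pixel(true_value, samples=1) for _ in range(32)]
# ...and filtering recovers a smoother estimate without casting more rays.
smooth_row = box_denoise(noisy_row)
print([round(p, 2) for p in smooth_row[:4]])
```

RTX's actual denoiser replaces the box filter with a neural network trained on rendered images, which is what lets a GPU present a clean frame from a handful of rays per pixel instead of the hundreds a converged render would need.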
Alex Herrera is a consultant focusing on high-performance graphics and workstations.
Photorealistic visualizations can boost customer confidence and help close deals, but they have traditionally been the domain of architects or home designers equipped with complex software and expensive hardware. Now, home builders without 3D design skills can create 3D renderings in a web browser instead.
In every industry, customers want to be sure of what they're buying — and doubly so when there's a lot of money changing hands. Selling a tangible product is challenging enough, but when builders meet with potential customers, they're trying to get buy-in on something that doesn't exist. No matter how attractive the new building or interior design scheme may be, it's just a concept, not a reality; the customer can't caress the countertops or walk into the walk-in closets. Therefore, images that relate the proposed project in precise detail are an essential sales tool.
While blueprints work for some aspects of project planning, they fall short in this role, because most people struggle to translate 2D drawings into 3D experiences. That disconnect creates problems on the front end, slowing the decision-making process and sometimes losing sales when customers won't commit to what they can't see. Down the line, it can also disrupt the building process: Discrepancies between the customer's and builder's understanding of the project can force midstream revisions and even delay final delivery.
In addition, the creation of those blueprints for customers adds financial and scheduling burdens to the sales process. In the United States, more than two-thirds of home-building companies have to pay to outsource blueprint production to architectural firms. The remainder can turn to in-house architectural teams equipped with CAD software, but it can still take up to a month to deliver a blueprint to a customer.
The Benefits of Communicating in 3D
All these factors make 3D renderings a more appealing — and effective — client communication tool than even the most detailed blueprints. Providing richer detail than 2D floor plans, they can help customers evaluate how much morning light will reach the master bedroom, decide whether the kitchen workflow is efficient, and see if the dining room looks cramped or spacious. Realistic elements including lighting, finishes, and furnishings encourage viewers to connect with a space emotionally and envision themselves occupying it — which is especially important for home buyers.
But for years, the creation of 3D visualizations was a slow process, restricted to those equipped with rendering expertise, specialized software, and high-powered workstations. Those limitations are especially problematic for sales teams, who need to deliver project proposals quickly to beat their competitors, and rarely have the design experience or dedicated technology needed to prepare renderings by traditional means.
Today, the cloud's processing power enables a different technology solution altogether. Those looking to create photorealistic, detailed images can now do so through a web browser interface, so a powerful desktop computer isn't necessary. And on the software side, streamlined commands and the capability to automatically create 3D representations from 2D drawings mean that users don't need extensive design experience.
Epic Games Updates Twinmotion Architectural Visualization Tool
Twinmotion 2020 features new lighting, material, and vegetation options that yield more realistic renderings for communication with project stakeholders, according to the developer. Read more »
AutoCAD 2021 Helps Users Travel into the Past — of Their Drawings
The latest release of Autodesk's venerable 2D/3D CAD software application includes a Drawing History feature, which enables users to see how their drawings have changed over time, and compare past versions with the present. Read more »
CAD Manager Column: Autodesk Delays Implementation of Named-User Licensing Approach
The move from perpetual or floating network licenses to named-user licenses has been pushed out to August 2020, but it is definitely still happening — so CAD managers need to prepare now. Read more »
Herrera on Hardware: The Traditional Computer Memory Hierarchy — and Its Impact on CAD Performance — Are Evolving
The basic tenets of the tried-and-true memory hierarchy still apply, but it's worth knowing how recent disruptors can improve performance — and perhaps shift the traditional balance. Read more »