The Metaverse: Separating the Wheat from the Chaff, from a CAD Perspective
16 Dec, 2021 By: Alex Herrera
Herrera on Hardware: Many of the early conceptual pitches may feel like marketing fluff, but the metaverse will hold some real value, particularly in CAD-centric computing.
It's feeling like we're in another hype and bust cycle. Marketing megaphones are on, and the message is blaring: The metaverse is coming, and it will be transformative. Get on board or get left behind. Facebook turned the volume up considerably with its name change to "Meta," accompanied by an advertising push aimed not just at tech-oriented audiences but blanketing mainstream channels as well. The company's "Tiger and Buffalo" ad seems to be everywhere, promising how "This is going to be fun" (though it doesn't quite make clear how).
Justifying its rebranding, Meta (formerly Facebook) clearly judges the metaverse as the next big thing. Image source: Meta.
It sure seems like we're approaching that tipping point we've witnessed more than a few times in the past, with an industry fostering an overly enthusiastic vision in efforts to build awareness and ultimately grow its markets. Sometimes there's enough substance to eventually deliver on the hype, but often the exuberance spirals so far out of whack that some degree of disappointment is inevitable.
The hype has not only arrived but is arguably being set up to fail, at least if failing is defined as falling short of clearing the lofty bar that's being set. Count me as one who's more than skeptical that the metaverse will deliver on expectations — at least in the broad sense, including both those explicitly defined and those being conjured in the minds of consumers and businesses alike. No doubt, there is proven value in a shared immersive 3D world in gaming, and Meta clearly envisions a social media world evolving directly into its metaverse. Still, the possibility remains that it's ultimately perceived down the road as a flash in the pan, an afterthought, or even a punchline, simply because proponents over-promised and under-delivered.
And if so, that’ll be a shame, because — also like other hype cycles before it — the failure to clear an unrealistic bar will have overshadowed some real value in the underlying concepts and technologies. At least in the context of CAD-centric industries like AEC, design, and manufacturing, compelling uses extend beyond gimmick to offer real utility, value that dovetails with other emerging trends to offer some interesting technology synergies for businesses relying on 3D visual computing.
Plumbing the Metaverse for AEC, Design, and Manufacturing Worth
This column has touched on the idea of the metaverse before, in the context of NVIDIA’s Omniverse, now off the drawing board and out in use, though with an ecosystem still being fleshed out. Check out the previous dive into Omniverse here, and click over to NVIDIA’s site for more on the latest updates on partners and tools supporting Omniverse. The focus here is not Omniverse specifically — though it currently represents the most mature example — but rather on what the model of a metaverse tailored specifically to 3D visual computing can offer the CAD community.
Thanks to the growing mainstream marketing onslaught, the metaverse might conjure up a vision of avatars taking our place, roaming through some unknown and unnatural world. What do you do in that world that you wouldn't do in the real world? In gaming, the answer is both obvious and proven: the metaverse creates a fantasy world that can be shared virtually. Social media seems like a fit, though it's debatable how much of that community will end up adopting it as a must-have norm instead of a curious gimmick.
In industries like AEC, design, and manufacturing, though, the metaverse presents an opportunity to sensibly improve the operations and workflows supporting not only the creation of projects — which has always represented the crux of CAD computing — but also the support and maintenance of those projects long after creation. Yes, metaverse computing environments will assuredly offer virtual interactivity between users via avatars, with or without accessories like VR headsets and haptic feedback gloves. But for many, those functions will be judged the chaff, not the wheat.
Instead, for those of us looking to home in on concrete benefits, I'd focus on the metaverse as a shared 3D rendered workspace — accurately representing the physical world to come — where many contributors can create, view, and analyze the same model, regardless of where they are. For large-scale projects incorporating contributors from many disciplines across many locations, that's a vision that directly addresses one of today's most daunting IT challenges. And, before you dismiss the value as limited to big multi-national architectural or aerospace firms, remember that when the pandemic pushed everyone out of their offices and into their homes, virtually all businesses became "physically scattered."
How an AEC desktop view might look with Omniverse. Image source: NVIDIA.
The Metaverse as a Digital Twin, Precisely Tracking the Physical World Over a Complete Life Cycle
Think about the advantage of developing a 3D digital model, and thoughts likely focus only on the creation aspect: visualizing, testing, and proving a design in the virtual realm before committing to the physical representation, with the goal of avoiding costly rework in the real world. But an emerging use of that physically accurate digital model — one that can go much further to avoid time and dollar expenses down the road — is in leveraging that model as a digital twin, one that rides shotgun with its physical sibling over its entire useful life.
Often, after turning that carefully designed, simulated, and rendered final model into blueprints or manufacturing files to construct or fabricate, the model is archived for a future design revision or maybe even tossed in the digital trash heap. Instead, imagine what’s possible if we held onto that physically accurate model and sustained it over its physical incarnation’s lifetime, precisely tracking, simulating, and predicting how it ages. If we could rely on an accurate digital twin, think about all the uses you could find for viewing, inspection, and analysis in the virtual world, uses that would otherwise be prohibitively expensive, time-consuming, or simply impossible in the real world.
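To make that idea concrete, here's a minimal sketch — in generic Python, with hypothetical names rather than any real twin platform's API — of what "riding shotgun" means in practice: a virtual record that shadows a physical asset, ingesting inspection or sensor data and reporting how far the as-built part has drifted from its as-designed spec.

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """Minimal digital twin: a virtual record that shadows a physical asset."""
    asset_id: str
    design_params: dict                            # as-designed spec (e.g., wall thickness in mm)
    observed: dict = field(default_factory=dict)   # latest measured state over the asset's life

    def ingest(self, sensor_reading: dict) -> None:
        """Update the twin from inspection data or automated sensors."""
        self.observed.update(sensor_reading)

    def drift(self, key: str) -> float:
        """How far a measured value has aged away from its as-designed spec."""
        return self.design_params[key] - self.observed.get(key, self.design_params[key])

# A pipe wall fabricated at 12.0 mm, re-measured mid-way through its service life:
twin = DigitalTwin("pipe-A7", {"wall_mm": 12.0})
twin.ingest({"wall_mm": 11.4})
print(round(twin.drift("wall_mm"), 2))  # → 0.6 mm of wall lost since fabrication
```

The asset ID, parameter names, and measurements are invented for illustration; the point is simply that the model persists and accumulates the physical sibling's history instead of being archived at handoff.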
That’s the motivation behind digital twins, an approach well suited to the metaverse model. Proofs of concept already exist, with more case studies emerging from companies like Ericsson and Siemens. Leaning on Omniverse, Ericsson created a digital twin of an entire city to determine the optimal deployment for its 5G rollout, using Omniverse to judiciously select and configure sites to maximize coverage and signal quality with minimal cost and complexity. To do so, Ericsson’s digital twin city tracks the physical world, with buildings and foliage 100% accurate, down to their surface materials and texture. In that world, Ericsson can accurately assess signal quality at any point in the city for a given deployment. And re-working sites — either upfront or at a later date for an upgrade — can easily reveal how the coverage would improve (or not) in real time.
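Production modeling of this kind rests on detailed propagation simulation against those accurate surface materials, far beyond anything shown here. Still, a toy what-if check using the textbook free-space path loss formula hints at how a twin lets you evaluate candidate sites numerically rather than in the field. The transmit power and frequency figures below are illustrative assumptions, not Ericsson's.

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

def received_dbm(tx_power_dbm: float, distance_km: float, freq_mhz: float) -> float:
    """Idealized received power, ignoring antenna gains, clutter, and materials."""
    return tx_power_dbm - fspl_db(distance_km, freq_mhz)

# What-if: move a candidate site from 1.0 km to 0.5 km from a coverage hole,
# for an assumed 3500 MHz (mid-band 5G) carrier at 43 dBm transmit power.
far = received_dbm(43, 1.0, 3500)
near = received_dbm(43, 0.5, 3500)
print(f"{far:.1f} dBm -> {near:.1f} dBm")  # halving the distance gains ~6 dB
```

The twin's value is that a planner can run thousands of these evaluations — against real ray-traced propagation, not this idealized formula — before a single crew is dispatched.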
In a physically accurate metaverse, Ericsson determined the optimal deployment for their 5G rollout. Image source: Ericsson/NVIDIA.
Siemens leveraged the metaverse to model and simulate a digital twin of its steam turbine. Turbine failures are obviously a costly event to avoid, but so are unwarranted shutdowns to inspect and maintain the system. Ideally, the turbine is shut down with minimal frequency, only when necessary to replace parts expected to fail imminently. By simulating in parallel the complex process of corrosion over time, Siemens can both minimize the shutdowns required to inspect internal integrity and help predict failures. All told, Siemens says the digital twin has reduced downtime by 70%, saving the industry $1.7B.
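As a rough illustration of the logic — not Siemens' actual model, which couples physics simulation with AI — the condition-based scheduling idea reduces to a few lines: project a corrosion rate forward and stop the turbine only when the predicted wall thickness nears its safety threshold. All numbers below are made up for the example.

```python
def next_inspection_day(thickness_mm: float, rate_mm_per_day: float,
                        min_safe_mm: float, margin_mm: float = 0.5) -> int:
    """Days until the corrosion projection says the part nears its limit.

    Rather than inspecting on a fixed calendar, shut down only when the
    predicted wall thickness approaches the safety threshold plus margin.
    """
    usable = thickness_mm - (min_safe_mm + margin_mm)
    if usable <= 0:
        return 0  # already at or past the margin: inspect now
    return round(usable / rate_mm_per_day)

# Blade wall at 8.0 mm, corroding ~0.004 mm/day, unsafe below 5.0 mm:
print(next_inspection_day(8.0, 0.004, 5.0))  # → 625 days before a stop is needed
```

In the real system, the corrosion rate itself is not a constant but the output of the twin's physics simulation, continually corrected by inspection and sensor data — which is exactly why the digital model has to persist alongside the physical turbine.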
Siemens is using Omniverse to minimize costly steam turbine shutdowns. Image source: Siemens/NVIDIA.
Natural Synergies: Datacenter Workstations, Accelerated Simulations, and Machine Learning
Better yet, for industries in AEC, design, and manufacturing, the metaverse isn't some stand-alone solution requiring a unique IT build-out that serves no other ends. On the contrary, the metaverse both depends on and builds upon complementary approaches, yielding benefits beyond the sum of its parts.
First and foremost among those synergies is the metaverse's reliance on a centralized computing topology, where data, computation, and visualization are all co-located in a datacenter. For the metaverse — represented by a single, potentially gargantuan dataset shared among many geographically scattered users — to function effectively, those three components have to be co-located. That datacenter can be owned or rented in the cloud, and accomplishing that co-location implies the use of datacenter clients, hosted either virtually or physically. We've covered the concept of remote, datacenter-hosted workstations several times in the past (including this deep-dive multi-part series) and the benefits they offer independent of the metaverse: 24/7 access from anywhere, big-data management and security, collaboration, disaster mitigation, and (for some) even financial advantages. The metaverse fits neatly on top of this existing IT architecture, and should leverage the infrastructure already in place (though it may also motivate further investment in it).
A recent adopter of Omniverse, OutdoorLiving3D has a business model that comports perfectly with a centralized model hosting a shared 3D rendered world. The company produces max-fidelity exterior imagery and animations of projects envisioned by architectural clients. The "exterior" description is key, as accurate views depend not just on the building exterior but on a potentially complex virtual urban or natural landscape beyond — something that could only be delivered with an accurate representation of the surrounding world (or at least the proximate portion of it). Furthermore, like many ventures, OutdoorLiving3D found itself in the midst of such projects when the onset of the pandemic sent staff back to home offices. The metaverse's combination of anytime-anywhere design, coupled with a shared model presented in full 3D rendered quality, provided an ideal computing environment.
Omniverse’s centralized, rendered, shared 3D workspace worked well for OutdoorLiving3D, especially when the pandemic hit. Image source: OutdoorLiving3D.
Natural Synergy with Existing Datacenter and GPU-accelerated CAD Workloads
With the datacenter as its computing foundation, the metaverse dovetails neatly with the growing cloud-fueled trend to push more of our computational heavy lifting back to the datacenter. And some of the most compelling examples tie directly to common workloads in CAD. Consider rendering, engineering simulations, and the exploding use of machine learning for things like generative design.
Rendering has always been a popular task to off-load to servers, and now that rendering server — or even multi-server farm — is local to the metaverse data and users' desktops. The same applies to engineering simulations that demand more CPU and GPU cycles than your deskbound workstation client can deliver. Applications in design and engineering abound — FEA and CFD, for example — particularly when extended, fine-grained analysis of complex objects and environmental parameters is essential.
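The pattern behind farm-style off-loading is straightforward divide-and-dispatch. A minimal sketch — generic Python, not any farm scheduler's API, with a stub standing in for the renderer — carves a frame into independent tiles and fans them out to parallel workers, just as a scheduler would across datacenter nodes sitting next to the shared scene data.

```python
from concurrent.futures import ThreadPoolExecutor

def render_tile(tile: tuple) -> tuple:
    """Stand-in for a real renderer: pretend to shade one tile of the frame.

    On a farm, this call would run on a datacenter node co-located with the
    scene data, rather than on the artist's desktop workstation.
    """
    x, y = tile
    return (x, y, f"tile_{x}_{y}.exr")  # hypothetical per-tile output file

# Carve a 4x4-tile frame into independent jobs and dispatch them in parallel —
# the same divide-and-dispatch pattern a farm scheduler applies across nodes.
tiles = [(x, y) for x in range(4) for y in range(4)]
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(render_tile, tiles))

print(len(results))  # 16 tiles completed
```

Because the tiles are independent, throughput scales with the number of nodes the farm can throw at the frame — which is precisely what makes rendering such a natural fit for the centralized topology described above.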
GPU-accelerated engineering simulations have already been the domain of the datacenter, tying in nicely with the metaverse’s centralized computing model. Image source: NVIDIA.
And it's the datacenter that represents the highest-performance domain for machine learning as well, especially for the training of deep neural networks. That Siemens example referenced earlier, running physics simulations and what-if scenarios to predict possible failures and optimize service items and schedules? It uses a trained AI model that incorporates the physical twin's actual, evolving condition, updated via human inspection or automated sensors (where possible).
Look Past the Hype for the Real Value
Gaming and social media aside, we can't know if the metaverse will ever cross the chasm from a glorified infancy to mainstream ubiquity. No matter, because when it comes to professional 3D visual computing, we don't really care about mainstream success; we care simply about whether it will offer some real computing utility or not — utility that gets us to our unchanging goal of better designs delivered in shorter times. Examples of that utility abound, particularly, I'd argue, in common workloads in design, manufacturing, and AEC.
Moreover, taking the plunge and adopting a metaverse computing environment doesn't have to mean a departure from your IT strategy. Rather, its natural synergy with the growing trend to lean harder on the datacenter — reaping the benefits of remote virtual and physical workstations, compute acceleration, and machine learning — makes it a compelling step for those already on the centralized computing bandwagon. And for those who aren't, it represents an even more persuasive argument to think about jumping on board.