Tech Trends-Peek into a Nuclear Weapons Lab

31 Jan, 2006 By: Kenneth Wong

Fakespace builds Stereoscopic Visualization Center for Los Alamos National Laboratory.

Every once in a while, a team of scientists from Los Alamos National Laboratory, aka the Lab, is herded into what is known as the Cave, not too far from the White Rock hiking trails. They wear funny plastic goggles and don dainty little booties over their shoes. Once inside, they wait for an explosion to take place under their feet as they sip coffee.

The explosion is a computational simulation. The room, nicknamed La Cueva Grande (the Great Cave), is a 15' x 10' x 12' (W x L x H) immersive visualization center, built and installed with the help of Fakespace Systems. Researchers involved in the ASC (Advanced Simulation and Computing) Program at the Lab use the immersive room to analyze the massive amount of data generated by the U.S. Nuclear Weapons Stockpile Stewardship Program. According to the Lab, this work is at the cutting edge of an emerging third scientific method, predictive science, which complements the classic methods of theory and experiment. Advances pioneered at the National Laboratories (Lawrence Livermore, Sandia and Los Alamos) on behalf of the U.S. Stockpile Stewardship Program will ultimately be adapted for many scientific, engineering and commercial applications.

Step Inside

Steve Stringer, project leader at the Lab, modestly describes himself as "the soup-to-nuts guy." He explains the concept of the immersive room, borrowing an analogy from his colleague Bob Kares, a senior visualization scientist at the Lab. "When you're looking at a flat screen," says Stringer, "it's like looking through a picture window. You can tilt your head, you can look around, but you're not in the scene. When you go into the immersive room, it's like stepping through the window and into the world itself."

There are 33 projectors behind the wall, ceiling and floor screens, spraying out 43 million pixels into the room. Raw physics dance above, below and all around you as polygons and meshes (figure 1). Motion sensors track your movement, and the projectors flash the left-eye and the right-eye views in quick succession to create the stereoscopic depth effect. Jeff Brum, vice-president of business development and marketing at Fakespace, says, "Because you are wearing stereoscopic glasses, you do actually get the sense that images are coming off the screens and surrounding you just like a real-world environment. In a simulation like the ones at Los Alamos, you feel as if you are standing inside a nuclear reaction, an experience not possible any other way. For an automotive designer, the same kind of immersive display can be used, so that if you're sitting in the model of a vehicle, you feel that the steering wheel is actually in front of you and the armrest is on the side."
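The head-tracked, frame-sequential stereo described above can be sketched in a few lines. This is an illustrative outline only, not Fakespace's actual rendering code; the function names, the 65 mm interpupillary distance, and the vector math are assumptions for the example.

```python
# Sketch of frame-sequential (active) stereo with head tracking.
# All names here are illustrative; IPD of 65 mm is a typical adult value.

IPD = 0.065  # interpupillary distance in meters (assumed)

def eye_positions(head_pos, right_vec):
    """Offset the tracked head position by half the IPD along the
    viewer's right-pointing axis to get the two eye positions."""
    left = tuple(h - 0.5 * IPD * r for h, r in zip(head_pos, right_vec))
    right = tuple(h + 0.5 * IPD * r for h, r in zip(head_pos, right_vec))
    return left, right

def render_stereo_frame(head_pos, right_vec, render_scene):
    """Draw the left- and right-eye views in quick succession;
    shutter glasses synced to the display show each eye only its
    own view, producing the stereoscopic depth effect."""
    left, right = eye_positions(head_pos, right_vec)
    return render_scene(left), render_scene(right)
```

The key point is that the camera follows the tracked head, so the projected images stay geometrically correct as the viewer walks around the room.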

 Figure 1. A view of the immersive environment inside the visualization center installed at Los Alamos National Laboratory.

Networking for the Cave

A single simulation session can produce as much as 650TB (terabytes) of data. To give a sense of scale, Stringer compares it to the entire collection of printed materials in the Library of Congress, which, he estimates, amounts to about 20TB. The visualization centers in the three National Laboratories are linked via secure broadband pipelines for remote collaboration. But 650TB is liable to choke any pipeline. "So we won't transfer that amount of data routinely," explains Stringer. "The company [Computational Engineering International] that supplies our visualization software [EnSight] has extended its client-server for us, and has made it commercially available for anyone to use. So if the data is hosted in California, they use a hierarchy of servers—basically servers of servers—to roll up the rendered data, not the raw computed data, and ship that across the country to another lab."
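A quick back-of-the-envelope calculation shows why the rendered-data approach works: a single frame of the room's 43 million pixels is tiny next to the raw dataset. The 24-bit RGB frame format here is an assumption for illustration; the 650TB and 43-million-pixel figures come from the article.

```python
# Why ship rendered pixels instead of raw simulation data?
# Frame format (24-bit RGB) is assumed; dataset and pixel counts are
# the figures quoted in the article.

raw_dataset_tb = 650                 # one simulation session
pixels = 43_000_000                  # pixels lit in the immersive room
bytes_per_pixel = 3                  # assuming 24-bit RGB

frame_mb = pixels * bytes_per_pixel / 1e6    # megabytes per rendered frame
raw_mb = raw_dataset_tb * 1e6                # dataset size in megabytes

print(f"one rendered frame: ~{frame_mb:.0f} MB")
print(f"raw dataset: ~{raw_mb / frame_mb:,.0f} frames' worth of data")
```

Under these assumptions a frame is on the order of 130MB, millions of times smaller than the raw dataset, which is what makes rolling up rendered views through a server hierarchy practical where shipping the raw data is not.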

Collaboration inside the immersive room is the classic kind, where a team of experts comes together to solve a problem in the same physical space. Remote collaboration is technically possible, but according to Stringer, that working model may not be relevant to how the three National Laboratories operate, because each Lab's specific tasks, while complementary, are different from the others'.

A Pixel Makes a Difference

"The entire design was driven by the size of a pixel," says Stringer. Based on their experience with the installation of previous visualization systems (many from Fakespace), Stringer and the Lab staff had determined that they wanted the screen resolution to be 21 pixels per inch. The space designated for the project—a 30' cube—could barely accommodate the equipment required to produce this resolution and the space needed for wires, electrical works, fire suppression equipment and structural support (figures 2 and 3).

Figures 2 and 3. These two diagrams of the immersive room reveal the design challenge involved in configuring all the necessary equipment to fit into the designated area.

Digitally lighting the five walls using 33 different projectors means aligning the images to the finest possible degree. If the tiled displays from multiple sources aren't consolidated properly, the stereoscopic illusion is lost. "In building design, if what you build is within an inch or two of the design, it's considered excellent," says Stringer. "But with the immersive room, we needed the actual room to be accurate within 1/30 of an inch."
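Stringer's 1/30-inch figure is no accident: at 21 pixels per inch a single pixel is 1/21 of an inch wide, so the construction tolerance had to be tighter than one pixel for the tiled images to line up.

```python
# The build tolerance in context: it is sub-pixel at the chosen resolution.

ppi = 21
pixel_in = 1 / ppi          # width of one pixel, ~0.048 inch
tolerance_in = 1 / 30       # required build accuracy, ~0.033 inch

print(tolerance_in < pixel_in)  # the room must be true to sub-pixel accuracy
```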

It's Alive!

A large portion of Fakespace's customer base is made up of automotive, aerospace and mechanical system manufacturers. Many of them, according to Brum, use his company's visualization systems to reduce the time and cost spent on physical prototyping. "We are trying to enable more CAD applications to work within the virtual environment," says Brum. "We developed a middleware graphics distribution system called Conduit. This software makes it possible for CAD systems like CATIA to be used natively within an immersive environment. What we do is intercept the geometry of the model before it gets rendered, then render it across computing clusters for the purpose of enabling immersive reality. There's no data porting—you are working with live data in real time." Changes can be made to the model using a mouse, a special wand, a Tablet PC and a digital glove such as the Pinch, available from Fakespace.
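The intercept-and-distribute pattern Brum describes can be sketched in miniature. This is purely illustrative: every class and method name below is hypothetical and bears no relation to the actual Conduit API; it only shows the general idea of capturing geometry from the application's normal draw path and fanning it out to a render cluster.

```python
# Illustrative sketch of an intercept-and-distribute pattern, in the spirit
# of middleware like Conduit. All names are hypothetical, not a real API.

class ClusterNode:
    """Stands in for one render node driving a subset of projectors."""
    def __init__(self, name):
        self.name = name
        self.geometry = []

    def submit(self, mesh):
        # In a real system this would render the mesh for this
        # node's viewport; here we just record it.
        self.geometry.append(mesh)

class GeometryInterceptor:
    """Sits between the application and the graphics driver, capturing
    draw calls and forwarding the live geometry to the cluster."""
    def __init__(self, nodes):
        self.nodes = nodes

    def draw(self, mesh):
        # The application calls its normal draw path; the same geometry
        # goes to every node, so there is no export or data-porting step.
        for node in self.nodes:
            node.submit(mesh)

nodes = [ClusterNode(f"wall-{i}") for i in range(3)]
interceptor = GeometryInterceptor(nodes)
interceptor.draw({"part": "steering-wheel", "vertices": 1024})
```

The design point is that the CAD model never leaves its native application; only the geometry stream is duplicated, which is why the immersive view stays live as the model changes.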

Dissecting the Immersive Room

Project leader Stringer recently had an opportunity to use his considerable skills for putting together complex structures in a more dramatic setting: shifting into reverse gear, he helped dismantle the Nutcracker Ballet set used in his daughter's school performance.

Kenneth Wong is a former editor of Cadence magazine. As a freelance writer, he explores innovative use of technology and its implications. E-mail him at

