GIS Tech News #1220, Dec. 2005. By Kenneth Wong
Flight Plan for Mach 1
It's good that Chris Henderson is not susceptible to motion sickness. He is heavily involved in developing flight-simulation programs that realistically reproduce the speed of a wide assortment of aircraft. Chances are he may never have to climb into the cockpit of an actual plane, but if he is ever called on to do so, he is fully prepared: He has been tossed 238 feet into the sky at 70 mph on Mr. Freeze, the famous roller coaster at Six Flags Over Texas.
Henderson is director of engineering at MultiGen-Paradigm, a developer of 3D visualization technology. The company helps create geospatial databases, database-generation systems and image-generation systems for training pilots. These systems are often used in flight simulators, combined hardware-and-software platforms that teach pilots how to make tactical maneuvers and deploy weapons.
Rebuilding the World
To develop these tools, the typical first order of business is to digitally recreate the designated training locations based on geospatial data and satellite imagery. Simply put, it is to drape high-resolution 2D aerial photography and satellite imagery over a 3D wireframe model of the terrain so a user can interactively inspect the location from different angles. In recent years, powerful CPUs and graphics cards have enabled video-game developers to do just that, especially in games such as Microsoft Flight Simulator. But what differentiates the virtual environment in video games from the one MultiGen-Paradigm is creating, according to Henderson, is his company’s proprietary technology. “Instead of having a team of, say, 50 artists model an airspace in minute detail, we use our own computer program and software tools to take the terrain elevation data and algorithmically generate the model,” he says.
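The "draping" step Henderson describes can be sketched in a few lines: each cell of the terrain elevation grid becomes a 3D vertex, and normalized texture coordinates map the aerial or satellite image across the whole surface. This is a generic illustration of the technique, not MultiGen-Paradigm's actual software; all names and values are invented.

```python
# Minimal sketch of draping 2D imagery over a 3D terrain wireframe,
# assuming a small elevation grid (meters) and an image whose corners
# align with the grid's corners.

def drape_terrain(heights, cell_size):
    """Turn a 2D grid of elevations into textured-mesh vertices.

    Each vertex gets a 3D position (x, y, z) plus (u, v) texture
    coordinates that stretch the image across the whole grid.
    """
    rows, cols = len(heights), len(heights[0])
    vertices = []
    for r in range(rows):
        for c in range(cols):
            x, y, z = c * cell_size, r * cell_size, heights[r][c]
            u = c / (cols - 1)  # normalized image coordinate, 0..1
            v = r / (rows - 1)
            vertices.append((x, y, z, u, v))
    return vertices

# A 3x3 patch of elevations with 10 m ground spacing
mesh = drape_terrain([[5, 6, 7], [5, 8, 9], [4, 6, 8]], cell_size=10.0)
print(len(mesh))   # 9 vertices
print(mesh[4])     # center vertex: (10.0, 10.0, 8, 0.5, 0.5)
```

A renderer would connect these vertices into triangles and sample the imagery at each (u, v), which is exactly the interactive inspect-from-any-angle view the article describes.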
Stitching Up the Landscape
MultiGen-Paradigm's geospatial data comes from a number of sources, such as IKONOS, QuickBird, Landsat 7 and SPOT. Some imagery is acquired from local aerial photography vendors. Most is acquired during the favorable summer season, but not necessarily at the same time; in some cases the images are separated by a year or two, leading to "registration problems." Henderson explains, "For instance, construction might have taken place and it might show up in one image, but not in an earlier one. So we look at them and consult the customers to see if they are acceptable; if not, we order new imagery." Content providers, such as DigitalGlobe, work with Henderson's team to balance color, match tone, mosaic images and align satellite photos with known GIS data.
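One small piece of the tone-matching work mentioned above can be sketched as a brightness balance: scale a newer image so that its mean brightness in the overlap region matches the reference image's, making the seam less visible. Real mosaicking pipelines are far more sophisticated (per-band histogram matching, feathered seams); this sketch, with invented function names and pixel values, only shows the core idea.

```python
# Rough sketch of one mosaicking step: balancing the brightness of a
# newly acquired image against a reference image where the two overlap.

def balance_brightness(reference_overlap, new_overlap, new_image):
    """Scale new_image so its overlap mean matches the reference's."""
    ref_mean = sum(reference_overlap) / len(reference_overlap)
    new_mean = sum(new_overlap) / len(new_overlap)
    gain = ref_mean / new_mean
    # Clamp to the valid 8-bit range after scaling.
    return [min(255, round(p * gain)) for p in new_image]

ref_patch = [100, 110, 120]  # pixels where the two images overlap
new_patch = [50, 55, 60]     # same ground area in the newer, darker image
adjusted = balance_brightness(ref_patch, new_patch, [50, 60, 200])
print(adjusted)  # gain is 2.0: [100, 120, 255] (last pixel clamped)
```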
Thanks to the company’s specialized rendering system, Henderson and his team can view the terrain as crisp images on their monitors. But how will the same content look when displayed on a client’s hardware? “Our own image generators are running on a different hardware platform from what’s used by some of our clients,” says Henderson. “It takes a lot of system and level-design work between teams to make sure that when the [geospatial data] is published, correlated subsystems display the same synthetic world with the same level of accuracy.”
The technology working behind the scenes is MultiGen-Paradigm's patented Virtual Texture technology. "The difficulty with large-area geospecific visualization is that the gigabytes of image data used by the visualization far exceed available computer memory and video resources," says Sandeep Divekar, MultiGen-Paradigm president and CEO. Virtual Texture, according to the company, enables users to integrate geospecific imagery with other resources, such as normal displacement maps for improved terrain shading, multiple representations for different sensor spectral bands, thermal data for infrared simulation and detailed texture for enhancing image quality in areas where high-resolution satellite imagery is unavailable.
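The core idea Divekar describes — imagery too large for memory — is usually handled by splitting the terrain texture into tiles and keeping only the tiles the camera currently needs resident, evicting the least recently used ones. The sketch below illustrates that generic approach with an LRU tile cache; it is not MultiGen-Paradigm's patented implementation, and all names are invented.

```python
# Generic out-of-core texture paging: only `capacity` tiles live in
# memory at once; the least recently used tile is evicted to make room.
from collections import OrderedDict

class TileCache:
    def __init__(self, capacity, load_tile):
        self.capacity = capacity       # max tiles resident in memory
        self.load_tile = load_tile     # callback that fetches a tile from disk
        self.resident = OrderedDict()  # tile id -> tile data, in LRU order
        self.loads = 0                 # count of (slow) disk loads

    def get(self, tile_id):
        if tile_id in self.resident:
            self.resident.move_to_end(tile_id)  # mark as recently used
            return self.resident[tile_id]
        if len(self.resident) >= self.capacity:
            self.resident.popitem(last=False)   # evict least recently used
        self.loads += 1
        data = self.load_tile(tile_id)
        self.resident[tile_id] = data
        return data

cache = TileCache(capacity=2, load_tile=lambda tid: f"pixels-for-{tid}")
cache.get("A"); cache.get("B"); cache.get("A"); cache.get("C")  # C evicts B
print(cache.loads)           # 3 disk loads (A, B, C); second "A" was a hit
print(list(cache.resident))  # ['A', 'C']
```

In a real system the "tile id" would encode both position and mipmap level, so distant terrain pages in at coarse resolution and nearby terrain at full detail.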
Speeding Through the Details
The source data MultiGen-Paradigm uses has a resolution of 1 meter per pixel. At this resolution, rocks, shrubs and dirt paths are clearly visible to a pilot cruising above the airfield. But once the pilot accelerates to high speed, those details must stream past realistically. To achieve this effect, Henderson and his team use a method commonly deployed in video games: LOD (level of detail) management. “Only large, significant scene elements are drawn for long distances,” says Henderson. “Smaller objects, like rocks and shrubs, are removed from the scene at a fairly short distance from the eyepoint. So distance, space and details get resolved in the view as they would in the real world.”
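The distance-based culling Henderson describes can be sketched as follows: each scene object carries its own cull distance, so small details drop out of the view close to the eyepoint while large landmarks stay visible far away. The objects and thresholds below are invented for illustration.

```python
# Sketch of distance-based LOD culling: an object is drawn only if it is
# closer to the eyepoint than its own cull distance.

# (name, distance from eyepoint in meters, cull distance in meters)
SCENE = [
    ("mountain", 9000.0, 50000.0),  # large landmark: visible far away
    ("hangar",   1200.0,  5000.0),
    ("shrub",     800.0,   300.0),  # small detail: culled beyond 300 m
    ("rock",      150.0,   300.0),
]

def visible_objects(scene):
    """Keep only objects closer than their own cull distance."""
    return [name for name, dist, cull in scene if dist <= cull]

print(visible_objects(SCENE))  # ['mountain', 'hangar', 'rock']
```

Production engines refine this with multiple detail levels per object and smooth transitions between them, but the budget-saving principle is the same: spend triangles only where the eye can resolve them.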
Deforming the Future
The increased horsepower and affordability of hardware are certainly factors driving the visualization market. So is the ready availability of high-quality geospatial data, according to Henderson. “The ability to acquire new imagery is pretty amazing,” he says. “We can order new areas of coverage from our provider, they’ll issue an order to the satellite and we get them within months.” According to Henderson, future flight simulators will do more than simply display navigable virtual worlds. They will be capable of performing more CPU-intensive operations, such as letting pilots interact with reactive terrain. A user can, for instance, detonate explosives and see the impact immediately. Improved computational models for simulating atmospheric conditions, such as lighting, clouds and fog, will give these virtual worlds unprecedented fidelity.