MAK's visualization suite delivers beautiful visuals from every vantage point.
MAK lets you visualize your simulation environment from multiple perspectives: realistic out-the-window and sensor scenes for man-in-the-loop training devices, 3D informational views and tactical map displays for planning and situational awareness, and exaggerated-reality views for visual analysis.
High-quality, high-performance character visualization lets you incorporate people and animals into your visual simulation applications.
Visualize Product Overview
The VR-Vantage product line is built on an open architecture
VR-Vantage's open-architecture design offers the most flexible high-performance visualization tools on the market. Its APIs make it possible to accommodate any training, simulation, or mission-rehearsal requirement, and this approach also keeps VR-Vantage compatible with your existing applications.
VR-Vantage IG – Realistic Image Generation
VR-Vantage Stealth – Visualize Information
SensorFX – Physically Accurate Sensor Image Generation
RadarFX SAR Server – Synthetic Aperture Radar Simulation
Immersive Visual and Sensor Image Generation
- CIGI, DIS, and HLA interfaces
- 60 Hz for multi-channel display systems, 90 Hz for AR/VR
- Built on Open Standards, Customized with API and Plugins
Performance-Driven, Multi-Channel IG + Sensor Views
With VR-Vantage IG’s special effects and physics-based models, participants are immersed in their synthetic world and reap the full benefits of each session. Realistic out-the-window (OTW) scenes and physics- and effects-based sensor views can be deployed on a variety of COTS hardware configurations, from simple desktops to multi-channel displays for virtual cockpits, monitor-based training systems, and AR/VR components.
VR-Vantage IG’s built-in distributed rendering architecture supports large, multichannel, “situation room” style displays. An intuitive GUI allows you to connect to remote display engines running on additional PCs to increase your field of view.
- CIGI, DIS, HLA
- Distortion correction
- Realistic, animated lifeforms with cultural diversity
- Extensive library of high-quality air, land and maritime moving entity models
- Dynamic structures; damage on buildings and bridges
- Optical Sensor Models (EO/IR/NVG) included
- Optical Sensor Physics (EO/IR/NVG) with SensorFX
- Whole-Earth Procedural Terrain
- CDB, OpenFlight, and many other formats
- Dynamic terrain: damage on buildings, bridges and craters
- Terrain blending options
- Heads-up display (HUD) support
- Sensor overlays
System / Performance
- Open Architecture
- Commercial-off-the-shelf (COTS) Software
- Scalable/manageable performance
- Supports AR / VR / MR
Simulated Environmental Effects
- Dynamic weather: thunderstorms, rain, snow, clouds
- Dynamic ocean: waves, surf, wakes, shallow water, and buoyancy (responds to dynamic weather)
- Particle-based recirculation and downwash effects
- Rotor wash
- Particle-based weather systems
- Procedural ground textures
- Smoke and flames
- Trailing effects, such as footprints, wakes, missile trails, and dust clouds
- Muzzle flashes
- Realistic, dense vegetation
- High Dynamic Range (HDR)
- Light Points & Light Lobes
- Dynamic lighting and shadows
- Ephemeris model
- Ocean planar reflection and shadows
- Crepuscular (sun) rays & lens flare
- Atmospheric scattering
- Fresnel lighting for scene reflections
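The Fresnel reflection effect listed above is commonly implemented in real-time renderers with Schlick's approximation, which blends from the surface's normal-incidence reflectance toward full mirror reflection at grazing angles. A minimal sketch (this is the standard graphics formula, not MAK's implementation):

```python
def schlick_fresnel(cos_theta: float, r0: float) -> float:
    """Schlick's approximation of Fresnel reflectance.

    cos_theta: cosine of the angle between the view direction and the
               surface normal (1.0 = looking straight down at the surface).
    r0: reflectance at normal incidence (e.g. ~0.02 for water).
    """
    return r0 + (1.0 - r0) * (1.0 - cos_theta) ** 5
```

At normal incidence the result equals `r0`; at grazing angles it approaches 1.0, which is why water and other smooth surfaces become mirror-like near the horizon.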
3D and 2D Informative Battlefield Visualization
VR-Vantage Stealth is MAK’s information station where you can view the virtual world in 2D and 3D. Whether you need it for situational awareness, simulation debugging, or after action review, VR-Vantage Stealth provides the most data about your networked virtual world and presents it in a clear and accessible way. With VR-Vantage Stealth, you can quickly achieve a “big picture” understanding of a battlefield situation while retaining an immersive sense of perspective.
Focused on Information
VR-Vantage Stealth visually presents a wide variety of information about your virtual environment. You can simultaneously view the virtual world in 2D and 3D, with configurable information overlays presented in both displays. VR-Vantage Stealth can also draw 3D representations of tactical graphics from VR-Forces, such as waypoints, routes, and areas. Picture-in-picture inset views allow you to see what any vehicle is seeing, even as you watch it travel across the terrain. Drop simulated cameras anywhere in the world to zoom in on multiple parts of the battle at once. Switch any visual channel from 3D to 2D mode with a single click. Or use a 2D inset to help with navigation as you fly your 3D eyepoint around the battlefield in the main display. VR-Vantage Stealth lets you watch entity-level engagements without losing the command-level view.
Navigation is easy in VR-Vantage Stealth, whether you’re flying freely over the terrain, following an entity as it moves through the synthetic environment, or tracking a missile from a fixed vantage point. We’ve combined the best elements of first-person-shooters, real-time-strategy games, and “spin the earth” virtual globe applications, to provide an interface that you’ll find familiar the moment you start using it. Navigate through the world by dragging the terrain, clicking on a destination, or maneuvering the eyepoint using familiar first-person-shooter controls. You can orbit around buildings or props, follow or track moving entities, jump in the cockpit for an out-the-window view, or mount your virtual camera on a vehicle. While navigating, you can save a list of your favorite views for rapid retrieval.
Easily Load Your Terrain
VR-Vantage is Terrain Agile: able to work with a wide variety of terrain approaches, formats, and protocols. The tool can load traditional databases, like hand-modeled OpenFlight; page large-area terrains, like MetaFlight; and build visual databases on the fly from source data like DTED, GeoTIFF, and Shapefiles. It can even dynamically create 3D terrain by streaming elevation, imagery, and features from terrain servers, like MAK's VR-TheWorld Server, to build up large areas, and by cutting in site models for high-fidelity ground detail. With VR-Vantage Stealth, you can load site models with dense urban buildup and thick vegetation as well as tiled terrain databases that page in over large areas. You can extract buildings and other models from the terrain and manipulate them directly in the synthetic environment. To see human characters interacting inside, you can make specific buildings semi-transparent.
Built-in Content
VR-Vantage Stealth comes with a rich set of top-quality 3D entity models that support attached parts, damage representations, and articulated parts such as turrets and guns. Built-in support for DIGuy™, DiSTI’s GL Studio®, IDV’s SpeedTree®, and Sundog’s SilverLining™ sky and Triton™ 3D dynamic ocean means that you don’t need to integrate and configure extra modules, or buy additional run-time licenses, to have great-looking human characters, interactive cockpit displays, dynamic trees and bushes, weather effects, and volumetric clouds. And HLA and DIS support through MAK’s own VR-Link networking toolkit is included, so interoperability is a given, not an add-on.
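DIS, one of the two interoperability protocols mentioned above, is an open IEEE standard (1278.1) in which every PDU begins with a common big-endian header. As an illustration of how any tool on the network can identify incoming traffic (a generic sketch of the standard wire format, not VR-Link's API), the header can be decoded like this:

```python
import struct

# Every DIS PDU starts with a 12-byte, big-endian header (IEEE 1278.1):
# protocol version, exercise ID, PDU type, protocol family,
# timestamp, PDU length, padding.
DIS_HEADER = struct.Struct(">BBBBIHH")

def parse_dis_header(datagram: bytes) -> dict:
    """Decode the common DIS PDU header from a raw UDP datagram."""
    version, exercise, pdu_type, family, timestamp, length, _pad = \
        DIS_HEADER.unpack_from(datagram)
    return {"version": version, "exercise": exercise,
            "pdu_type": pdu_type, "family": family,
            "timestamp": timestamp, "length": length}
```

A PDU type of 1, for example, identifies an Entity State PDU, which carries the position and orientation updates a stealth viewer draws.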
- SIMULTANEOUS 2D AND 3D VISUALIZATION DISPLAYS
- VIEW ENTITIES, AGGREGATES, TRAJECTORIES, SENSORS, AND MORE
- GAME-LIKE NAVIGATION
- EASY MODEL INTEGRATION
- LOADED WITH CONTENT
- TERRAIN AGILE
- BUILT-IN SUPPORT FOR DIS AND HLA
- GEOCENTRIC OR PROJECTED TERRAINS
- STREAMING TERRAIN DATABASES
- EXAGGERATED REALITY VIEW FOR ENHANCED SITUATIONAL AWARENESS
- CUSTOMIZABLE THROUGH VR-VANTAGE TOOLKIT
- BUILT IN SUPPORT FOR STREAMING VIDEO
- COMMAND AND CONTROL (C2) INFORMATIVE VISUALIZATION
- INSTRUCTOR OPERATOR STATIONS
- AFTER ACTION REVIEW
- SITUATIONAL AWARENESS
- PROJECT DEMONSTRATION
- SIMULATION DEBUGGING
- SIMULATION ANALYSIS
- COMMON OPERATING PICTURE
Physically Accurate Sensors
SensorFX is a physics-based plug-in that accurately models light energy as it is reflected and emitted from surfaces in the scene while transmitted through the atmosphere and into a sensing device. It also models the collection and processing properties of the sensing device to render an accurate electro-optical (EO), night vision, or infrared (IR) scene.
Extensive Sensor Coverage
Validated for the US military and delivered around the world for high-fidelity sensor training and research applications, SensorFX enables you to credibly simulate any sensor in the 0.2–25.0 µm band with VR-Vantage, including:
- FLIRs / Thermal Imagers
- Image Intensifiers / NVGs
- EO Cameras: Color CCD, LLTV, BW, SWIR, LWIR, MWIR, White hot and Black hot
Credible Sensor Phenomenology
- True physical property assignments throughout the 3D environment: vehicles, terrain, cultural features and atmosphere
- On-the-fly, real-time prediction engine for spectral signatures and atmospherics, not pre-computed tables
- Fast, full-transient thermal models running in real-time for accurate, time and angle-dependent temperatures, reactive to changes in weather and dynamic states
- Innovative, real-time, fully validated environment model
- High dynamic range 2D radiance renderings
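SensorFX's prediction engine is proprietary, but radiometric signature prediction of this kind ultimately rests on Planck's law: the spectral radiance a blackbody emits at a given temperature, integrated over the sensor's band. A minimal, first-principles sketch (standard physics only, not SensorFX's actual model, which also handles emissivity, reflection, and atmospheric transmission):

```python
import math

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m: float, temp_k: float) -> float:
    """Blackbody spectral radiance, W / (m^2 * sr * m)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * KB * temp_k)
    return a / (math.exp(b) - 1.0)

def band_radiance(lo_um: float, hi_um: float, temp_k: float,
                  steps: int = 1000) -> float:
    """In-band radiance by midpoint-rule integration over [lo, hi] microns."""
    lo, hi = lo_um * 1e-6, hi_um * 1e-6
    dl = (hi - lo) / steps
    return sum(planck_radiance(lo + (i + 0.5) * dl, temp_k)
               for i in range(steps)) * dl
```

Integrating over the 8–12 µm LWIR band, for instance, shows why a 310 K engine block renders brighter than 290 K terrain in a thermal imager.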
SensorFX delivers a complete solution for the deployment of sensor-enabled VR-Vantage applications including:
- Easy-to-use material classification tool
- Extensive library of materially classified 3D models, human characters, and sample terrains
- Material classified 3D Vehicle Set
- Material classified streaming terrains
- Modify sensor parameters to represent your specific sensors
SensorFX features a GPU-based real-time image processor that applies engineering-level sensor effects, modeling the optics, detector FPA, white noise (NET), 1/f noise, I2 (NEI), signal processing, display, AGC, gain/level, light-point haloing, and more.
Options for the development and customization of sensor enabled VR-Vantage applications are also available including a Sensor Development Toolkit, as well as a semi-automated material classification tool, GenesisMC.
RadarFX SAR Server
Synthetic Aperture Radar Simulation
RadarFX is a highly accurate radar image generator. It combines proven physics-based signature simulation with VR-Vantage’s advanced rendering architecture.
RadarFX produces fully correlated, realistic imagery using radar returns from the terrain and radar cross-sections from vehicles, munitions, and natural and cultural features. The results are served to your applications to be presented to users in the proper training context, such as surveillance, reconnaissance, and targeting.
Our server supports Strip SAR, in which the track direction is linear, Spot SAR, in which the track direction rotates around a fixed point on the ground, and inverse synthetic aperture radar (ISAR), which uses the movement of a target entity to generate a radar image instead of the movement of the emitter (as is done by SAR).
- SAR Shadows
- Leading edge brightness
- Down-range / Cross-range resolution effects
- RF path attenuation, atmospheric scattering, and absorption noise
- Target radar cross-sections from imported RCS or FIELD files
- Component scattering and coherent summation using Jones matrix polarization (for resonances and nulls)
- Terrain areal RCS from Ulaby-Dobson parameters embedded in spectral material property files
- Choice of gain distribution type and directivities, for transmitter and receiver separately
- Sensor system noise as function of bandwidth and temperature
- Frequency (GHz)
- Pulse width (µs)
- Pulse repetition frequency - PRF (Hz)
- Transmitter polarization angle
- Receiver directivity
- Receiver polarization angle
- Integration path length (m)
- Transmitter power (W) and max power (W)
- Antenna gain
- System temperature
- Saturation S/N ratio
- Gain pattern
- Display type
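Several of the parameters above combine into the basic figures of merit of a pulsed radar. As a hypothetical illustration (the class and field names are invented for this sketch, not RadarFX's configuration schema), standard textbook relations derive wavelength, unambiguous range, and uncompressed-pulse range resolution directly from the listed inputs:

```python
from dataclasses import dataclass

C = 2.99792458e8  # speed of light, m/s

@dataclass
class RadarParams:
    """Hypothetical container for a subset of the parameters listed above."""
    frequency_ghz: float
    pulse_width_us: float
    prf_hz: float
    power_w: float

    @property
    def wavelength_m(self) -> float:
        return C / (self.frequency_ghz * 1e9)

    @property
    def unambiguous_range_m(self) -> float:
        # Echoes from beyond this range alias into the next pulse interval.
        return C / (2.0 * self.prf_hz)

    @property
    def range_resolution_m(self) -> float:
        # Slant-range resolution of an uncompressed pulse: c * tau / 2.
        return C * self.pulse_width_us * 1e-6 / 2.0

# Example: an X-band radar at 10 GHz with a 1 us pulse and 1 kHz PRF.
radar = RadarParams(frequency_ghz=10.0, pulse_width_us=1.0,
                    prf_hz=1000.0, power_w=5000.0)
```

For these example values the wavelength is about 3 cm, the unambiguous range about 150 km, and the range resolution about 150 m, which is why real SAR systems use pulse compression to achieve the fine down-range resolution mentioned above.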