Training seriously for warfighting today: from rendering planets to blades of grass, rapid terrain generation for fair fights across your Military Enterprise

What do Kerbal Space Program and Google Earth have in common? Both can render entire planets, from space down to ground level, using techniques that are now commonplace, such as “cube mapping” and “continuous level of detail”. Whole-planet rendering has made massive leaps, whether for entertainment or commercial purposes.
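The core idea behind “continuous level of detail” is simple: terrain tiles near the camera are subdivided finely, while distant ones are rendered coarsely. As a minimal sketch (the function name, distance threshold, and level count below are illustrative, not taken from any particular engine), each doubling of camera distance drops one quadtree level:

```python
import math

def lod_level(distance_m, finest_at_m=500.0, max_level=12):
    """Choose a quadtree level of detail from camera distance.

    Tiles closer than `finest_at_m` get the finest level; each
    doubling of distance drops one level. Constants are illustrative.
    """
    if distance_m <= finest_at_m:
        return max_level
    level = max_level - int(math.log2(distance_m / finest_at_m))
    return max(0, level)
```

From orbit, almost the whole planet falls into the coarsest few levels, which is what makes the space-to-ground transition tractable.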

While many industries leverage realistic renderings of the Earth, government users have unique and demanding requirements. For example, military exercises commonly involve many different systems, all of which depict real-world terrain at different fidelities and dimensions. Different virtual simulators (e.g., flight and tank simulators) are integrated with computer-generated forces and also connected to mission command systems that might be tracking both synthetic actors and live trainees (i.e., conducting operations in the real world). 

All of these simulators and systems rely on a common view of the terrain, and this presents a distinct challenge: how do you ensure that all of these different 2D and 3D views correlate, down to individual trees and rocks?

Consider two distinct simulators - a helicopter simulator built by a systems integrator and a game-based infantry trainer (e.g., based on Unreal or Unity). The 3D rendering engines in these two simulators will be very different, and while the underlying source terrain data might be the same, at ground level there are going to be major differences. For example, Unreal and Unity can provide amazing ground detail through the procedural placement of additional terrain features, but the image generator used by the helicopter simulator might not support procedural augmentation. This leads to obvious “fair fight” issues where objects that exist in one simulator don’t exist in the other. Over the years, software companies have strived to solve these issues by providing tools that output “correlated” terrain across different applications, but this takes time, and these tools have traditionally required significant training to use.
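One common way to keep procedurally placed clutter correlated across engines is to derive it deterministically from the tile coordinates and a shared world seed, so any runtime that executes the same logic places identical objects. The sketch below illustrates that idea; the function names and parameters are hypothetical and not Mantle’s API:

```python
import hashlib
import random
import struct

def clutter_positions(tile_x, tile_y, tile_size=100.0, count=8, world_seed=42):
    """Deterministically place clutter (e.g., rocks) within a terrain tile.

    Seeding the RNG from the tile coordinates plus a shared world seed
    means two different simulators running this logic generate the same
    positions -- one way to sidestep "fair fight" divergence.
    """
    # Hash the tile address and seed into a stable 64-bit RNG seed.
    seed_bytes = struct.pack(">iiq", tile_x, tile_y, world_seed)
    seed = int.from_bytes(hashlib.sha256(seed_bytes).digest()[:8], "big")
    rng = random.Random(seed)
    return [
        (tile_x * tile_size + rng.uniform(0.0, tile_size),
         tile_y * tile_size + rng.uniform(0.0, tile_size))
        for _ in range(count)
    ]
```

The key property is reproducibility: the same tile always yields the same objects, regardless of which engine asks.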

Games like Microsoft Flight Simulator, which stream all of their terrain from central servers, have recently driven a step-change by providing high-fidelity and geo-specific data on demand, via cloud technologies. Streaming terrain is supported by Unreal and Unity (typically through third-party plugins), and, obviously, it’s a core feature of web apps like Google Maps. Considering the needs of the military, the ideal solution would be to stream the entire planet to all of the military applications and simulators, correlated and in real-world detail, on demand. This is now possible through Mantle, a new platform from Bohemia Interactive Simulations (BISim). Mantle provides a custom terrain pipeline adapted to the preferred terrain inputs (e.g., terrain data in open formats or higher-fidelity 3D data from providers such as Maxar) and endpoints (e.g., games built in Unreal or Unity, or high-end and accredited image generators that underpin typical flight simulators). It provides an easy-to-use, web-based user interface to manage terrain layers, and real-time 3D terrain editing is also supported.
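Streaming services like the web maps mentioned above typically address terrain with the standard Web Mercator (“slippy map”) tiling scheme: at zoom level z the world is a 2^z by 2^z grid, and a latitude/longitude maps to a tile index. This is the well-known formula used by OpenStreetMap-style services (not a description of Mantle’s internal scheme):

```python
import math

def latlon_to_tile(lat_deg, lon_deg, zoom):
    """Convert WGS84 lat/lon to Web Mercator (slippy-map) tile indices,
    the addressing scheme used by most streaming map services."""
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat_deg)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y
```

A client only requests the tiles intersecting its view frustum at the appropriate zoom, which is what makes on-demand, planet-scale streaming affordable.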

Terrain is the foundation of military simulation. As in real-world operations, the accuracy and fidelity of terrain data in simulation can influence outcomes. In today’s rapidly evolving combat environment, with new lessons being learned from Ukraine (and old lessons relearned), serious training demands the right terrain at the highest level of detail now, not in weeks or months. And it’s needed for all users, whether rehearsing and training at unit level or in combined arms. It’s needed in all military systems, with minimal administrative burden. Mantle delivers that today, setting a new benchmark for terrain generation and management across your Military Enterprise.

To learn more about BISim, visit