Digital twins, simulation, synthetic data, and machine learning are being brought together in a sensor fusion project to enhance U.S. security and military training and operations. MS&T Special Correspondent Andy Fawkes reports.

A digital twin is now a commonplace term, and the concept is gaining prominence in a more connected, sensor-rich, data-driven world. A digital twin serves as the real-time digital counterpart of a physical object, process or person. When digital twins are leveraged across the lifecycle, a digital thread of data and decisions can be created. Further, because a digital twin can be freely manipulated and recorded in its virtual world, large “synthetic” datasets of the twin and its environment can be computer generated. Such synthetic data can be used to teach a computer, through machine learning, how to react to certain situations or criteria, replacing real-world training data that was previously expensive or inaccessible to capture. Synthetic visual imagery is particularly common, but synthetic data can also be generated for other sources such as Lidar and combined for sensor fusion projects.
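As a toy illustration of why that is attractive (and emphatically not the project's actual tooling), the Python sketch below procedurally renders perfectly labeled “camera frames” and trains a simple classifier on them. The scene, scale and model are all illustrative assumptions; the point is that the generator knows exactly what it rendered, so the labels come for free.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def render_frame(intruder: bool) -> np.ndarray:
    """Render a 32x32 grayscale 'camera frame': background clutter,
    plus a bright blob standing in for an intruder when requested."""
    img = rng.normal(0.2, 0.05, (32, 32))        # background noise
    if intruder:
        r, c = rng.integers(4, 28, size=2)       # random blob location
        img[r - 2:r + 2, c - 2:c + 2] += 0.8     # simple "person" blob
    return img.clip(0, 1).ravel()

# Labels are known by construction -- no manual annotation needed.
X = np.stack([render_frame(i % 2 == 1) for i in range(2000)])
y = np.array([i % 2 for i in range(2000)])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Held-out accuracy on synthetic data: {model.score(X_test, y_test):.2f}")
```

Scaling this recipe up is where tools such as Unity Computer Vision come in: the renderer becomes the digital twin, and the annotations are emitted by the simulation itself.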

To explore the uses of digital twins and synthetic data we spoke to the SecureAmerica Institute (SAI), Amentum, and Unity about the progress they are making in developing a real-time modeling and simulation framework to model and predict multiple sensor fusion scenarios through machine learning. With the continued exponential growth of sensors and sensor fusion, the project is providing deep insights into their utility and into the power of simulation more generally. Although undertaken specifically to enhance the resiliency of U.S. defense manufacturing, the project is also influencing wider technological advancements in military training and operations.

Customer Imperatives

The customer for the sensor fusion project is SAI, a public-private partnership “converging technology, economics and policy to enable a secure and resilient U.S. manufacturing and industrial base”. SAI was originally stood up with a mission focused on cybersecurity, but Darrell Wallace, SAI’s Deputy Director and Chief Technology Officer (CTO), told us: “our mission has evolved to be more broadly the security and resilience of the domestic American supply chain, particularly with emphasis on defense manufacturing”. As part of the Texas Engineering Experiment Station (TEES), housed at Texas A&M University, “we are actually a state agency, so we have a legislated mandate to serve the needs and the best interests of the state of Texas and the nation,” Wallace continued.

“Part of what we've been able to do is to support some seed research projects in technology areas and in activities that help move the needle in favor of our mission objectives,” the CTO explained. “Because of the importance of sensor fusion in the context of smart manufacturing we were able to fund the project with Amentum and Unity. Of course, it isn't too hard to see these uses of sensor fusion expanding into much broader applications such as training.” Big data is also driving SAI’s interest. In manufacturing, “we are collecting data at a very high rate, and you can get ‘analysis paralysis’, especially if you cannot coordinate across many different sources,” Wallace explained. He continued, “we must do everything we can to enhance our situational awareness to improve our efficiency and our responsiveness to threats … whether that be an organically evolving situation or detecting an active intrusion or attack”.

Warehouse Sensors

The use case selected for the project centers on the security of a manufacturing warehouse, and specifically on the optimal placement of camera and Lidar sensors. It was developed by Amentum in partnership with Unity Technologies and was competitively selected from among 30 proposals received by SAI.

The application demonstrates how computer vision and sensor fusion of video cameras (top) and Lidar sensors (bottom) can enhance security, surveillance and safety in a warehouse environment.
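At its simplest, fusing the two feeds shown above means associating detections that different sensors make of the same object and combining their confidences. The sketch below is a minimal, hypothetical illustration of that step; the coordinates, confidence values and gating distance are invented, and the project's actual fusion logic is not described in this article.

```python
import numpy as np

# Hypothetical detections in warehouse floor coordinates (metres), one
# list per sensor: (x, y, confidence). Values are illustrative only.
camera_dets = [(12.1, 5.3, 0.83), (30.4, 14.8, 0.55)]
lidar_dets = [(12.4, 5.1, 0.90), (22.0, 9.5, 0.60)]

def fuse(cam, lid, gate=1.0):
    """Greedy nearest-neighbour association: detections from the two
    sensors within `gate` metres of each other are merged, and their
    confidences combined as if the sensors were independent."""
    fused, used = [], set()
    for cx, cy, cp in cam:
        best, best_d = None, gate
        for j, (lx, ly, lp) in enumerate(lid):
            d = float(np.hypot(cx - lx, cy - ly))
            if j not in used and d <= best_d:
                best, best_d = j, d
        if best is None:
            fused.append((cx, cy, cp))                 # camera-only object
        else:
            lx, ly, lp = lid[best]
            used.add(best)
            fused.append(((cx + lx) / 2, (cy + ly) / 2,
                          1 - (1 - cp) * (1 - lp)))    # seen by both sensors
    fused += [det for j, det in enumerate(lid) if j not in used]  # lidar-only
    return fused

for x, y, p in fuse(camera_dets, lidar_dets):
    print(f"object at ({x:5.1f}, {y:5.1f})  confidence {p:.2f}")
```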

Headquartered in Germantown, MD, Amentum employs more than 34,000 people working in 105 nations and provides technical and engineering services in support of the critical missions of government and commercial organizations. Dr. Paul Cummings, Vice President, Transformational Training Systems at Amentum, told us: “imagine developing a warehouse and you're trying to figure out where’s the best place to place sensors to detect threats as they move around in the warehouse … that's an incredibly difficult problem for humans to solve”. Unity provided a demonstration of the project in which the warehouse designer can place multiple sensors and sensor types and visualize their coverage areas and views in the virtual world. Further, multiple configurations and scenarios within the warehouse can be simulated, synthetic datasets generated using Unity Computer Vision, and the automated security system then “trained” through machine learning to detect security breaches automatically and, potentially, to monitor other aspects such as health and safety. Cummings emphasized the importance of synthetic data: “the future of accurate AI models in computer vision and sensors isn't the algorithms, it's actually the content ... and to make sure that you've got enough accurate training data.”
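The placement problem Cummings describes can be framed as coverage maximization. The following is a hedged, two-dimensional sketch of the idea; the warehouse dimensions, candidate mounting points and greedy heuristic are all assumptions, whereas a real tool such as the Unity demonstration works against full 3D geometry with occlusion.

```python
import numpy as np

# Sample the floor of a hypothetical 40 m x 20 m warehouse on a 0.5 m grid.
xs, ys = np.meshgrid(np.arange(0, 40, 0.5), np.arange(0, 20, 0.5))
floor = np.column_stack([xs.ravel(), ys.ravel()])

def coverage(pos, facing_deg, fov_deg=90.0, max_range=15.0):
    """Boolean mask of floor points inside a camera's wedge of view.
    No occlusion modelling here; a real tool would ray-cast against
    the 3D warehouse geometry instead."""
    d = floor - np.asarray(pos, dtype=float)
    dist = np.hypot(d[:, 0], d[:, 1])
    ang = np.degrees(np.arctan2(d[:, 1], d[:, 0]))
    off = (ang - facing_deg + 180.0) % 360.0 - 180.0   # signed angle offset
    return (dist <= max_range) & (np.abs(off) <= fov_deg / 2)

# Candidate wall-mount positions and facings: ((x, y), degrees).
candidates = [((0, 10), 0), ((40, 10), 180), ((20, 0), 90),
              ((20, 20), -90), ((0, 0), 45), ((40, 20), -135)]

# Greedy selection: each step adds the camera covering the most
# still-uncovered floor points.
seen = np.zeros(len(floor), dtype=bool)
chosen = []
for _ in range(3):
    gains = [np.count_nonzero(coverage(p, f) & ~seen) for p, f in candidates]
    best = int(np.argmax(gains))
    chosen.append(candidates[best])
    seen |= coverage(*candidates[best])
print(f"3 cameras cover {seen.mean():.0%} of the floor: {chosen}")
```

Greedy selection like this is a standard baseline for coverage problems; the value of a simulation framework is that the simple coverage() test can be replaced by physically faithful ray-casts through the digital twin.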

Example of a synthetic image created from the application.

Other Applications

The technologies shown to us are not restricted to building design and operations applications; they have many other potential uses. Cummings told us that, as well as the warehouse, “we're going to look at a naval vessel like a submarine, and then we're going to try something very different, like an outdoor environment for force on force training”. For Cummings, digital twins and Unity’s simulation tools not only enable modeling of sensors, “which they've done a lot of amazing research on”, but they generate synthetic data that can become the framework for the creation of “better computer vision models, which is extremely important”.

The application simulates how computer vision models could be used to label people, forklifts and unidentified objects on a video feed.
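To make that concrete, the sketch below runs an off-the-shelf torchvision detector over a dummy frame; this is an illustrative stand-in, not the project's model. COCO-trained detectors know classes such as “person” and “truck” but not “forklift”, which is exactly the kind of gap that fine-tuning on synthetic warehouse imagery is meant to close.

```python
import torch
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn, FasterRCNN_ResNet50_FPN_Weights)

# Load a COCO-pretrained detector (downloads weights on first run).
weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()
categories = weights.meta["categories"]

frame = torch.rand(3, 480, 640)   # stand-in for a decoded camera frame
with torch.no_grad():
    detections = model([frame])[0]

# Print confident detections as (label, box, score) -- on a real frame
# this would yield entries such as ("person", [x1, y1, x2, y2], 0.97).
for label, box, score in zip(detections["labels"],
                             detections["boxes"], detections["scores"]):
    if score > 0.5:
        print(categories[int(label)],
              [round(v, 1) for v in box.tolist()], round(float(score), 2))
```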

Amentum’s work is now moving beyond the research phase. “We're building a system and architecture called the weapons as a sensor platform, or WaaSP, designed for force on force training, and using optics from a camera fitted on a weapon … it is a perfect example of the way that you could take a live situation and use that in a digital environment to determine how well someone is doing in an exercise,” Cummings continued. Such technology has been down-selected in support of the US Army Synthetic Training Program as the only camera sensor-based live training system offering, he told us.

More than Software

The simulation technology behind the project was provided by Unity, which had the added benefit of well-established toolkits for developers and researchers to train their AI algorithms in a 3D environment. Cummings told us: “we particularly like Unity because of its adoption, its ease of use, and its ability to deliver across multiple platforms … and they also have a really, really impressive artificial intelligence system, as well as the sensor fusion technologies that are available and I think they are really at the top of the food chain in terms of innovation”. For the Vice President it was not only Unity’s software capabilities: “the real value of our relationship with Unity was that we were looking for a technology team that could help us to build and test these real time systems”.

This article is sponsored by Unity Technologies (https://unity.com/solutions/government-aerospace) in support of its partners. Find out more about Unity’s dedicated business and technical support for government and contractors at government@unity3d.com. Learn more about Amentum at www.amentum.com and SecureAmerica Institute at secureamerica.us.