The Sciences of Simulation

A primer on the visual, motion, audible and haptic cues necessary for immersive environments by David Hambling.

The Link “Blue Box”, the first effective flight trainer, was created in the late 1920s when aircraft had been flying for more than 20 years. This is a strong hint that an effective simulacrum may be harder to build than the real thing.

Our senses, sharpened by millions of years of evolution, cannot be fooled easily, making it difficult to give a realistic impression of flying high while still rooted firmly to the ground. Effective simulation requires visual, motion and audible cues, all fully integrated. Even minor flaws may result in simulator sickness, rendering the simulation useless. Other errors risk mis-training, making the simulation worse than useless. Standards for simulation are therefore necessarily high.

Seeing is Believing

Before the 1980s, simulated flight video was created by moving a gantry-mounted camera over scale-model terrain and projecting the image onto a screen. The computers of the day simply could not handle the volume of data needed to simulate scenery in real time. Since then, processing power has increased dramatically and convincing scenes can be generated in real time.

A visual display has three key characteristics: resolution, field of view and refresh rate. The demand on computing power is multiplied as these improve, outstripping even Moore’s Law, which has provided ever-cheaper computing. As a result, affordability is still an issue.

Pilots can get cues for motion and distance even from simple vector graphics, but an effective display has to provide crisp, detailed images. Just as televisions have been upgraded to HD and then to 4K, simulators have higher resolution than ever. The resolution of curved screens wrapping around the viewer is expressed in terms of the angle occupied by each pixel, or arc minutes per optical line pair.

“The perfect display is one where the acuity of the display matches that of pilots, which is about two arc minutes per optical line pair,” says Phil Perey, head of Technology for Defence & Security at CAE (Montréal, Canada). “The Holy Grail is to be able to match that in a simulator but doing that today is expensive.”

Civil simulators typically have a resolution of about four to five arc minutes per line pair, while military simulators tend to be more detailed at three to four.
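
These figures translate directly into pixel counts. As a rough illustration (assuming the common rule of thumb that one optical line pair spans about two pixels, a detail not spelled out in the article), a Python sketch of the pixel budget for a wrap-around display:

```python
# Back-of-envelope pixel budget for a wrap-around display.
# Assumption: one optical line pair spans roughly two pixels,
# so angular pixel size = (arcmin per line pair) / 2.

def pixels_needed(fov_degrees: float, arcmin_per_line_pair: float) -> int:
    """Pixels required across a field of view at a given resolution."""
    arcmin_per_pixel = arcmin_per_line_pair / 2
    return round(fov_degrees * 60 / arcmin_per_pixel)

# A 220-degree civil display at 4 arc minutes per line pair:
print(pixels_needed(220, 4))   # 6600 pixels across
# The same display at eye-limiting 2 arc minutes per line pair:
print(pixels_needed(220, 2))   # 13200 pixels across
```

Halving the arc minutes per line pair doubles the horizontal pixel count, and the vertical count scales the same way, so the total pixel load grows quadratically as resolution approaches the eye's limit.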

Civil pilots generally only need to see and respond to comparatively large objects such as runways, so extreme resolution is not essential. However, the detail has to be good enough to convey the texture of the surface below. The landscape texture, whether forest, mountains or plains, provides a strong visual indication of altitude and speed. If the simulation does not provide this, pilots are unable to pick up the technique of judging height and speed by eye.

Military simulators require higher resolution because they are training for a rather different role. “Military pilots need good visual acuity to detect aircraft at a distance, and quickly identify them as friend or foe,” says Perey.

For more on how the military is raising the bar on performance, read Measuring and Evaluating – With Greater Purpose.

This requires high-fidelity graphics to depict small objects at the edge of visibility, whether they are aircraft or vehicles on the ground. Military displays need to support crucial tasks such as distinguishing a Russian T-64 tank from a US M1 Abrams at a distance.


A major challenge for incorporating VR goggle-based training is adding realistic haptic feedback. Image credit: HaptX.

Visual Field of View

The field of view is also more demanding for military simulations than for the civil sector. Airliner simulators require a 220-degree horizontal and 60-degree vertical field of view, but military aircraft, especially helicopters, require a greater expanse for full realism. Helicopter pilots need chin windows, side windows and overhead windows for situational awareness.

Rather than just having a few screens, some simulators project the scene onto a dome, giving a 360-degree horizontal view with 135 degrees vertical, essentially allowing the pilot to turn their head and see a realistic simulation wherever they look.

Peter Morrison, co-CEO of Bohemia Interactive Simulations (BISim, Winter Park, Florida, US), claims this type of requirement can now better be met with virtual reality (VR), which provides full immersion at a lower cost than traditional projection systems.

Visual Refresh Rate

Video displays do not move; rather, they show a rapid succession of images to give the illusion of movement, using the same principle as the frames going through an old-time movie projector. In the 1990s, flight simulators typically updated at 30 Hertz, i.e., 30 times a second. This was not ideal, a limitation of the available computing power, but workable. The standard is now 60 Hertz, the same as television.

However, larger screens with higher resolutions require higher frame rates, otherwise images start to look jerky. The problem occurs especially with fast-changing scenes such as flight close to the ground or rapid yawing movements in a helicopter. In such situations the number of pixels that an object shifts each frame makes the movement uneven, so higher refresh rates are needed.

“A number of scientific studies say that as resolution improves, you have to improve the refresh rate,” says Perey. “The new standard is 120 Hertz for helicopters and fast jets.”
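
The arithmetic behind that recommendation is straightforward. A minimal sketch of how far an object jumps across the screen each frame during a yaw (the 60-degrees-per-second rate is an assumed example figure, not from the article):

```python
# Illustrative only: on-screen jump per frame during a yaw, for a
# display with a given angular pixel size.

def pixel_shift_per_frame(yaw_deg_per_s: float, refresh_hz: float,
                          arcmin_per_pixel: float) -> float:
    """Pixels an object moves between consecutive frames."""
    arcmin_per_frame = yaw_deg_per_s * 60 / refresh_hz
    return arcmin_per_frame / arcmin_per_pixel

print(pixel_shift_per_frame(60, 60, 2))    # 30-pixel jump per frame at 60 Hz
print(pixel_shift_per_frame(60, 120, 2))   # 15-pixel jump per frame at 120 Hz
```

Doubling the refresh rate halves the per-frame jump, which is why sharper displays need faster updates to keep motion smooth.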

Virtual Worlds

Even with ideal resolution and refresh rate and an immersive field of view, a display is only as good as what it shows. In real life, the view from a cockpit is not as clear as a video game.

“When you’re at 40,000 feet, haze gets in the way,” says Morrison.

The effect, known as aerial perspective, makes distant objects merge with the background haze, lose contrast and detail and take on a blueish tinge. The effect varies depending on atmospheric conditions and the direction of the light, placing significant demands on processing.
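
Real-time image generators typically approximate this with a distance-based fade toward a haze colour. A minimal sketch of the idea using a Beer–Lambert-style exponential falloff; the extinction coefficient here is an assumed tuning parameter, not a figure from the article:

```python
import math

# Aerial perspective sketch: objects lose contrast and blend toward
# the haze colour with distance.

def apply_haze(object_rgb, haze_rgb, distance_km, beta=0.05):
    """Blend an object's colour toward the haze colour (exponential falloff)."""
    visibility = math.exp(-beta * distance_km)
    return tuple(v * visibility + h * (1 - visibility)
                 for v, h in zip(object_rgb, haze_rgb))

# A dark aircraft against pale blue haze, at 5 km and at 40 km:
print(apply_haze((0.1, 0.1, 0.1), (0.7, 0.8, 0.9), 5))
print(apply_haze((0.1, 0.1, 0.1), (0.7, 0.8, 0.9), 40))
```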

As far as ground detail goes, civil simulations focus on airports, ensuring that the depictions are accurate and up-to-date. There is less interest in what lies between landing sites.

By contrast, military pilots need to have realistic objects on the ground everywhere. In low-level flight, houses, roads and trees provide a reference for height, the rate of altitude change and the angle at which the aircraft is approaching the ground. They must be accurately depicted for pilots to develop judgement, otherwise they may either rely too much on instruments or misread cues in actual flight.

Modern military simulations do not just show fixed features, but a complete dynamic environment. Morrison says their simulations include a ‘pattern of life’ with ships, road traffic and even pedestrians visible from close range.

It is increasingly important for military trainers to depict specific parts of the world accurately rather than generic terrain. Perey gives the example of a recent Royal Canadian Air Force exercise pre-training Chinook pilots for relief operations in Mali. Accurate simulation of conditions on the ground gave pilots valuable experience before deployment, so that they arrived prepared for the conditions they would encounter.

The ability to provide detailed, accurate recreations of real places is a key area of competition for military simulators.

To see how the U.S. Air Force is incorporating live, virtual and constructive (LVC) training for air combat, read True Blended Air Combat LVC Training.

Sound Effects

Sound provides pilots with important situational awareness. Much of this is simply confirmatory; increasing thrust brings a corresponding change in the engine sound, and this heightens the immersive effect. Engine sound may also alert pilots to problems before they show up on the instruments.

Our two ears give us a natural talent for localization, detecting the source of a sound. Commercial sound technology has been adapted to equip simulators with an array of speakers to give the impression of sound coming from any direction. Speakers of different sizes produce sound covering the full range of human hearing.
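
One of the cues such speaker arrays exploit is the interaural time difference: a sound off to one side reaches the near ear slightly earlier than the far one. A sketch of the textbook approximation (the head-width figure and the simple geometric model are standard assumptions, not details from the article):

```python
import math

# Interaural time difference (ITD) for a distant source, using the
# simple d * sin(theta) / c approximation.

SPEED_OF_SOUND = 343.0   # m/s at room temperature
EAR_SPACING = 0.18       # m, approximate distance between the ears

def itd_seconds(azimuth_degrees: float) -> float:
    """Arrival-time gap between the ears for a source at this azimuth."""
    return EAR_SPACING * math.sin(math.radians(azimuth_degrees)) / SPEED_OF_SOUND

# A source 45 degrees to one side arrives ~0.37 ms earlier at the near ear:
print(f"{itd_seconds(45) * 1000:.2f} ms")
```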


Swiss startup Somniacs created the "Birdly" VR simulator with flappable "wings". Image credit: Somniacs.

A Moving Experience

Visual effects alone can provide the sensation of motion to a surprising degree. Vection is the technical term for visually induced self-motion: the feeling of movement in a stationary train when an adjacent train starts to move. This type of cross-sensory cueing means we ‘feel’ movement which is not actually there.

Martijn de Mooij, technical development manager at Cruden, based in Amsterdam, The Netherlands, says that vection can be so strong that in a bridge simulator replicating a large vessel, people can fall over as a result of the non-existent movement.

Full-motion platforms provide a more complete sensation of motion. They are designed to fool our senses, in particular the vestibular system in the inner ear. This type of motion simulator is known as a Stewart platform and has six jacks to deliver six degrees of freedom: three rotational (yaw, pitch and roll) and three translational, namely heave (vertical), sway (side to side) and surge (forward and back).
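
The control problem for such a platform is inverse kinematics: given a desired pose in those six degrees of freedom, compute the length each jack must take. A minimal sketch, with illustrative attachment-point geometry rather than the dimensions of any real simulator:

```python
import numpy as np

# Inverse-kinematics sketch for a six-jack (Stewart) motion platform.

def rotation(yaw, pitch, roll):
    """Combined rotation matrix from yaw, pitch and roll (radians)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def jack_lengths(base, platform, translation, yaw, pitch, roll):
    """Required length of each jack: |R @ p_i + t - b_i|."""
    moved = platform @ rotation(yaw, pitch, roll).T + translation
    return np.linalg.norm(moved - base, axis=1)

# Six anchor points spaced around circles on the base and the platform.
angles = np.radians([0, 60, 120, 180, 240, 300])
base = np.column_stack([2.0 * np.cos(angles), 2.0 * np.sin(angles), np.zeros(6)])
plat = np.column_stack([1.5 * np.cos(angles), 1.5 * np.sin(angles), np.zeros(6)])

# Pose: 1.8 m of heave plus a small pitch of 0.05 rad.
print(jack_lengths(base, plat, np.array([0.0, 0.0, 1.8]), 0.0, 0.05, 0.0))
```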

Some motion sensing is through the skin, hence the expression flying by the “seat of the pants” rather than with instruments. Tilt co-ordination can fool this sense. Tilting the platform backwards while keeping the visual horizon level is interpreted by the brain as forward acceleration rather than tilting. This technique, creating the illusion of acceleration with gravity, may not be convincing for users who are particularly sensitive to acceleration.
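
The required tilt follows from simple trigonometry: the angle whose gravity component matches a target acceleration a is arcsin(a/g). A worked example (the 2 m/s² figure is an assumed illustration):

```python
import math

# Tilt coordination: theta = arcsin(a / g) gives the backward tilt
# whose gravity component mimics a sustained forward acceleration a.

G = 9.81  # gravitational acceleration, m/s^2

def tilt_angle_degrees(accel: float) -> float:
    """Tilt angle needed to fake a sustained acceleration (m/s^2)."""
    return math.degrees(math.asin(accel / G))

print(f"{tilt_angle_degrees(2.0):.1f} degrees")  # about 11.8 degrees of tilt
```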

The challenge is to produce a sensation of movement – or rather of acceleration, as continuous straight-line movement cannot be sensed – on a platform with limited travel. The standard approach is to filter out the long, low-frequency movements and only replicate the more rapid accelerations which can be reproduced by a simulator.
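
In classical washout algorithms this filtering is done with a high-pass filter on the commanded acceleration. A minimal first-order sketch; the cutoff frequency and sample step are assumed tuning values, not figures from the article:

```python
import math

# Classical washout sketch: a first-order high-pass filter passes the
# rapid onset of an acceleration to the platform but washes out the
# sustained part, letting the jacks drift back to neutral.

def high_pass(accels, dt=0.01, cutoff_hz=0.5):
    """First-order high-pass filter over a sampled acceleration signal."""
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    alpha = rc / (rc + dt)
    out, prev = [0.0], accels[0]
    for a in accels[1:]:
        out.append(alpha * (out[-1] + a - prev))
        prev = a
    return out

# A step to 2 m/s^2 held for five seconds: the platform command spikes
# at the onset, then decays toward zero as the sustained part washes out.
signal = [0.0] * 100 + [2.0] * 500
command = high_pass(signal)
print(f"peak {max(command):.2f} m/s^2, final {command[-1]:.4f} m/s^2")
```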

This works well for training, because our ability to sense prolonged acceleration is unreliable. During a long bank manoeuvre, pilots may lose the sensation of banking altogether, so when they straighten out it feels like they are banking in the other direction. There is limited value in simulating something which pilots are trained to ignore.

Cruden go one step further in their motion cueing, eliminating movement which is irrelevant but reproducing the aspects which are vital. In a driving simulator, it may be the sensation of the back wheels starting to slip.

“It’s not about trying to replicate all movement of the vehicle as accurately as possible,” says de Mooij. “It’s about giving the drivers only those bits of vehicle movement that they actually need.”

As vection shows, the visual sense can override the inner ear. After the initial movement, the brain pays attention to visual input and blanks out the vestibular sense. Motion simulators can take advantage of this.

“When making a turn we can provide an onset of the turn-in motion, which is sensed very quickly by your brain, and then we render the turning as you carry around the corner on the screen, and your brain’s visual processing takes over and says, ‘Yes, I’m going round a corner,’” says Kia Cammaerts of Ansible Motion (Norwich, UK). “We can quietly remove that motion signal without you noticing.”

This technique can provide the sensation of far more movement than the driver would otherwise perceive.


Image generators such as VBS Blue IG can render the whole earth down to blades of grass. Image credit: BISim.

Some techniques do not rely on a full-motion platform. De Mooij says that for sharp braking manoeuvres they tighten the seat belt, giving a very effective sensation of being thrown forward.

Many military simulators do not have full motion, as they are more concerned with mission training which is more about interactions and the bigger picture of an operation rather than the minutiae of flying. The value of full motion for military simulators remains a matter of debate.

When Simulators Go Bad

Simulator sickness is sometimes a serious issue for military simulations, even though there is no strict medical description or even a good definition. It is a close cousin of motion sickness and seasickness, which are also poorly understood. While elusive in some ways, simulator sickness is all too easy to induce. The most obvious symptom is nausea, but simulator sickness can also involve blurred vision, vertigo, dizziness, fatigue and headache. It may persist after leaving the simulation, usually for only a few minutes but sometimes for several hours.

The basis of this type of ailment is disputed. The Evolutionary Theory suggests that we have not adapted to modern means of transportation. Sensory Conflict Theory holds that the difference between senses – an inconsistency between what we see and what we feel, for example – gives rise to the condition. The Postural Instability Theory is that the sickness arises from our inability to control our body position normally. The truth may be a combination of these.

Research into eradicating simulator sickness has been empirical. Rather than seeking to understand exact mechanisms, the focus has been on preventing symptoms. Developers found that any delay between motion and vision provoked simulator sickness – strong support for the sensory conflict theory – and when latency was reduced with more powerful processing it ceased to be a major issue for a time.

The problem has arisen again in recent years because the more immersive a simulator, the more likely it is to cause sickness, most obviously in VR environments. Virtual reality headsets can cause sickness if there is a time delay of more than about 50 milliseconds between the wearer moving their head and the display responding.

“To prevent simulator sickness, you have to tighten up visual and motion cues – you need to make them lockstep,” says Morrison.


USAF Pilot Training Next students train on a virtual reality flight simulator at the Armed Forces Reserve Center in Austin, Texas. Image credit: US Air Force/Sean M. Worrell.

Although higher resolution requires higher refresh rates to look realistic, it turns out that lower refresh rates also contribute to simulator sickness. “Simulator sickness will occur when the frame rate starts to drop below about 90 a second,” says Morrison.
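
The frame-time arithmetic behind these thresholds is simple: the renderer's budget per frame is the reciprocal of the refresh rate, and all the common rates sit inside the roughly 50-millisecond head-motion limit mentioned above:

```python
# Per-frame rendering budget at common simulator refresh rates,
# compared with the ~50 ms head-motion latency limit quoted above.

HEAD_MOTION_LIMIT_MS = 50.0

for hz in (30, 60, 90, 120):
    frame_ms = 1000.0 / hz
    print(f"{hz:>3} Hz -> {frame_ms:5.1f} ms per frame "
          f"({HEAD_MOTION_LIMIT_MS / frame_ms:.1f}x inside the 50 ms limit)")
```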

The simple, if expensive, solution is faster processing with less latency and higher frame rates. In practice, though, it may also be possible to reduce simulator sickness by reducing exposure. Perey notes that co-pilots are more likely to suffer, apparently because they have less control. But their training may not require an immersive environment, so in this case the problem can be solved by not putting co-pilots in VR headsets.

Combatting simulator sickness remains an essential, if largely empirical, process.



Touching Reality - How Haptics Can Enable Virtual Cockpits

The tactile aspect of simulations is usually provided by hardware. Current simulator cockpits are faithful reproductions of the real thing, down to every last switch and button.

Virtual reality offers a new paradigm. It has the potential to recreate a complete environment, not just visual and auditory but tactile as well. In theory a pilot could don a VR rig and step into a perfectly recreated simulation of any aircraft cockpit and have the full tactile experience of operating controls. There should be no need for a separate simulator for each aircraft type, or indeed any external hardware.

“Current VR gloves aren’t quite there yet,” says Peter Morrison of BISim. “You can’t touch and grab components.”

Morrison says their VR setup is ‘mixed-reality’, what others might term augmented reality. The visual aspect of the simulation is provided by a VR display, but the operator sits in a cockpit with physical controls corresponding to the aircraft being simulated.

“Once we get better VR gloves, we will certainly be using them,” says Morrison.

Providing the right sort of tactile feedback is the grail of the VR glove world. The current standard is for gloves to be fitted with vibrating motors to provide a sense of contact. This is another area where cross-sensory cueing comes into play: combined with the visual VR environment, the vibration prompts the wearer to interpret it as the touch of a solid object even though there is no resistance. Being able to feel objects in a virtual world by touch is impressive, but the sensation is not entirely convincing. Companies like Oculus are working on VR gloves with ‘tendons’ built in to add resistance.

French startup GoTouch VR (Villeneuve-d’Ascq, near Lille) use fingertip technology based on skin indentation. This leverages the illusion that occurs when you run your fingers along the teeth of a comb and appear to feel a moving bump because of how the skin is deformed. The technology can provide the sensation of small, detailed objects, specifically the touch of buttons. In 2018, GoTouch teamed up with US company FlyInside (New York) to demonstrate haptic technology for pilot training at the Eurosatory defence conference. Their VRTech gloves give the sensation of a click when pushing a button or flipping a switch in the simulated environment. US military customers are said to be testing the new technology.

Perhaps the most advanced VR gloves are those developed by US company HaptX (Seattle). “The gloves give shape, size, weight and even texture to virtual objects,” says Joe Michaels, chief revenue officer (aka business development).

This is achieved by microfluidic skin: a flexible, silicone textile containing pneumatic actuators and microchannels. Each glove contains 130 such actuators which push against the wearer’s skin, providing the sensation of touching an object. In addition, an exoskeleton on the back of each hand can provide up to four pounds of resistance. These two together give complete tactile feedback for virtual objects.

“You can take a 3D model of any aircraft and you can make it touchable very easily using HaptX technology,” says Michaels. “You can make each button and switch realistic.”

The next challenge will be replacing the yoke and throttle. Gloves alone cannot recreate the sensation in the arms and shoulders of resting on an object. Achieving this will require a full-body VR exoskeleton, something which HaptX are already working on. In the near term there will also be VR boots to operate pedals and foot controls.

Michaels says there has already been tremendous military interest and support for their work. The military community saw the potential for this kind of development decades ago, and the technology is finally catching up with the vision. HaptX founder Jake Rubin envisaged the day when VR could simulate any environment, including the controls of any past, present or future vehicle. That day is now within sight, though Michaels will not be pinned down to a year.

“It will be soon,” says Michaels. “Sooner than you think.”

About the Author

UK-based David Hambling is a science writer for The Guardian, New Scientist, Wired, Popular Mechanics and other publications. He also writes mystery/horror fiction.

Originally published in Issue 3, 2019 of MS&T Magazine.
