What’s a solution to more congested airspace and pilot information overload? How about simulation in the cockpit? Rick Adams reports.

Simulation for commercial aviation seems to be coming full circle. The original objective of modeling and simulation was to replicate the real world as faithfully as possible, providing a highly realistic training environment, and the technology evolution of the past 20 years, particularly in visual systems, has effectively accomplished that. Now the challenge is shifting to augmenting the real world in the aircraft cockpit with selected artificial elements of simulation, almost beyond reality in a sense, to enable more precise operations, especially in less-than-ideal flying conditions.

“Air transport operators have to be cognizant of the airspace demands. There are more airplanes, more traffic. There’s also an explosion of information and capabilities utilizable by the crew,” explains Craig Peterson, Rockwell Collins director of avionics and flight control marketing.

New airspace management technologies such as reduced vertical separation minimums (RVSM), automatic dependent surveillance-broadcast (ADS-B), the wide area augmentation system (WAAS) and many others are creating new comprehension pressures, both in the air and in the training classroom.

“As the airspace evolves,” Peterson says, “especially in crowded sectors such as the US East Coast and Europe, it creates command and control issues of segregation and separation, requiring greater situational awareness.”

The Federal Aviation Administration’s annual forecast predicts traffic volume for US air carriers will rise by more than 75% over the next two decades (an annual average of 2.2%, down from last year’s estimate of 2.6%). “The aviation industry continues to show resilience, even during difficult economic times,” notes FAA Administrator Michael Huerta.

Growth in other regions of the world is expected to be much more robust: Latin America at 4.7% (buoyed by Brazil at 6.1%), the Asia Pacific region at 4.3%, and the cross-Atlantic ‘Open Skies’ market at 4.1%.

But even those numbers are modest compared with the advent of drones in the latter half of this decade. The FAA forecasts that up to 10,000 commercial unmanned aircraft systems (UASs) could be in the skies by 2020, once the agency issues regulations on civil UAS operations by 2015, as requested by the US Congress.

“UASs represent the same kind of airspace management threat,” says Peterson. “You need to know their position and intent, and they need to be managed for segregation and separation just as with a piloted vehicle.”

Synthetic, Enhanced, or Both?

Two of the solutions available for pilots to better manage their position and intent in the airspace are synthetic vision and enhanced vision.

Synthetic vision systems (SVS), in essence, are similar to the computer-generated 3D world which pilots use to train in a flight simulator, and may be derived from the same databases. The system graphically represents the real-world terrain, bodies of water, roads, buildings and other obstacles, airports and runways, and is presented via a primary flight display (PFD) or head-up display (HUD). In addition, an artificial horizon, heading, attitude, altitude and other indicators can be overlaid on the terrain.

The depiction of the world in front of the windscreen is not necessarily photorealistic, as in the simulator. Rather, in the cockpit, elements of unreality can be introduced in order to deliver critical information to the pilots. For example, terrain heights may be characterized by different colored banding. Highway In The Sky (HITS) projected path boxes can be presented as well. And the synthetic world displayed is always clear and bright, unobscured regardless of the real-world weather.
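The color-banding idea can be illustrated with a short sketch. This is not any vendor’s actual algorithm; the thresholds and colors below are assumptions, loosely modeled on terrain displays that color terrain by its height relative to the aircraft.

```python
def terrain_band_color(terrain_ft, aircraft_ft):
    """Illustrative SVS-style banding: color a terrain cell by its height
    relative to the aircraft (band thresholds are assumed, not standard)."""
    margin = aircraft_ft - terrain_ft
    if margin > 1000:   # well below the aircraft: no concern
        return "green"
    elif margin > 0:    # within 1,000 ft below the aircraft: caution
        return "yellow"
    else:               # at or above aircraft altitude: warning
        return "red"

# A simple terrain profile ahead of an aircraft flying at 3,500 ft:
profile = [1200, 2700, 3000, 3800]
print([terrain_band_color(t, 3500) for t in profile])
# → ['green', 'yellow', 'yellow', 'red']
```

The same relative-height test, run per terrain cell, is what lets the display stay readable in any weather: the banding is computed from the database, not from what the sensor or the pilot can see.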

Enhanced vision systems (EVS) share the synthetic presentation of SVS, but instead of drawing on a pre-packaged database, EVS presents real-time imagery generated by onboard sensors (typically infrared or radar). Because the sensors can “see through” low-visibility haze or darkness, a CAT I approach can be flown under CAT II conditions, for example.
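The operational credit involved can be sketched as a simple decision rule. The 200 ft and 100 ft heights below are representative of how such credit is typically framed, but the logic is a simplified assumption for illustration, not an operational rule set.

```python
# Illustrative sketch of EVS operational credit on a CAT I approach:
# above the published decision height the approach continues normally;
# between the decision height and an assumed lower EVS limit, the runway
# environment seen on the EVS image may substitute for natural vision;
# below that limit, only natural vision counts. All values are assumptions.

CAT_I_DH_FT = 200   # published CAT I decision height
EVS_DH_FT = 100     # lower limit with EVS credit (assumed)

def may_continue_approach(height_ft, runway_in_sight, runway_on_evs):
    """Decide whether the approach may continue at the given height."""
    if height_ft > CAT_I_DH_FT:
        return True                               # above minima
    if height_ft > EVS_DH_FT:
        return runway_in_sight or runway_on_evs   # EVS image counts here
    return runway_in_sight                        # natural vision only

print(may_continue_approach(150, runway_in_sight=False, runway_on_evs=True))
# → True
```

The point of the middle band is the one Peterson makes: the sensor lets the crew continue through conditions that would otherwise force a go-around, while the final segment still demands the real runway environment.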

The first synthetic vision system to be FAA-certified, in 2009, was Gulfstream’s PlaneView flight deck, developed with Honeywell. SVS is now available as original equipment or as a retrofit from Rockwell Collins, Garmin, Universal Avionics, Cobham and others.

Application has been largely confined to business aircraft and helicopters, though the corporate aircraft versions of the Boeing 737 and Airbus A320 have been equipped with synthetic vision. Rockwell Collins’ latest integrated cockpit with SVS, known as Pro Line Fusion, is being installed on new regional jets such as the Bombardier CSeries, Mitsubishi MRJ, and China’s ARJ21.

“The air transport world is behind in terms of embracing this technology, but that is changing,” says Peterson. Rockwell is planning to add synthetic vision capability to a Boeing 757/767 cockpit retrofit by around 2016. The Airbus A380 was the first aircraft to feature an EASA-approved head-up display (HUD) with SVS capability.

Cargo operator FedEx and other early adopters such as Alaska Airlines and Southwest Airlines are also using various combinations of Rockwell Collins’ SVS and EVS, according to Peterson. Air Iceland’s recent Dash 8-Q200 cockpit upgrade incorporates a Universal Avionics Vision I SVS. Honeywell’s Primus Epic integrated cockpit, which includes synthetic vision, has been chosen by Embraer for its second-generation E-Jets family.

Garmin’s G1000 integrated cockpit with their Synthetic Vision Technology (SVT) has also made its way into the trainer market on Piper’s Archer and Seminole aircraft.

The A380 also has an enhanced vision capability, which integrates imagery from forward-looking infrared (FLIR), millimetre-wave radiometry, millimetre-wave radar and/or low-light image intensification.

Gulfstream’s EVS, using FLIR technology developed with Kollsman, is said to be tuned to the frequencies emitted by runway lighting. Dassault’s Falcon 7X uses a “second generation” EVS from CMC Electronics.

Peterson says the EVS capability can be used for such situations as “a black hole approach at midnight in Aspen, Colorado” or for airfields with minimalist instrument landing systems (ILS) such as in China or India, “which don’t have the airport lighting environment fidelity we see in the US and Europe.”

Supporting these new graphics-driven capabilities in the cockpit are ever-larger displays. “Displays are larger and have much more processing horsepower and connectivity,” Peterson notes. “They enable information-rich applications.”

Compared with the previous-generation electronic flight instrument system (EFIS) displays, which were between four and six inches wide, today’s “glass cockpit” display real estate can include multiple 15-inch touchscreens, as in the Boeing 787.

Rockwell positions its SVS and EVS on the HUD, preferring the “eyes up” approach, whereas Honeywell and others locate them on the head-down displays.

Saving Time, Fuel, Cost

Ultimately, the new cockpit systems and the airspace management systems they link with are about saving time and fuel, both of which translate to saving cost. “We are moving from an aviation system of ground-based navigation aids to the satellite-based system of tomorrow,” states the FAA’s Huerta. “This will help us move more air traffic efficiently, while reducing flight times and emissions. We are already seeing the benefits around the country.”

“Increased use of performance-based navigation will give aircraft more freedom in the sky to choose more direct and fuel-efficient routes.”

A simple go-around due to marginal weather can cost an aircraft operator about $1,000. A diversion to a different airport can cost $3,000 or more, and that does not include airport fees, passenger re-routing and cargo delivery delays.
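Using the article’s round per-event figures, the fleet-level stakes are easy to sketch. The flight volume and event rates below are invented for illustration only:

```python
go_around_cost = 1_000   # per event, per the figure above
diversion_cost = 3_000   # per event, excluding fees and re-routing

# Hypothetical annual numbers for a mid-size operator (assumed values):
flights_per_year = 100_000
go_around_rate = 0.003     # 3 weather go-arounds per 1,000 flights
diversion_rate = 0.0005    # 1 diversion per 2,000 flights

annual_cost = flights_per_year * (go_around_rate * go_around_cost
                                  + diversion_rate * diversion_cost)
print(f"${annual_cost:,.0f} per year")
# → $450,000 per year
```

Even at these modest assumed rates, shaving a fraction of such events off through better approach capability is worth real money, which is the business case behind the cockpit upgrades described here.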

A side benefit of the new advanced flight deck technologies is that training should be naturally more intuitive for the next generation of pilots, who have been weaned on interactive mobile applications for video games, smart phones and tablet computers.

“Younger pilots are used to graphic interfaces, touch screens, lots of interactivity, ease of adaptation,” says Rockwell Collins’ Peterson. “Coming from the consumer mobile-based technology, they have the same level of expectation in the aircraft in terms of simplicity, the amount of information and ease of interface.”

Developing the Synthetic Cockpit Standards

The RTCA (Radio Technical Commission for Aeronautics), originally organized in 1935, has a Special Committee, SC-213, working with EUROCAE, a non-profit European forum for resolving air transport technical issues, to develop minimum aviation system performance standards (MASPS) for synthetic vision, enhanced vision and combined systems. The committee’s most recent publication update was issued in June 2011.

SAE International has a parallel group, G-10, which has been looking at SVS and EVS primarily for rotorcraft operations.

One of the expert findings is that “visual dominance” provides the most important spatial orientation information for flight. In reviewed accidents, “no discernible visual horizon” was a common element. If visual dominance is lost, vestibular cues (inner-ear balance) may take over, leading to loss of spatial awareness and, in turn, loss of control.

Such findings are leading the working groups to analyze how best to present key information to pilots to create and maintain visual dominance. They are asking questions such as what field of view is required and which information and graphic features should be included on or excluded from the display.

The ultimate goal is regulatory change that would allow operators to use synthetic and/or enhanced vision technologies to lower the decision height on instrument approaches, perhaps to as low as 100 feet.